EP1528464B1 - Proactive user interface having an evolving agent - Google Patents

Proactive user interface having an evolving agent

Info

Publication number
EP1528464B1
EP1528464B1 (application EP04021148.4A)
Authority
EP
European Patent Office
Prior art keywords
user; intelligent agent; DNA; agent; evolution
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Not-in-force
Application number
EP04021148.4A
Other languages
German (de)
English (en)
Other versions
EP1528464A2 (fr)
EP1528464A3 (fr)
Inventor
Jong-Goo Lee
Eyal Toledano
Natan Linder
Ran Ben-Yair
Yariv Eisenberg
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Publication of EP1528464A2
Publication of EP1528464A3
Application granted
Publication of EP1528464B1

Classifications

    • G - PHYSICS
      • G06 - COMPUTING; CALCULATING OR COUNTING
        • G06F - ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
              • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
                • G06F 3/0481 - Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
            • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
          • G06F 9/00 - Arrangements for program control, e.g. control units
            • G06F 9/06 - Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
              • G06F 9/44 - Arrangements for executing specific programs
                • G06F 9/451 - Execution arrangements for user interfaces
    • H - ELECTRICITY
      • H04 - ELECTRIC COMMUNICATION TECHNIQUE
        • H04M - TELEPHONIC COMMUNICATION
          • H04M 1/00 - Substation equipment, e.g. for use by subscribers
            • H04M 1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
              • H04M 1/724 - User interfaces specially adapted for cordless or mobile telephones
                • H04M 1/72403 - User interfaces with means for local support of applications that increase the functionality
                  • H04M 1/72427 - User interfaces with means for supporting games or graphical animations
                  • H04M 1/7243 - User interfaces with interactive means for internal management of messages
                    • H04M 1/72436 - User interfaces with interactive means for text messaging, e.g. SMS or e-mail
                • H04M 1/72448 - User interfaces with means for adapting the functionality of the device according to specific conditions
                • H04M 1/72469 - User interfaces for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
                  • H04M 1/72472 - User interfaces wherein the items are sorted according to specific criteria, e.g. frequency of use
          • H04M 2250/00 - Details of telephonic subscriber devices
            • H04M 2250/12 - Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • A - HUMAN NECESSITIES
      • A63 - SPORTS; GAMES; AMUSEMENTS
        • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
          • A63F 2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
            • A63F 2300/40 - Features characterised by details of platform network
              • A63F 2300/406 - Transmission via wireless network, e.g. pager or GSM
            • A63F 2300/60 - Methods for processing data by generating or executing the game program
              • A63F 2300/6027 - Methods using adaptive systems learning from user actions, e.g. for skill level adjustment

Definitions

  • the present invention relates to a proactive user interface including an evolving agent, and systems and methods thereof, particularly for use with mobile information devices.
  • The use of mobile and portable wireless devices has expanded dramatically in recent years. Many such devices having varying functions, internal resources, and capabilities now exist, and include, but are not limited to, mobile telephones, personal digital assistants, medical and laboratory instrumentation, smart cards, and set-top boxes. All such devices can be referred to as mobile information devices. The devices tend to be special-purpose, limited-function devices, rather than general-purpose personal computers. Many of these devices are connected to the Internet, and are used for a variety of applications.
  • One example of such a mobile information device is the cellular telephone.
  • Cellular telephones are fast becoming ubiquitous; and the use of cellular telephones is even surpassing that of traditional PSTN (public switched telephone network) telephones or "land line" telephones.
  • Cellular telephones themselves are becoming more sophisticated, and in fact are actually computational devices with embedded operating systems.
  • As cellular telephones become more sophisticated, the range of functions that they offer is also potentially becoming more extensive. However, currently available functions are typically related to extensions of functions already present in regular (land line) telephones, and/or the merging of certain functions of personal digital assistants (PDAs) with those of cellular telephones.
  • The user interface provided with cellular telephones is similarly unsophisticated, typically featuring a keypad for scrolling through a few simple menus. Customization, although clearly desired by customers who have spent significant amounts of money on personalized ring tones and other cellular telephone accessories, is still limited to a very few functions of the cellular telephone.
  • Cellular telephones currently lack any automatic personalization, for example of the user interface, of the custom/tailored functionalities that are required for better use of the mobile information device, and/or of the ability to react according to the behavior of the user.
  • Software which is capable of learning has been developed, albeit only for specialized laboratory functions. For example, "artificial intelligence" (AI) software has been developed. The term "AI" has been given a number of definitions: "AI is the study of the computations that make it possible to perceive, reason, and act." (Artificial Intelligence: A Modern Approach (second edition), by Stuart Russell and Peter Norvig, Prentice Hall, Pearson Education Inc., 2003). AI software combines several different concepts, such as perception, which provides an interface to the world in which the AI software is required to reason and act. Examples include, but are not limited to: natural language processing - communicating, and understanding document content and context of natural language; computer vision - perceiving objects from an imagery source; and sensor systems - perception of objects and features of perceived objects by analyzing sensory data.
  • Knowledge representation is responsible for representing, extracting and storing knowledge.
  • This discipline also provides techniques to generalize knowledge, feature extraction and enumeration, object state construction and definitions.
  • The implementation itself may be performed using commonly known data structures, such as graphs, vectors, tables, etc.
  • Automated reasoning combines the algorithms that use the knowledge representation and perception to draw new conclusions, infer questions and answers, and achieve the agent goals.
  • Reasoning systems include: rule-based systems, in which rules are evaluated against the knowledge base and perceived state for reasoning; search systems, which use well-known data structures to search for an intelligent conclusion according to the perceived state, the available knowledge and the goal (examples include decision trees, state graphs, minimax decision, etc.); and classifiers, whose target is to classify a perceived state represented as an experiment that has no classification tag. According to a pre-classified knowledge base, the classifier infers the classification of the new experiment (examples include vector distance heuristics, Support Vector Machines, classifier neural networks, etc.).
  • The target of learning is to improve the potential performance of the AI reasoning system by generalization over experiences.
  • The input of a learning algorithm is the experiment, and the output is a set of modifications to the knowledge base according to the results (examples include reinforcement learning, batch learning, Support Vector Machines, etc.).
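  • As a minimal sketch of this experiment-in, knowledge-update-out cycle, consider the following Python fragment; the state name, action name, reward value and learning rate are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch: a learning step that turns one "experiment" (state, action,
# reward) into a modification of a knowledge base of action-reward values.
from collections import defaultdict

class KnowledgeBase:
    def __init__(self):
        # an action-reward table, one form of stored knowledge named in the text
        self.values = defaultdict(float)

    def update(self, state, action, reward, rate=0.1):
        # move the stored value toward the observed reward (the learning step)
        key = (state, action)
        self.values[key] += rate * (reward - self.values[key])

kb = KnowledgeBase()
# one experiment: offering the messages menu while the phone was idle paid off
kb.update(state="idle", action="offer_messages_menu", reward=1.0)
print(dict(kb.values))
```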
  • The article titled "An Agent System for Learning Profiles in Broadcasting Applications on the Internet" by C. Cuenca and J.-C. Heudin, Artificial Evolution: Third European Conference (AE), 1998, pages 107-119, ISBN 3-540-64169-6, refers to a server path and a client path.
  • the server has to provide content to a subscribing client and for this reason, it has to maintain client accounts and information related to its content offer.
  • The client, by means of a multi-agent system, interacts with a user, proposes programs to him, keeps track of his preferences, and negotiates the information it needs with the different servers.
  • A multi-agent architecture is discussed, based on a model of data fusion. Agents communicate with each other so as to reduce the information volume and increase its semantics at each level. In its current implementation the system includes five agents. Each agent can be purely reactive, cognitive or adaptive, depending on its objectives.
  • The agents are a graphical interface agent, a program fusion agent, a user behavior analysis agent, a program guidelines agent and a genetic algorithm agent.
  • The central part is the genetic algorithm agent, which is responsible for the learning process and for the relation of the program proposition. Three basic user behavior cases have been studied. In the first case, the user has a wide interest.
  • In the second case, the user has a precise interest.
  • In the third case, the user changes his mind. It has been observed that the system quickly learns the user's preferences. That is a result of the continuous action of the genetic algorithm, which runs independently of the user's interaction, contributing to a faster convergence. As a consequence, the user soon receives programs that are to his taste.
  • The background art does not teach or suggest a system or method for enabling intelligent software, at least for mobile information devices, to learn and evolve specifically for interacting with human users.
  • the background art also does not teach or suggest an intelligent agent for a mobile information device, which is capable of interacting with a human user through an avatar.
  • The background art also does not teach or suggest a proactive user interface for a mobile device, in which the proactive user interface learns the behavior of the user and is then able to actively suggest options for evolution of the agent to the user.
  • the background art also does not teach or suggest an agent for a mobile information device, which uses an avatar to interact with another avatar of another mobile information device or the user thereof. The present invention overcomes these deficiencies of the background art.
  • the proactive user interface would actively suggest options for evolution of the agent to the user, based upon prior experience with a particular user and/or various preprogrammed patterns from which the computational device could select, depending upon user behavior. These suggestions could optionally be made by altering the appearance of at least a portion of the display, for example by changing a menu or a portion thereof; providing different menus for display; and/or altering touch screen functionality. The suggestions could also optionally be made audibly. Other types of suggestions or delivery mechanisms are possible.
  • the proactive user interface preferably at least appears to be intelligent and interactive, and is preferably capable of at least somewhat "free” (e.g. non-scripted or partially scripted) communication with the user.
  • An intelligent appearance is important in the sense that the expectations of the user are preferably fulfilled for interactions with an "intelligent" agent/device. These expectations may optionally be shaped by such factors as the ability to communicate, the optional appearance of the interface, the use of anthropomorphic attribute(s) and so forth, which are preferably used to increase the sense of intelligence in the interactions between the user and the proactive user interface.
  • the proactive user interface is preferably able to sense how the user wants to interact with the mobile information device.
  • communication may be in only one direction; for example, the interface may optionally present messages or information to the user, but not receive information from the user, or alternatively the opposite may be implemented.
  • communication is bi-directional for preferred interactions with the user.
  • Adaptiveness is preferably present, in order for the intelligent agent to be able to alter behavior at least somewhat for satisfying the request or other communication of the user. Even if the proactive user interface does not include an intelligent agent for communicating with the user, adaptiveness enables the interface to be proactive. Observation of the interaction of the user with the mobile information device enables such adaptiveness to be performed, although the reaction of the proactive user interface to such observation may be guided by a knowledge base and/or a rule base.
  • such adaptiveness may include the ability to alter at least one aspect of the menu.
  • one or more shortcuts may be provided, enabling the user to directly reach a menu choice while by-passing at least one (and more preferably all) of the previous menus or sub-menus which are higher in the menu hierarchy than the final choice.
  • one or more menus may be rearranged according to adaptiveness of the proactive user interface, for example according to frequency of use.
  • Such a rearrangement may include moving a part of a menu, such as a menu choice and/or a sub-menu, to a new location that is higher in the menu hierarchy than the current location. Sub-menus which are higher in a menu hierarchy are reached more quickly, through the selection of fewer menu choices, than those which are located in a lower (further down) location in the hierarchy.
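  • As a minimal sketch of such frequency-based rearrangement, the following fragment promotes a heavily used sub-menu choice higher in the menu hierarchy; the menu names and the promotion threshold are illustrative assumptions.

```python
# Minimal sketch: promote frequently used sub-menu choices higher in the
# menu hierarchy, so they are reached through fewer selections.
from collections import Counter

usage = Counter()  # how often each menu choice is selected

def record_selection(choice: str) -> None:
    usage[choice] += 1

def rearrange(top_menu: list[str], sub_menu: list[str], threshold: int = 5) -> None:
    # move any sub-menu choice used at least `threshold` times up to the top menu
    for choice in list(sub_menu):
        if usage[choice] >= threshold:
            sub_menu.remove(choice)
            top_menu.insert(0, choice)  # now higher in the hierarchy

top, sub = ["calls", "settings"], ["send sms", "alarm"]
for _ in range(5):
    record_selection("send sms")
rearrange(top, sub)
print(top)  # ['send sms', 'calls', 'settings']
```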
  • Adaptiveness is assisted through the use of rewards for learning by the proactive user interface. Suggestions or actions of which the user approves provide a reward, or a positive incentive, to the proactive interface to continue with such suggestions or actions; disapproval by the user causes a disincentive to the proactive user interface to continue such behavior(s).
  • Providing positive or negative incentives/disincentives to the proactive user interface preferably enables the behavior of the interface to be more nuanced, rather than a more "black or white” approach, in which a behavior would either be permitted or forbidden. Such nuances are also preferred to enable opposing or contradictory behaviors to be handled, when such behaviors are collectively approved/disapproved by the user to at least some extent.
  • Non-limiting examples of such computational devices include automated teller machines (ATM's) (this also has security implications, as certain patterns of user behavior could set off an alarm, for example), regular computers of any type (such as desktop, laptop, thin clients, wearable computers and so forth), mobile information devices such as cellular telephones, pager devices, other wireless communication devices, regular telephones having an operating system, PDA's and wireless PDA's, and consumer appliances having an operating system.
  • the term “computational device” includes any electronic device having an operating system and being capable of performing computations.
  • the operating system may be an embedded system and/or another type of software and/or hardware run time environment.
  • The term "mobile information device" includes, but is not limited to, any type of wireless communication device, such as the cellular telephones, pager devices, PDAs and wireless PDAs listed above.
  • the present invention is preferably implemented in order to provide an enhanced user experience and interaction with the computational device, as well as to change the current generic, non-flexible user interface of such devices into a flexible, truly user friendly interface. More preferably, the present invention implements the user interface in the form of an avatar which would interact with the user.
  • Either or both of the mobile information device adaptive system and proactive user interfaces may be implemented with genetic algorithms, artificial intelligence (AI) algorithms, machine learning (ML) algorithms, learned behavior, and software/computational devices which are capable of evolution. Either or both may also provide an advanced level of voice commands, touch screen commands, and keyboard 'short-cuts'.
  • There is provided one or more intelligent agents for use with a mobile information device over a mobile information device network, preferably including an avatar (or "creature"; hereinafter these terms are used interchangeably) through which the agent may communicate with the human user.
  • the avatar can provide a user interface for interacting with the user.
  • the intelligent agent can also include an agent for controlling at least one interaction of the mobile information device over the network.
  • This embodiment may include a plurality of such intelligent agents being connected over the mobile information device network, thereby forming a network of such agents.
  • Various applications may also be provided through this embodiment, including but not limited to teaching in general and/or for learning how to use the mobile information device in particular, teaching languages, communication applications, community applications, games, entertainment, shopping (getting coupons, etc), locating a shop or another place, filtering advertisements and other non-solicited messages, role-playing or other interactive games over the cell phone network, "chat" and meeting functions, the ability to buy "presents” for the intelligent agents and otherwise accessorize the character, and so forth.
  • the agents themselves could be given "pets” as accessories.
  • the intelligent agents could also assist in providing various business/promotional opportunities for the cell phone operators.
  • the agents could also assist with installing and operating software on cell phones, which is a new area of commerce.
  • the agents could assist with the determination of the proper type of mobile information device and other details that are essential for correctly downloading and operating software.
  • interactions include any one or more of an interaction between the user of the device and an avatar or other character or personification of the device; an interaction between the user of the device and the device, for operating the device, through the avatar or other character or personification; interactions between two users through their respective devices, by communicating through the avatar, or other character or personification of the device; and interactions between two devices through their respective intelligent agents, and can be done without any communication between users or even between the agent and the user.
  • the interaction or interactions that are possible are determined according to the embodiment of the present invention, as described in greater detail below.
  • the present invention benefits from the relatively restricted environment of a computational device and/or a mobile information device, such as a cellular telephone for example, because the parameters of such an environment are known in advance. Even if such devices are communicating through a network, such as a cellular telephone network for example, the parameters of the environment can still be predetermined.
  • the current computational devices only provide a generic interface, with little or no customization permitted by even manual, direct intervention by the user.
  • the present invention is of a proactive user interface, which could optionally be installed in (or otherwise control and/or be associated with) any type of computational device.
  • the proactive user interface is preferably implemented for a computational device, as previously described, which includes an operating system.
  • the interface can include a user interface for communicating between the user and the operating system.
  • the interface can also include a learning module for detecting at least one pattern of interaction of the user with the user interface and for actively suggesting options for evolution of at least one function of the user interface to the user, according to the detected pattern. Therefore, the proactive user interface can anticipate the requests of the user and thereby assist the user in selecting a desired function of the computational device.
  • At least one pattern can be selected from the group consisting of a pattern determined according to at least one previous interaction of the user with the user interface, and a predetermined pattern, or a combination thereof.
  • the first type of pattern represents learned behavior, while the second type of pattern may be preprogrammed or otherwise predetermined, particularly for assisting the user when a particular computational device is first being operated by the user.
  • a third type of pattern could combine these two aspects, and would enable the pattern to be at least partially determined according to the user behavior, but not completely; for example, the pattern selection may be guided according to a plurality of rules, and/or according to a restrictive definition of the possible world environment state and/or the state of the device and/or user interface.
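  • The three pattern types might be represented as in the following minimal sketch; the trigger, suggestion and rule strings are illustrative assumptions.

```python
# Minimal sketch: learned, predetermined, and combined interaction patterns.
from dataclasses import dataclass, field

@dataclass
class Pattern:
    trigger: str                 # observed interaction, e.g. "opens messages at 9am"
    suggestion: str              # user-interface change to offer
    source: str = "learned"      # "learned", "predetermined", or "combined"
    rules: list[str] = field(default_factory=list)  # rules guiding a combined pattern

patterns = [
    Pattern("opens messages at 9am", "show messages menu first"),
    Pattern("first power-on", "offer guided setup", source="predetermined"),
    Pattern("plays game when idle", "suggest game shortcut", source="combined",
            rules=["only when battery > 20%", "only when no call in progress"]),
]
for p in patterns:
    print(p.source, "->", p.suggestion)
```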
  • the pattern includes a pattern of the user's preferences for the appearance, function or characteristic of the intelligent agent.
  • the user interface preferably features a graphical display, such that at least one function of the graphical display is proactively altered according to the pattern. For example, at least a portion of the graphical display may be altered, for example by selecting a menu for display according to the detected pattern; and displaying the menu.
  • the menu may be selected by constructing a menu from a plurality of menu options, for example in order to create a menu "on the fly".
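  • A minimal sketch of constructing such a menu "on the fly" from a pool of menu options follows; the option names and the score-based ranking heuristic are illustrative assumptions.

```python
# Minimal sketch: build a menu on the fly by ranking a pool of menu options
# against the detected usage pattern, then displaying the top few.
def build_menu(options: dict[str, int], size: int = 3) -> list[str]:
    # options maps each menu option to a score derived from the detected pattern
    ranked = sorted(options, key=options.get, reverse=True)
    return ranked[:size]

scores = {"send sms": 9, "call home": 7, "alarm": 2, "settings": 1}
print(build_menu(scores))  # ['send sms', 'call home', 'alarm']
```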
  • the user interface may feature an audio display, such that altering at least one function of the user interface involves altering at least one audible sound produced by the computational device.
  • the proactive user interface could be implemented according to a method of the present invention, which is preferably implemented for a proactive interaction between a user and a computational device through a user interface.
  • the method preferably includes detecting a pattern of user behavior according to at least one interaction of the user with the user interface; and proactively altering at least one function of the user interface according to the pattern.
  • the pattern includes a pattern of user preferences for the appearance, function or characteristic of the intelligent agent.
  • There is provided a mobile information device which includes an adaptive system. Like the user interface above, it also relies upon prior experience with a user and/or preprogrammed patterns. However, the adaptive system can be more restricted to operating within the functions and environment of a mobile information device, such as a cellular telephone for example, which currently may also include certain basic functions from a PDA.
  • the adaptive system preferably operates with a mobile information device featuring an operating system.
  • the operating system can comprise an embedded system.
  • the mobile information device can comprise a cellular telephone.
  • the adaptive system is preferably able to analyze the user behavior by analyzing a plurality of user interactions with the mobile information device, after which more preferably the adaptive system compares the plurality of user interactions to at least one predetermined pattern, to see whether the predetermined pattern is associated with altering at least one function of the user interface.
  • the analysis may also include comparing the plurality of user interactions to at least one pattern of previously detected user behavior, wherein the pattern of previously detected user behavior is associated with altering at least one function of the user interface.
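  • A minimal sketch of comparing recent user interactions against stored patterns, each associated with a user-interface alteration; the in-order subsequence matching criterion and the pattern contents are illustrative assumptions.

```python
# Minimal sketch: compare recent user interactions against stored patterns,
# each of which is associated with a user-interface alteration.
def matches(interactions: list[str], pattern: list[str]) -> bool:
    # the pattern matches if its steps occur, in order, within the interactions
    it = iter(interactions)
    return all(step in it for step in pattern)

stored = {
    ("open menu", "scroll", "select messages"): "offer messages shortcut",
    ("open menu", "select camera"): "offer camera shortcut",
}
recent = ["open menu", "scroll", "scroll", "select messages"]
for pattern, alteration in stored.items():
    if matches(recent, list(pattern)):
        print("apply:", alteration)
```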
  • the adaptive system may be operated by the mobile information device itself. Alternatively, if the mobile information device is connected to a network, the adaptive system may be operated at least partially according to commands sent from the network to the mobile information device. For this implementation, data associated with at least one operation of the adaptive system is stored at a location other than the mobile information device, in which the location is accessible through the network.
  • the adaptive system also includes a learning module for performing the analysis according to received input information and previously obtained knowledge.
  • knowledge may have been previously obtained from the behavior of the user, and/or may have been communicated from another adaptive system in communication with the adaptive system of the particular mobile information device.
  • the adaptive system can adapt to user behavior according to any one or more of an AI algorithm, a machine learning algorithm, or a genetic algorithm.
  • There is provided one or more intelligent agents for use with a mobile information device over a mobile information device network, preferably including an avatar through which the agent may communicate with the human user.
  • the avatar can therefore provide a user interface for interacting with the user.
  • the intelligent agent can also include an agent for controlling at least one interaction of the mobile information device over the network. This embodiment may include a plurality of such avatars being connected over the mobile information device network.
  • At least one characteristic of an appearance of the avatar can be altered, for example according to a user command.
  • a plurality of characteristics of an appearance of avatar can be altered according to a predefined avatar skin.
  • the skin can be predefined by the user.
  • By "skin" it is meant that a plurality of the characteristics is altered together as a set, in which the set forms the skin.
  • At least one characteristic of an appearance of the avatar can be altered according to an automated evolutionary algorithm, for example a genetic algorithm.
  • the evolutionary algorithm is one non-limiting example of a method for providing personalization of the avatar for the user. Personalization may also be performed through direct user selection of one or more characteristics or skins (groups of characteristics). Such personalization is desirable at least in part because it enhances the emotional experience of the user with the avatar and hence with the mobile information device.
  • the present invention is preferably capable of operating on a limited system (in terms of memory, data processing capacity, screen display size and resolution, and so forth) in a device which is also very personal to the user.
  • the device is a mobile information device, such as a cellular telephone, which by necessity is adapted for portability and ease of use, and therefore may have one or more, or all, of the above limitations.
  • the implementation aspects of the present invention are preferably geared to this combination of characteristics. Therefore, in order to overcome the limitations of the device itself while still maintaining the desirable personalization and "personal feel" for the user, various solutions are proposed below. It should be noted that these solutions are examples only, and are not meant to be limiting in any way.
  • the proactive user interface of the present invention is preferably able to control and/or be associated with any type of computational device, in order to actively make suggestions to the user, based upon prior experience with a particular user and/or various preprogrammed patterns from which the computational device could select, depending upon user behavior.
  • These suggestions could be made by altering the appearance of at least a portion of the display, for example by changing a menu or a portion thereof; providing different menus for display; and/or altering touch screen functionality.
  • the suggestions could also be made audibly.
  • the proactive user interface is preferably implemented for a computational device, as previously described, which includes an operating system.
  • the interface can include a user interface for communicating between the user and the operating system.
  • the interface is preferably able to detect at least one pattern of interaction of the user with the user interface, for example through operation of a learning module and would therefore be able to proactively alter at least one function of the user interface according to the detected pattern.
  • the proactive user interface can anticipate the requests of the user and thereby assist the user in selecting a desired function of the computational device.
  • This type of proactive behavior requires some type of learning capability on the part of the proactive interface.
  • learning capabilities may be provided through algorithms and methodologies which are known in the art, relating to learning (by the software) and interactions of a software object with the environment.
  • Software can be said to be learning when it can improve its actions over a period of time.
  • Artificial Intelligence needs to demonstrate intelligent action selection (reasoning), such that the software has the ability to explore its environment (its "world”) and to discover action possibilities.
  • the software would also have the ability to represent the world's state and its own internal state. The software would then be able to select an intelligent action (using the knowledge above) and to act.
  • Learning for example by the learning module of the interface, can be reinforced by rewards, in which the learning module is rewarded for taking particular actions according to the state of the environment. This type of learning actually involves training the learning module to behave in a certain manner. If more than one behavior is allowed, then the learning process is non-deterministic and can create different behaviors.
  • the reward includes causing the learning module to detect when an offered choice leads to a user selection, as opposed to when an offered choice causes the user to seek a different set of one or more selections, for example by selecting a different menu than the one offered by the proactive user interface.
  • the proactive user interface should seek to maximize the percentage of offerings which lead to a direct user selection from that offering, as this shows that the interface has correctly understood the user behavior.
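  • A minimal sketch of this reward scheme: each offering's success rate (direct selections over total offerings) is tracked, and the interface prefers the offering with the best rate; the offering names and the success-rate heuristic are illustrative assumptions.

```python
# Minimal sketch: reward offerings that lead to a direct user selection and
# penalize those that make the user seek a different menu, then prefer the
# offering with the best observed success rate.
stats = {}  # offering -> (direct selections, total offerings)

def record(offering: str, selected_directly: bool) -> None:
    hits, total = stats.get(offering, (0, 0))
    stats[offering] = (hits + int(selected_directly), total + 1)

def best_offering() -> str:
    # maximize the percentage of offerings leading to a direct selection
    return max(stats, key=lambda o: stats[o][0] / stats[o][1])

record("messages menu", True)
record("messages menu", True)
record("games menu", False)
print(best_offering())  # 'messages menu'
```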
  • FIG. 1 is a block diagram of an exemplary learning module according to the present invention for reactive learning.
  • a learning module 100 includes a Knowledge Base 102 , which acts as the memory of learning module 100 , by holding information gathered by the learning module 100 as a result of interactions with the environment.
  • Knowledge Base 102 may be stored in non-volatile memory (not shown).
  • Knowledge Base 102 stores information that assists the learning module 100 to select the appropriate action. This information can include values such as numerical weights for an inner neural net, or a table with action reward values, or any other type of information.
  • the learning module 100 features a plurality of sensors 104 .
  • the sensors 104 allow the learning module 100 to perceive its environment state.
  • the sensors 104 are connected to the environment and output sensed values.
  • the values can come from the program itself (for example, position on screen, energy level, etc.), or from real device values (for example, battery value and operating state, such as a flipper state for cellular telephones in which the device can be activated or an incoming call answered by opening a "flipper").
  • the learning module 100 also includes a perception unit 106 , for processing the current output of the sensors 104 into a uniform representation of the world, called a "state". The state is then the input to a reasoning system 108 , which may be described as the "brain" of learning module 100 .
  • This design supports the extension of the world state and the sensor mechanism, as well as supporting easy porting of the system to several host platforms (different computational devices and environments), such that the world state can be changed according to the device.
  • the reasoning system 108 processes the current state with the Knowledge Base 102 , thereby producing a decision as to which action to perform.
  • the reasoning system 108 receives the current state of the world, outputs the action to be performed, and receives feedback on the action selected. Based on the feedback, the reasoning system 108 updates the Knowledge Base 102 . This is an iterative process in which learning module 100 learns to associate actions to states.
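  • The sense-perceive-reason-act-learn loop of Figure 1 might look like the following minimal sketch; the sensor values, state encoding, action names and update rule are illustrative assumptions.

```python
# Minimal sketch: sensors produce raw values, a perception unit builds a
# uniform "state", a reasoning system picks the best-valued action, and
# feedback on that action updates the knowledge base.
import random

knowledge = {}  # (state, action) -> learned value

def perceive(sensors: dict) -> str:
    # uniform representation of the world ("state") from raw sensor outputs
    return f"battery={'low' if sensors['battery'] < 20 else 'ok'},flipper={sensors['flipper']}"

def reason(state: str, actions: list[str]) -> str:
    return max(actions, key=lambda a: knowledge.get((state, a), 0.0))

def learn(state: str, action: str, feedback: float, rate: float = 0.2) -> None:
    key = (state, action)
    knowledge[key] = knowledge.get(key, 0.0) + rate * (feedback - knowledge.get(key, 0.0))

actions = ["dim screen", "offer menu", "stay quiet"]
for _ in range(20):  # iterative process associating actions to states
    state = perceive({"battery": random.randint(5, 100), "flipper": "open"})
    action = reason(state, actions)
    learn(state, action, feedback=random.choice([0.0, 1.0]))
print(knowledge)
```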
  • the computational device may feature one or more biological sensors, for sensing various types of biological information about the user, such as emotional state, physical state, movement, etc. This information may then be fed to the sensors 104 for assisting the perception unit 106 in a determination of the state of the user, and hence to determine the proper state for the device.
  • biological sensors may include but are not limited to sensors for body temperature, heart rate, oxygen saturation or any other type of sensor which measures biological parameters of the user.
  • Figure 2 shows an exemplary embodiment of a system 200 according to the present invention for providing the proactive user interface, again featuring the learning module 100 .
  • the learning module 100 is shown communicating with an operating system 202 of the computational device (not shown) with which the learning module 100 is associated and/or controls and/or by which the learning module 100 is operated.
  • the operating system 202 controls the operation of an interface part 204 and also at least one other software application 206 (although of course many such software applications may optionally be present).
  • the user communicates through interface part 204 , for example by selecting a choice from a menu.
  • the operating system 202 enables this communication to be received and translated into data.
  • the learning module 100 then preferably receives such data, and can send a command back to the operating system 202 , for example to change some aspect of the interface part 204 (for example by offering a different menu), and/or to operate the software application 206 .
  • the user then responds through the interface part 204 ; from this response, the learning module 100 learns whether or not the action (command that was sent by learning module 100 ) was appropriate.
  • FIG 3 is a block diagram showing an exemplary implementation of a proactive user interface system 300 according to the present invention.
  • system 300 features a three level architecture, with an application layer being supported by an AI (artificial intelligence) framework, which in turn communicates with the host platform computational device (shown as "host platform").
  • the application layer features a plurality of different applications, of which a few non-limiting examples are shown, such as a MutateApp 302 , a PreviousApp 304 and a TeachingApp 306 .
  • the MutateApp 302 is invoked in order to control and/or initiate mutations in the system 300.
  • the learning module can optionally change its behavior through directed or semi-directed evolution, for example through genetic algorithms.
  • the MutateApp 302 controls and/or initiates such mutations through evolution. The embodiment of evolution is described in greater detail below.
  • the PreviousApp 304 enables a prior state of the system 300 , or a portion thereof (such as the state of the learning module) to be invoked in place of the current state. More specifically, the PreviousApp 304 enables the user to return to the previous evolutionary step if the present invention is being implemented with an evolutionary algorithm. More generally, the system 300 is preferably stateful and therefore can return to a previous state, as a history of such states is preferably maintained.
  • the TeachingApp 306 is only one non-limiting example of a generic application which may be implemented over the AI framework layer.
  • the AI framework layer itself contains one or more components which enable the user interface to behave in a proactive manner.
  • the framework can include a DeviceWorldMapper 308, for determining the state of the computational device and also that of the virtual world, as well as the relationship between the two states.
  • the DeviceWorldMapper 308 receives input, for example from various events from an EventHandler 310 , in order to determine the state of the virtual world and that of the device.
  • the DeviceWorldMapper 308 also communicates with an AI/ML (machine learning) module 312 for analyzing input data.
  • the AI/ML module 312 also determines the behavior of the system 300 in response to various stimuli, and also enables the system 300 to learn, for example from the response of the user to different types of user interface actions.
  • the behavior of the system 300 may also be improved according to an evolution module 314 .
  • the embodiment of evolution is particularly preferred with regard to the use of an intelligent agent on a mobile information device (see below for an example), but may also be used with any proactive user interface for a computational device. This embodiment is used when the proactive user interface also features or is used in combination with an avatar.
  • Evolution can be simulated by a set of genetic algorithms.
  • The basis of these algorithms is describing the properties of the proactive interface (and particularly the avatar's appearance) in terms of genes, chromosomes, and phenotypes.
  • A gene is a discrete property that has a level of expression, for example a leg of a certain type; the level of expression can be the number of such legs.
  • A phenotype is the external expression of a gene; for example, the leg gene can have different phenotypes in terms of leg length or size.
  • The gene can go through a mutation process. This process (preferably according to a certain probability) changes one or more parameters of the gene, thereby producing different new phenotypes.
  • a chromosome is a set of genes that function together.
  • the chromosome can hybridize (cross-breed) with the same type of chromosome from a different creature, thus creating a new chromosome that is a combination of its genetic parent chromosomes.
  • This methodology helps in creating a generic infrastructure to simulate visual evolution (for example of the appearance of the avatar) and/or evolution of the behavior of the proactive user interface.
  • These algorithms may also be used for determining non-visual behavioral characteristics, such as dexterity, stamina and so on. The effect could result for example in a faster creature, or a more efficient creature.
  • These algorithms may be used for any such characteristics that can be described according to the previously mentioned gene/genotype/phenotype structure, such that for example behavioral genes could optionally determine the behavior of AI algorithms used by the present invention.
  • the algorithm output preferably provides a variety of possible descendant avatars and/or proactive user interfaces.
  • the genetic algorithms use a natural selection process to decide which of the genetic children will continue as the next generation.
  • the selection process can be decided by the user or can be predefined. In this way the creature can display interesting evolutional behavior.
  • The genetic algorithm framework can be used to evolve genes that encode other non-visual properties of the creature, such as goals or character.
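  • A minimal sketch of the gene/chromosome mutation and hybridization operations described above follows; the gene names, value ranges and mutation probability are illustrative assumptions.

```python
# Minimal sketch: a chromosome is a set of genes; mutation perturbs gene
# parameters with some probability, and hybridization combines two parent
# chromosomes into a child that mixes their genes.
import random

def mutate(chromosome: dict, probability: float = 0.1) -> dict:
    child = dict(chromosome)
    for gene in child:
        if random.random() < probability:
            child[gene] += random.choice([-1, 1])  # a new phenotype, e.g. longer legs
    return child

def hybridize(a: dict, b: dict) -> dict:
    # cross-breed: each gene is taken from one of the two genetic parents
    return {gene: random.choice([a[gene], b[gene]]) for gene in a}

parent_a = {"leg_length": 3, "leg_count": 2, "head_size": 5}
parent_b = {"leg_length": 1, "leg_count": 4, "head_size": 2}
child = mutate(hybridize(parent_a, parent_b))
print(child)
```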
  • the evolution module 314 supports and also preferably manages such evolution, for example through the operation of the MutateApp 302.
  • one or more different low level managers preferably support the receipt and handling of different events, and also the performance of different actions by the system 300 .
  • These managers may include but are not limited to, an ActionManager 316 , a UIManager 318 , a StorageManager 320 and an ApplicationManager 322 .
  • the ActionManager 316 is described in greater detail below, but briefly enables the system 300 to determine which action should be taken, for example through the operation of the AI/ML module 312 .
  • the UIManager 318 manages the appearance and functions of the user interface, for example by directing changes to that interface as previously described.
  • the StorageManager 320 manages the storage and handling of data, for example with regard to the knowledge base of the system 300 (not shown).
  • the ApplicationManager 322 handles communications with the previously described applications in the application layer.
  • All of these different managers receive events from the EventHandler 310 .
  • an AI infrastructure 324 supports communication with the host platform.
  • the host platform itself features a host platform interface 326 , which may be provided through the operating system of the host platform for example.
  • the AI infrastructure 324 can include an I/O module 328 , for receiving inputs from the host platform interface 326 and also for sending commands to the host platform interface 326 .
  • a screen module 330 handles the display of the user interface on the screen of the host platform computational device.
  • a resources module 332 enables the system 300 to access various host platform resources, such as data storage and so forth.
  • the learning module may also be represented as a set of individual agents, in which each agent has a simple goal.
  • the learning module chooses an agent to perform an action based on the current state.
  • the appropriate mapping between the current state and agents can also be learned by the learning module with reinforcement learning.
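  • A minimal sketch of choosing among simple-goal agents through a learned state-to-agent mapping with reinforcement; the agent goals, state names and epsilon-greedy choice are illustrative assumptions.

```python
# Minimal sketch: the learning module picks which simple-goal agent acts in
# the current state, and reinforcement feedback improves the mapping.
import random

agents = ["greet_user", "tidy_menus", "rest"]
q = {}  # (state, agent) -> learned suitability

def choose(state: str, epsilon: float = 0.1) -> str:
    if random.random() < epsilon:  # occasionally explore a different agent
        return random.choice(agents)
    return max(agents, key=lambda a: q.get((state, a), 0.0))

def reinforce(state: str, agent: str, reward: float, rate: float = 0.2) -> None:
    key = (state, agent)
    q[key] = q.get(key, 0.0) + rate * (reward - q.get(key, 0.0))

state = "user_active"
agent = choose(state)
reinforce(state, agent, reward=1.0 if agent == "greet_user" else 0.0)
print(agent, q)
```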
  • EXAMPLE 2 ADAPTIVE SYSTEM FOR MOBILE INFORMATION DEVICE
  • This example relates to the illustrative implementation of an adaptive system of the present invention with a mobile information device, although it should be understood that this implementation is preferred but optional, and is not intended to be limiting in any way.
  • the adaptive system may optionally include any of the functionality described above in Example 1, and may also be implemented as previously described.
  • This Example focuses more on the actual architecture of the adaptive system with regard to the mobile information device operation. Also, this Example describes an optional but preferred implementation of the creature or avatar according to the present invention.
  • This Section describes a preferred embodiment of an event driven system according to the present invention, including but not limited to an application manager, and interactions between the device itself and the system of the present invention as it is operated by the device.
  • Figure 4 is a block diagram of an exemplary adaptive system 400 according to the present invention, and interactions of the system 400 with a mobile information device 402 . Also as shown, both the system 400 and the mobile information device 402 interact with a user 404 .
  • the mobile information device 402 has a number of standard functions, which are shown divided into two categories for the purpose of explanation only: data and mechanisms. Mechanisms may include but are not limited to such functions as a UI (user interface) system 406 (screen, keypad or touchscreen input, etc); incoming and outgoing call function 408 ; messaging function 410 for example for SMS; sound 412 and/or vibration 414 for alerting user 404 of an incoming call or message, and/or alarm, etc; and storage 416.
  • Data may include such information as an address (telephone) book 418 ; incoming or outgoing call information 420 ; the location of the mobile information device 402 , shown as location 422 ; message information 424 ; cached Internet data 426 ; and data related to the user 404 , shown as owner data 428 .
  • mobile information device 402 may include any one or more of the above data/mechanisms, but does not necessarily need to include all of them, and/or may include additional data/mechanisms that are not shown. These are simply intended as non-limiting examples with regard to the mobile information device 402, particularly for cellular telephones.
  • the adaptive system 400 preferably interacts with the data/mechanisms of the mobile information device 402 in order to be able to provide an adaptive (and also preferably proactive) user interface, thereby increasing the ease and efficiency with which the user 404 interacts with the mobile information device 402 .
  • the adaptive system 400 features logic 430 , which functions in a similar manner as the previously described learning module, and which also operates according to the previously described AI and machine learning algorithms.
  • the logic 430 is able to communicate with the knowledge base 102 as described with regard to Figure 1 (components featuring the same reference numbers have either identical or similar functionality, unless otherwise stated).
  • the information storage 432 includes data about the actions of the mobile information device 402 , user information and so forth, and preferably supplements the data in the knowledge base 102 .
  • the adaptive system 400 is capable of evolution, through an evolution logic 434, which may optionally combine the previously described functionality of the evolution module 314 and the MutateApp 302 of Figure 3 .
  • the adaptive system 400 is capable of communicating directly with the user 404 through text and/or audible language, as supported by a language module 436 .
  • the user 404 may be presented with an avatar (not shown) for the user interface. If present, such an avatar may be created through a 3D graphics model 438 and an animation module 440 .
  • FIG. 5A shows a block diagram of an exemplary application management system 500 , which is a core infrastructure for supporting the adaptive system of the present invention.
  • the system 500 may also be used for supporting such embodiments as a teaching application, as previously described and also as described in greater detail below.
  • the system 500 features an application manager 502 for managing the different types of applications which are part of the adaptive system according to the present invention.
  • the application manager 502 communicates with an application interface called a BaseApp 504 , which is implemented by all applications in the system 500 . Both the application manager 502 and the BaseApp 504 communicate events through an EventHandler 506.
  • the application manager 502 is responsible for managing and providing runtime for the execution of the system applications (applications which are part of the system 500 ).
  • the life cycle of each such application is defined in the BaseApp 504 , which allows the application manager 502 to start, pause, resume and exit (stop) each such application.
  • the application manager 502 manages the runtime execution through the step method of the interface of BaseApp 504 . It should be noted that the step method is used for execution, since the system 500 is stateful, such that each step preferably corresponds (approximately) to one or more states. However, execution could also be based upon threads and/or any type of execution method.
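  • A minimal sketch of such a BaseApp life-cycle interface driven by the application manager's step calls; the method names follow the text, while the concrete application and its behavior are illustrative assumptions.

```python
# Minimal sketch: every application implements the BaseApp life cycle
# (start, pause, resume, stop) plus a step method; the application manager
# drives the current application one step per timer event.
from abc import ABC, abstractmethod

class BaseApp(ABC):
    def start(self): self.running = True
    def pause(self): self.running = False
    def resume(self): self.running = True
    def stop(self): self.running = False

    @abstractmethod
    def step(self): ...  # one state transition of the stateful system

class FloatingAgentApp(BaseApp):
    def step(self):
        if self.running:
            print("animate avatar frame")

class ApplicationManager:
    def __init__(self, app: BaseApp):
        self.current = app
        app.start()

    def on_timer_event(self):  # received from the operating system layer
        self.current.step()

manager = ApplicationManager(FloatingAgentApp())
manager.on_timer_event()
```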
  • the application manager 502 receives a timer event from the mobile information device.
  • the mobile information device features an operating system, such that the timer event is received from the operating system layer.
  • the application manager 502 invokes the step of the current application being executed.
  • the application manager 502 switches from one application to another application when the user activates a different application, for example when using the menu system.
  • System applications include, but are not limited to, a TeachingMachineApp 508, a MutateApp 510, a GeneStudioApp 514, a TWizardApp 516, a FloatingAgentApp 518, a TCWorldApp 522 and a HybridApp 520. These applications are also described in greater detail below with regard to Example 3.
  • the MutateApp 510 is invoked in order to control and/or initiate mutations in the adaptive system, and/or in the appearance of an avatar representing the adaptive system as a user interface.
  • the adaptive system of the present invention can change its behavior through directed or semi-directed evolution, for example through genetic algorithms.
  • the MutateApp 510 controls and/or initiates such mutations.
  • the GeneStudioApp 514 enables the user to perform directed and/or semi-directed mutations through one or more manual commands. For example, the user may wish to direct the adaptive system (through the application management system 500) to perform a particular task sequence upon receiving a particular input. Alternatively, the user may wish to directly change part of the appearance of an avatar, if present. According to the preferred embodiments of the present invention, these different aspects of the adaptive system are implemented by distinct "genes", which can then be altered by the user.
  • the HybridApp 520 may be invoked if the user wishes to receive information from an external source, such as the adaptive system of another mobile information device, and to merge this information with existing information on the user's mobile information device. For example, the user may wish to create an avatar having a hybrid appearance with the avatar of another mobile information device.
  • the HybridApp 520 also provides the user with the main control over the entire evolutionary state of the avatar.
  • the HybridApp 520 may be used to instruct the user on the "life" properties of the avatar, which may have a name, personality, behavior and appearance.
  • the TeachingMachineApp 508 is an illustrative, non-limiting example of an application which provides instruction, not on the use of the device itself, but on a subject unrelated to the direct operation of the device. Therefore, the TeachingMachineApp 508 represents an example of an application which is provided on the mobile information device for a purpose other than the use of the device itself.
  • the TCWorldApp 522 is an application which runs the intelligent agent, controlling both the intelligent aspects of the agent and also the graphical display of the creature or avatar.
  • the TWizardApp 516 is another type of application which provides information to the user. It is described with regard to the Start Wizard application in Example 4 below. Briefly, this application contains the user preferences and configuration of the AI framework, such as the character of the intelligent agent, particularly with regard to the emotional system, and also with regard to setting goal priorities.
  • the FloatingAgentApp 518 controls the appearance of the user interface, particularly with regard to the appearance of an avatar (if present).
  • the FloatingAgentApp 518 enables the visual display aspects of the user interface to be displayed independently of the display of the avatar, which may therefore appear to "float" over the user interface, for example.
  • the FloatingAgentApp 518 is the default application being operated when no other application is running.
  • FIG. 5B shows an exemplary sequence diagram for the operations of the application manager according to the present invention.
  • an EventHandler 506 dispatches a notification of an event to the application manager 502 , as shown in arrow 1. If the event is a timer event, then the application manager 502 invokes the step (action) of the relevant application that was already invoked, as shown in arrow 1.1.1. If the event is to initiate the execution of an application, then the application manager 502 invokes the relevant application, as shown in arrow 1.2.1. If a currently running application is to be paused, then the application manager 502 sends the pause command to the application, as shown in arrow 1.3.1.
  • the application manager 502 sends the resume command to the application, as shown in arrow 1.4.1. In any case, successful execution of the step is returned to the application manager 502 , as shown by the relevant return arrows above. The application manager 502 then notifies the EventHandler 506 of the successful execution, or alternatively of the failure.
  • the adaptive system also needs to be able to communicate directly with various mobile information device components, through the operating system of the mobile information device. Such communication may be performed through a communication system 600 , shown with regard to Figure 6 , preferably with the action algorithms described below.
  • Figures 6A and 6B show an exemplary implementation of the infrastructure required for the adaptive system according to the present invention to perform one or more actions through the operating system of the mobile information device, as well as a sequence diagram for operation of the communication system 600 .
  • this infrastructure is an example of a more general concept of "AI wrappers", or the ability to "wrap" an existing UI (user interface) system with innovative AI and machine learning capabilities.
  • the communication system 600 is capable of handling various types of events, with a base class event 602 that communicates with the EventHandler 506 as previously described.
  • the EventDispatcher 604 then routes the event to the correct object within the system of the present invention. Routing is determined by registration of the object with the EventDispatcher 604 for a particular event.
  • the EventDispatcher 604 preferably manages a registry of handlers that implement the EventHandler 506 interface for such notification.
  • Specific events for which particular handlers are implemented include a flipper event handler 606 for cellular telephones in which the device can be activated or an incoming call answered by opening a "flipper"; when the flipper is opened or closed, this event occurs.
  • Applications being operated according to the present invention may send events to each other, which are handled by an InterAppEvent handler 608 .
  • An event related to the evolution (change) of the creature or avatar is handled by an EvolutionEvent handler 610 .
  • An incoming or outgoing telephone call is handled by a CallEvent handler 612 , which in turn has two further handlers, a CallStartedEvent handler 614 for starting a telephone call and a CallEndedEvent handler 616 for ending a telephone call.
  • An SMS event (incoming or outgoing message) is handled by an SMSEvent handler 618.
  • Parameters which may be included in the event comprise parameters related to hybridization of the creature or avatar of one mobile information device with the creature or avatar of another mobile information device, as described in greater detail below.
  • key input events are preferably handled by a KeyEvent handler 620 and/or a KeyCodeEvent handler 622.
  • the KeyEvent handler 620 preferably handles this event, which relates to incoming information for the operation of the system according to the present invention.
  • the key_event is an object from class KeyEvent, which represents the key event message object.
  • the KeyEvent handler 620 handles the key_event itself, while the KeyCodeEvent handler 622 listens for input code (both input events are obtained through a hook into the operating system).
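  • As a minimal sketch only (Python; the handler and event names follow the description above, but the interfaces are assumed), event routing by registration might look as follows:

        # Hypothetical sketch of the EventDispatcher registry: handlers
        # register for the event types they care about, and each incoming
        # event is routed to every handler registered for its type.
        from collections import defaultdict

        class EventDispatcher:
            def __init__(self):
                self.registry = defaultdict(list)   # event type -> handlers

            def register(self, event_type, handler):
                self.registry[event_type].append(handler)

            def dispatch(self, event):
                for handler in self.registry[type(event).__name__]:
                    handler.handle(event)

        class KeyEvent:
            def __init__(self, key):
                self.key = key        # the key value received from the hook

        class KeyEventHandler:
            def handle(self, event):
                print("key pressed:", event.key)

        dispatcher = EventDispatcher()
        dispatcher.register("KeyEvent", KeyEventHandler())
        dispatcher.dispatch(KeyEvent(key=5))   # routed to KeyEventHandler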
  • Figure 6B is an exemplary sequence diagram, which shows how events are handled between the mobile information device operating system or other control structure and the system of the present invention.
  • the mobile information device has an operating system, although a similar operation flow could be implemented for devices that lack such an operating system. If present, the operating system handles the input and output to/from the device, and manages the state and events which occur for the device.
  • the sequence diagram in Figure 6B is an abstraction for facilitating the handling of, and the relation to, these events.
  • An operating system module (os_module) 628 causes or relates to an event; a plurality of such modules may be present, but only one is shown for the purposes of clarity and without intending to be limiting in any way.
  • the operating system module 628 is part of the operating system of the mobile information device.
  • the operating system module 628 sends a notification of an event, whether received or created by operating system module 628 , to a hook 630 .
  • the hook 630 is part of the system according to the present invention, and is used to permit communication between the operating system and the system according to the present invention.
  • the hook 630 listens for relevant events from the operating system.
  • the hook 630 is capable of interpreting the event from the operating system, and of constructing the event in a message which is comprehensible to the event 602.
  • Hook 630 also dispatches the event to the EventDispatcher 604 , which communicates with each handler for the event, shown as the EventHandler 506 (although there may be a plurality of such handlers). The EventDispatcher 604 then reports to the hook 630 , which reports to the operating system module 628 about the handling of the event.
  • Figures 7A , 7B and 7C show exemplary events, and how they are handled by interactions between the mobile information device (through the operating system of the device) and the system of the present invention. It should be noted that some events may be handled within the system of the present invention, without reference to the mobile information device.
  • FIG. 7A shows an exemplary key event sequence diagram, described according to a mobile information device that has the DMSS operating system infrastructure from Qualcomm Inc., for their MSM (Mobile Station Modem) CDMA (code division multiple access) mobile platform.
  • This operating system provides operating system services such as user interface service, I/O services and interactive input by using the telephone keys (keypad).
  • This example shows how an input event from a key is generated and handled by the system of the present invention.
  • Other events are sent to the system in almost an identical manner, although the function of the hook 630 alters according to the operating system module which is sending the event; a plurality of such hooks is present, such that each hook has a different function with regard to interacting with the operating system.
  • a ui_do_event module 700 is a component of the operating system and is periodically invoked.
  • the user interface (UI) structure which transfers information to the ui_do_event module 700 contains the value of the key.
  • the hook 630 then receives the key value, identifies the event as a key event (particularly if the ui_do_event module 700 dispatches a global event) and generates a key event 702 .
  • the key event 702 is then dispatched to the EventDispatcher 604 .
  • the event is then sent to an application 704 which has requested to receive notification of such an event, preferably through an event handler (not shown) as previously described. Notification of success (or failure) in handling the event is then preferably returned to the EventDispatcher 604 and hence to the hook 630 and the ui_do_event module 700 .
  • Figure 7B shows a second illustrative example of a sequence diagram for handling an event; in this case, the event is passed from the system of the present invention to the operating system, and is related to drawing on the screen of the mobile information device.
  • Information is passed through the screen access method of the operating system, in which the screen is (typically) represented by a frame buffer.
  • the frame buffer is a memory segment that is copied by using the screen driver (driver for the screen hardware) and displayed by the screen.
  • the system of the present invention produces the necessary information for controlling drawing on the screen to the operating system.
  • the operating system (through scrn_update_main module 710 ) first updates the frame buffer for the screen.
  • This updating may involve drawing the background, for example, which is displayed on every part of the screen to which data is not drawn from the information provided by the system of the present invention.
  • the presence of such a background supports the use of semi-transparent windows, which may be used for the creature or agent as described in greater detail below.
  • the Scrn_update_main module 710 then sends a request for updated data to a screen module 712 , which is part of the system of the present invention and which features a hook for communicating with the operating system.
  • the screen module 712 then sends a request to each application window, shown as an agentWindow 714 , of which a plurality may be present, for updated information about what should be drawn to the screen. If a change has occurred, such that an update is required, then the agentWindow 714 notifies the screen module 712 that the update is required.
  • the screen module 712 then asks for the location and size of the changed portion, preferably in two separate requests (shown as arrows 2.1.2.1 and 2.1.2.2 respectively), for which answers are sent by the agentWindow 714 .
  • the screen module 712 returns the information to the operating system through the scrn_update_main module 710 in the form of an updated rectangle, as follows.
  • the scrn_update_main module 710 responds to the notification about the presence of an update by copying the frame buffer to a pre-buffer (process 3.1).
  • the screen module 712 then draws the changes for each window into the pre-buffer, shown as arrow 3.2.1.
  • the pre-buffer is then copied to the frame buffer and hence to the screen (arrow 3.3).
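  • The drawing flow above may be sketched as follows (Python; the window and module interfaces are assumed for illustration and are not the actual implementation):

        # Hypothetical sketch of the dirty-rectangle update flow: windows
        # report changed regions, changes are drawn into a pre-buffer, and
        # the pre-buffer is copied to the frame buffer (and so to the screen).
        class AgentWindow:
            def __init__(self):
                self.dirty = False
                self.rect = (0, 0, 60, 70)   # location and size of changes

            def needs_update(self):
                return self.dirty

            def changed_rect(self):
                return self.rect

        class ScreenModule:
            def __init__(self, windows):
                self.windows = windows

            def collect_updates(self):
                # Ask each application window whether an update is required.
                return [w.changed_rect() for w in self.windows
                        if w.needs_update()]

        def draw_window_changes(buffer, rect):
            pass   # stub: draw the window's changed pixels into the buffer

        def scrn_update_main(screen, frame_buffer):
            rects = screen.collect_updates()
            if rects:
                pre_buffer = bytearray(frame_buffer)       # process 3.1
                for rect in rects:
                    draw_window_changes(pre_buffer, rect)  # arrow 3.2.1
                frame_buffer[:] = pre_buffer               # arrow 3.3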
  • Figure 7C shows the class architecture for the system of the present invention for drawing on the screen.
  • the screen module 712 and the agentWindow 714 are both shown.
  • the class agentWindow 714 also communicates with three other window classes, which provide information regarding updating (changes to) windows: BackScreenWindow 716 , BufferedWindow 718 and DirectAccessWindow 720 .
  • the BufferedWindow 718 has two further window classes with which it communicates: TransBufferedWindow 722 and PreBufferedWindow 724 .
  • This Section describes a preferred embodiment of an action selection system according to the present invention, including but not limited to a description of optional action selection according to incentive(s)/disincentive(s), and so forth.
  • an initial explanation is provided with regard to the structure of the intelligent agent, and the interactions of the intelligent agent with the virtual environment which is provided by the system of the present invention.
  • Figure 8A describes an exemplary structure of the intelligent agent and Figure 8B includes an exemplary sequence diagram for the operation of the intelligent agent.
  • an intelligent agent 800 includes a plurality of classes.
  • the main class is an AICreature 802 , which includes information about the intelligent agent such as its state, personality, goals etc, and also information about the appearance of the creature which visually represents the agent, such as location, color, whether it is currently visible and so forth.
  • the AICreature 802 communicates with World 804 , which is the base class for the virtual environment for the intelligent agent.
  • the World 804 in turn communicates with the classes which comprise the virtual environment, of which some non-limiting examples are shown.
  • World 804 preferably communicates with various instances of a WorldObject 806 , which represents an object that is found in the virtual environment and with which the intelligent agent may interact.
  • the World 804 manages these different objects and also receives information about their characteristics, including their properties such as location and so forth.
  • the World 804 also manages the properties of the virtual environment itself, such as size, visibility and so forth.
  • the visual representation of the WorldObject 806 may use two dimensional or three dimensional graphics, or a mixture thereof, and may also use other capabilities of the mobile information device, such as sound production and so forth.
  • the WorldObject 806 itself may represent an object which belongs to one of several classes. This abstraction enables different object classes to be added to or removed from the virtual environment.
  • the object may be a "ball" which for example may start as part of a menu and then be "removed" by the creature in order to play with it, as represented by a MenuBallObject 808 .
  • a GoodAnimalObject 810 also communicates with the WorldObject 806 ; in turn, classes such as a FoodObject 812 (representing food for the creature), a BadAnimalObject 814 (an animal which may annoy the creature and cause it to fight, for example) and a HouseObject 816 (a house for the creature) preferably communicate with the GoodAnimalObject 810 .
  • the GoodAnimalObject 810 includes the functionality to be able to draw objects on the screen and so forth, which is why other classes and objects preferably communicate with the GoodAnimalObject 810 .
  • other classes and objects are possible in this system, since other toys may optionally be provided to the creature, for example.
  • the WorldObject 806 may also relate to the state of the intelligent agent, for example by providing a graded input to the state.
  • This input is graded in the sense that it provides an incentive to the intelligent agent or a disincentive to the intelligent agent; it may also have a neutral influence.
  • the aggregation of a plurality of such graded inputs enables the state of the intelligent agent to be determined.
  • the graded inputs are preferably aggregated in order to maximize the reward returned to the intelligent agent from the virtual environment.
  • graded inputs may also include input from the user in the form of encouraging or discouraging feedback, so that the intelligent agent has an incentive or disincentive, respectively, to continue the behavior for which feedback has been provided.
  • the graded inputs may, for example, be aggregated as total_feedback = weighting_factor * user_feedback + (1 - weighting_factor) * world_feedback, where weighting_factor is a value between 0 and 1 which indicates the weight of the user feedback as opposed to the virtual environment (world) feedback.
  • Non-limiting examples of such reward for the agent's action include positive or negative feedback on the agent's suggestion; provision of a world object such as a ball or food to the agent; telephone usage duration; user teaching duration; and the like.
  • Each of these examples can be assigned a predetermined score, and the agent's action can be restricted or expanded according to a corresponding accumulated score.
  • positive and negative feedback provided by the user may be assigned positive and negative point values, respectively; encountering an enemy or bad animal: -20 points; obtaining a food, toy or house object: +5 points; low battery alarm: -1 point; correct and incorrect answers, when the agent teaches the user: +1 point and -1 point, respectively; inactivity for 20 minutes: -1 point; wrong dialing: -1 point; SMS use: +1 point; and the like.
  • the above examples may be applied in other ways.
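  • One way (among others) to apply them is sketched below; the event names and the accumulator class are assumed for illustration, while the scores follow the examples above:

        # Hypothetical bookkeeping of graded rewards: each event carries a
        # predetermined score, and the agent's permitted range of actions
        # can be restricted or expanded according to the accumulated total.
        EVENT_SCORES = {
            "enemy_encountered": -20,
            "food_toy_or_house_obtained": +5,
            "low_battery_alarm": -1,
            "correct_answer": +1,
            "incorrect_answer": -1,
            "inactive_20_minutes": -1,
            "wrong_dialing": -1,
            "sms_used": +1,
        }

        class RewardAccumulator:
            def __init__(self):
                self.total = 0

            def record(self, event_name):
                self.total += EVENT_SCORES.get(event_name, 0)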
  • Figure 8B shows an illustrative sequence diagram for an exemplary set of interactions between the virtual world and the intelligent agent of the present invention.
  • the sequence starts with a request from a virtual world module 818 to the AICreature 802 for an update on the status of the intelligent agent.
  • a virtual world module 818 controls and manages the entire virtual environment, including the intelligent agent itself.
  • the intelligent agent then considers an action to perform, as shown by arrow 1.1.1.
  • the action is preferably selected through a search (arrow 1.1.1.1) through all world objects, and then recursively through all actions for each object, by interacting with the World 804 and the WorldObject 806.
  • the potential reward for each action is evaluated (arrow 1.1.1.1.1.1) and graded (arrow 1.1.1.1.1.1.2).
  • the action with the highest reward is selected.
  • the overall grade for the intelligent agent is then determined and the AICreature 802 performs the selected action.
  • the Virtual_world 818 then updates the location and status of all objects in the world, by communicating with the World 804 and the WorldObject 806.
  • Figures 9A and 9B show two exemplary methods for selecting an action according to the present invention.
  • Figure 9A shows an exemplary method for action selection, termed herein a rule based strategy for selecting an action.
  • in stage 1, the status of the virtual environment is determined by the World state.
  • a World Event occurs, after which the State Handler which is appropriate for that event is invoked in stage 2.
  • the State Handler preferably queries a knowledge base in stage 3.
  • the knowledge base may be divided into separate sections and/or separate knowledge bases according to the State Handler which has been invoked.
  • a response is returned to the State Handler.
  • rule base validation is performed, in which the response (and hence the suggested action which in turn brings the intelligent agent into a specific state) is compared against the rules. If the action is not valid, then the process returns to stage 1. If the action is valid, then in stage 6 the action is generated. The priority for the action is then determined in stage 7; more preferably, the priority is determined according to a plurality of inputs, including but not limited to, an action probability, an action utility and a user preference. In stage 8, the action is placed in a queue for the action manager. In stage 9, the action manager retrieves the highest priority action, which is then performed by the intelligent agent in stage 10.
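  • The rule based strategy may be sketched as follows (Python; all interfaces and the priority combination are assumptions for illustration, not the actual implementation):

        # Hypothetical sketch of stages 1-10 above: a world event invokes a
        # state handler, the knowledge base suggests an action, the rule
        # base validates it, a priority is computed, and the action manager
        # executes the highest-priority queued action.
        import heapq
        import itertools

        _seq = itertools.count()   # tie-breaker so equal priorities compare

        def on_world_event(event, handlers, knowledge_base, rules, queue):
            handler = handlers[event.kind]                      # stage 2
            suggestion = handler.query(knowledge_base, event)   # stages 3-4
            if not rules.validate(suggestion):                  # stage 5
                return                                          # back to stage 1
            action = suggestion.to_action()                     # stage 6
            # Stage 7: one possible combination of action probability,
            # action utility and user preference into a single priority.
            priority = (action.probability * action.utility
                        + action.user_preference)
            heapq.heappush(queue, (-priority, next(_seq), action))  # stage 8

        def action_manager_step(queue, agent):
            if queue:
                _, _, action = heapq.heappop(queue)             # stage 9
                agent.perform(action)                           # stage 10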
  • Figure 9B shows an exemplary action selection method according to a graph search strategy.
  • the process begins by determining the state of the world (virtual environment), including the state of the intelligent agent and of the objects in the world.
  • the intelligent agent is queried.
  • the intelligent agent obtains a set of legal (permitted or possible) actions for each world object; preferably each world object is queried as shown.
  • an action to be performed is simulated.
  • the effect of the simulation is determined for the world, and is preferably determined for each world object in stage 6.
  • a grade is determined for the effect of each action.
  • in stage 8, the state of the objects and hence of the world is determined, as is the overall accumulated reward of an action.
  • in stage 9, the effect of the action is simulated on the intelligent agent; preferably the effect between the intelligent agent and each world object is also considered in stage 10.
  • in stage 11, all of this information is preferably used to determine the action path with the highest reward.
  • the action is generated.
  • the action priority is set, preferably according to the action grade or reward.
  • the action is placed in a queue at the action manager, as in Figure 9A .
  • the action is considered by the action manager according to priority; the highest priority action is selected, and is executed in stage 16.
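  • A sketch of the graph search strategy follows (Python; the object interfaces are assumed for illustration):

        # Hypothetical sketch of stages 1-16 above: every legal action on
        # every world object is simulated, the simulated effects are graded,
        # and the action with the highest accumulated reward is selected.
        def select_action(agent, world):
            best_action, best_reward = None, float("-inf")
            for obj in world.objects:                        # stage 3
                for action in obj.legal_actions(agent):      # stage 4
                    outcome = world.simulate(agent, action)  # stages 5-6
                    reward = outcome.grade()                 # stages 7-8
                    reward += outcome.effect_on(agent)       # stages 9-10
                    if reward > best_reward:                 # stage 11
                        best_action, best_reward = action, reward
            # Stages 12-13: the action is generated and its priority is set
            # according to the reward; it is then queued as in Figure 9A.
            return best_action, best_reward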
  • FIG. 10 shows a sequence diagram of an exemplary action execution method according to the present invention.
  • a handler 1000 sends a goal for an action to an action module 1002 in arrow 1, which features a base action interface.
  • the base action interface enables the action module 1002 to communicate with the handler 1000 and also with other objects in the system, which are able to generate and post actions for later execution by the intelligent agent, shown here as a FloatingAgentApp 1006 .
  • These actions are managed by an action manager 1004 .
  • the action manager 1004 has two queues containing action objects. One queue is the ready for execution queue, while the other queue is the pending for execution queue. The latter queue may be used for example if an action has been generated, but the internal state of the action is pending so that the action is not ready for execution. When the action state matures to be ready for execution, the action is preferably moved to the ready for execution queue.
  • An application manager 1008 interacts with the FloatingAgentApp 1006 for executing an action, as shown in arrow 2.
  • the FloatingAgentApp 1006 requests the next action from the action manager 1004 (arrow 2.1); the action itself is provided by the action module 1002 (arrow 2.2.1).
  • Actions are enqueued from the handler 1000 to the action manager 1004 (arrow 3).
  • Goals (and hence at least a part of the priority) are set for each action by communication between the handler 1000 and the action module 1002 (arrow 4).
  • Arrows 5 and 6 show the harakiri () method, described in greater detail below.
  • the actions are queued in priority order.
  • the priority is determined through querying the interface of the action module 1002 by the action manager 1004 .
  • the priority of the action is determined according to a calculation which includes a plurality of parameters.
  • the parameters may include the priority as derived or inferred by the generating object, more preferably based upon the predicted probability for the success of the action; the persistent priority for this type of action, which is determined according to past experience with this type of action (for example according to user acceptance and action success); and the goal priority, which is determined according to the user preferences.
  • P_all = (P_action_probability * P_persistent_priority + P_action_goal / 10) / 2
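  • As a worked illustration only (the function and parameter names are assumed), this calculation might be implemented as:

        def overall_priority(p_action_probability, p_persistent, p_goal):
            # P_all = (P_action_probability * P_persistent_priority
            #          + P_action_goal / 10) / 2
            return (p_action_probability * p_persistent + p_goal / 10) / 2

        # e.g. probability 0.8, persistent priority 0.5, goal priority 7:
        print(overall_priority(0.8, 0.5, 7.0))   # -> 0.55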
  • each action preferably has a Time To Live (ttl) period; this ttl value specifies the amount of execution time that may pass between the time when the action was posted in the ready queue and the expiration time of this action.
  • the action manager 1004 preferably invokes the method harakiri(), which notifies the action that it will not be executed.
  • harakiri() preferably decreases the priority of the action until a threshold is reached. After this threshold has been reached, the persistent priority starts to increase.
  • This model operates to handle actions that were proposed or executed but failed since the user aborted the action. The persistent priority decreases by incorporating the past experience in the action priority calculation.
  • This method shows how actions that were suggested or executed adapt to the specific user's implicit preferences in realtime.
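  • A minimal sketch of the ttl/harakiri mechanism (Python; the decay and recovery factors are assumptions for illustration):

        # Hypothetical sketch: an action that expires in the ready queue is
        # notified via harakiri(), which lowers its persistent priority so
        # that future proposals adapt to the user's implicit preferences;
        # past a threshold the persistent priority starts to recover.
        import time

        class Action:
            THRESHOLD = 0.1

            def __init__(self, ttl_seconds, persistent_priority=1.0):
                self.expires_at = time.time() + ttl_seconds
                self.persistent_priority = persistent_priority

            def expired(self):
                return time.time() > self.expires_at

            def harakiri(self):
                # Notified that the action will not be executed.
                if self.persistent_priority > self.THRESHOLD:
                    self.persistent_priority *= 0.5   # decrease toward threshold
                else:
                    self.persistent_priority *= 1.1   # then start to increase

        def purge_expired(ready_queue):
            for action in list(ready_queue):
                if action.expired():
                    action.harakiri()
                    ready_queue.remove(action)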
  • EXAMPLE 3 EVOLUTION SYSTEM FOR AN INTELLIGENT AGENT
  • This example describes a preferred embodiment of an evolution system according to the present invention, including but not limited to a description of DNA (DeoxyriboNucleic Acid) for the creature or avatar according to a preferred embodiment of the present invention, and also a description of an optional gene studio according to the present invention.
  • the evolution system enables the creature or avatar to "evolve", that is, to alter at least one aspect of the behavior and/or appearance of the creature.
  • This example is described as being operative with the intelligent agent described in example 2, but this description is for the purposes of illustration only and is not meant to be limiting in any way.
  • the evolution system for the intelligent agent described in this example may be used (but not necessarily) in conjunction with the learning module and the action selection system described above, thereby making it possible to implement a system that can determine the user's preferences and actively evolve without requiring explicit requests from the user.
  • Evolution (change) of the intelligent agent is described herein with regard to both tangible features of the agent, which are displayed by the avatar or creature, and non-tangible features of the agent, which affect the behavior of the avatar or creature.
  • Figure 11A shows an exemplary evolution class diagram 1800 .
  • the genetic model described in the class diagram allows for various properties of the intelligent agent to be changed, including visual as well as functional properties.
  • the model includes a CreatureDNA class 1802 that represents the DNA structure.
  • the DNA structure is a vector of available genes and can preferably be extended to incorporate new genes.
  • a gene is a parameter with a range of possible values (i.e. genotype).
  • the gene is interpreted by the system according to the present invention, such that the expression of the data in the gene is its genotype.
  • the head gene is located as the first gene in the DNA, and its value is expressed as the visual structure of the creature's head, although preferably the color of the head is encoded in another gene.
  • the genetic model according to the present invention implements hybrid and mutate genetic operations that modify the DNA.
  • the CreatureProxy class 1804 is responsible for providing an interface to the DNA and to the genetic operations for the system classes. CreatureProxy 1804 holds other non-genetic information about the intelligent agent (i.e. name, birth date, and so forth).
  • the EvolutionMGR class 1806 manages the evolutions of the intelligent agent and provides an interface to the CreatureProxy 1804 of the intelligent agent and its genetic operations to applications.
  • the EvolutionEngine class 1808 listens to evolution events that may be generated from time to time, for indicating that a certain genetic operation should be invoked and performed on the intelligent agent DNA.
  • the DNA structure is given below.
  • the CreatureDNA 1802 preferably listens to such evolution events from the EvolutionEvent 1810 .
  • the following is an algorithm defining an exemplary DNA structure.
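  • As a minimal sketch only (Python; the gene names follow the building blocks discussed below, while the class layout is an assumption), such a DNA structure might be:

        # Hypothetical DNA vector: one gene per building block, each gene an
        # integer within its range of expression levels.
        CREATURE_DNA_GENES = [
            ("head",  16),   # 16 expression levels: values 0..15
            ("body",  16),
            ("legs",  16),
            ("hands", 16),
            ("tail",  16),
        ]

        class CreatureDNA:
            def __init__(self, values=None):
                self.values = (list(values) if values
                               else [0] * len(CREATURE_DNA_GENES))

            def gene_levels(self, index):
                return CREATURE_DNA_GENES[index][1]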
  • Intelligent agent DNA construction is preferably performed as follows.
  • the DNA is preferably composed from a Gene for each Building Block of the intelligent agent.
  • a building block can be a visual part of the agent, preferably including color or scale (size of the building block), and can also be a non-visual property that relates to the functionality and behavior of the intelligent agent.
  • This model of DNA composition can be extended as more building blocks can be added and the expression levels of each building block can increase.
  • each gene building block value (expression level) describes a different genotype expressed in the composed agent.
  • the basic building blocks of the visual agent are modeled as prototypes, hence the number of prototypes dictates the range of each visual gene. It is also possible to generate values of expressed genes at runtime without relying on prototypes; for example, color gene expression levels can be computed as indexes into the host platform color table, and scale can likewise be computed with respect to the host screen size, to obtain genotypes that are independent of predefined prototypes.
  • the prototype models are decomposed and then a non-prototype agent is recomposed according to the gene values of each building block.
  • DNA_0 = [ [head, 0:15], [body, 0:15], [legs, 0:15], [hands, 0:15], [tail, 0:15] ]
  • Each of the 5 building blocks has 16 different possible genotypes according to the building block gene values that are derived from the number of prototype models.
  • the right building block is taken according to the value of that building block in the DNA, which is the value of its respective gene.
  • for example, DNA = [3, 5, 10, 13, 0]
  • DNA_1 = [ [head, 0:15], [body, 0:15], [legs, 0:15], [hands, 0:15], [tail, 0:15], [bs_color, 0:15] ]
  • DNA_2 = [ [head, 0:15], [body, 0:15], [legs, 0:15], [hands, 0:15], [tail, 0:15], [bs_color, 0:15], [intensity, 0:15] ]
  • the present invention can express a variety of agent combination types as described above without storing the information for each completed combination type. Only the information about the building blocks of the combination types, together with the information about the method for combining the building blocks, is needed to produce this variety. Accordingly, when the agent is used with a portable computational device, each device user can hold a substantially unique type of agent, thanks to the diversity of the combination methods.
  • the threshold time Tth is set to 2 weeks, but may also be set differently.
  • the growth time Tg of the agent indicates a time period from when the computational device user resets the agent or starts using the agent for the first time, to the current time.
  • a trait expressed by the DNA 0 may be selected from a combination of first building blocks if the growth time of the agent is less than 2 weeks, whereas a trait expressed by the DNA 1 may be selected from a combination of second building blocks if the growth time is 2 weeks or more.
  • the first building-block combination is set to represent the appearance of a younger agent
  • the second building-block combination is set to represent the appearance of a more grown-up agent
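  • This growth rule may be sketched as follows (Python; the function and set names are assumed for illustration):

        # Hypothetical sketch of automatic growth: the same agent expresses
        # the younger (DNA0) or more grown-up (DNA1) building-block
        # combination depending on whether the growth time Tg has reached
        # the threshold Tth (here 2 weeks).
        from datetime import datetime, timedelta

        T_TH = timedelta(weeks=2)

        def select_block_set(first_use: datetime, now: datetime):
            growth_time = now - first_use          # Tg
            if growth_time < T_TH:
                return "first_building_blocks"     # younger appearance
            return "second_building_blocks"        # more grown-up appearance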
  • the basic mutation operation randomly selects a gene from the gene set that can be mutated, which may be the entire DNA, and then changes the value of the selected gene within that gene's possible range (expression levels).
  • the basic operation can be performed numerous times.
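  • A minimal sketch of this mutation operation (Python; the gene ranges follow the DNA examples above, the rest is assumed):

        # Hypothetical basic mutation: pick a random gene and give it a new
        # value within that gene's range of expression levels.
        import random

        GENE_LEVELS = [16, 16, 16, 16, 16]   # head, body, legs, hands, tail

        def mutate(dna, times=1):
            mutant = list(dna)
            for _ in range(times):
                i = random.randrange(len(mutant))
                mutant[i] = random.randrange(GENE_LEVELS[i])
            return mutant

        print(mutate([3, 5, 10, 13, 0]))   # e.g. [3, 5, 2, 13, 0]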
  • a mutate application 1812 sends a request to the EvolutionMGR 1806 (arrow 1.1) to create a mutant.
  • the EvolutionMGR class 1806 passes this request to the CreatureProxy 1804 , for a number of mutants (this value may be given in the function call; arrow 1.1.1).
  • the CreatureProxy 1804 preferably selects a random gene (arrow 1.1.1.1.1) and changes it to a value that is still within the gene's range (arrow 1.1.1.1.2).
  • the mutant(s) are then returned to the mutate application 1812 , and are preferably displayed to the user, as described in greater detail below with regard to Example 4.
  • the mutate application 1812 sends a command to replace the existing implementation of the agent with the new mutant (arrow 2.1) to the EvolutionMGR 1806 .
  • the EvolutionMGR 1806 sets the DNA for the creature at the CreatureProxy 1804 (arrow 2.1.1), which preferably then updates the history of the agent at the agent_history 1814 (arrow 2.1.1.1).
  • Figure 11C shows an exemplary sequence diagram for the basic hybrid operation (or cross-over operation), which occurs when two candidate DNAs are aligned one to the other.
  • both of the candidate DNAs may be obtained from the intelligent agent system.
  • One of the two candidate DNAs may also be obtained from an intelligent agent system for another mobile information device.
  • as described below with regard to an intelligent agent for a networked mobile information device in Example 5, one of the two candidate DNAs may be obtained from an intelligent agent for a second mobile information device of a second user via a short message service (SMS).
  • DNA_0 = [ [gender, 0:1], [head, 0:15], [body, 0:15], [legs, 0:15], [hands, 0:15], [tail, 0:15] ]
  • the gender gene determines whether the hybrid operation is allowed. Preferably, the hybrid operation is allowed only between different gender genes. However, if the gender gene is not taken into consideration, the hybrid operation may be allowed in any case.
  • in the hybrid operation, one or more cross-over points located on the DNA vector are preferably selected (the number of cross-over points can vary from 1 to the number of genes in the DNA; this number may be randomly selected).
  • the operation of selecting the crossover points is called get_cut_index.
  • at each point, the value for the hybrid DNA is selected from one of the two existing DNA values. This may be performed randomly or according to a count called a cutting_index.
  • the hybrid operation on the gender gene is performed by selecting one of the two corresponding genes. The result is a mix between the two candidate DNAs.
  • the basic hybrid operation can be performed numerous times with numerous candidates.
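  • A minimal sketch of this cross-over operation (Python; the cut selection and switching rule are assumptions for illustration):

        # Hypothetical basic hybrid operation: cut indices are chosen along
        # the DNA vector, and the source DNA switches at each cut, mixing
        # the two candidates.
        import random

        def hybrid(dna_a, dna_b, num_cuts=1):
            n = len(dna_a)
            cuts = sorted(random.sample(range(1, n), min(num_cuts, n - 1)))
            child, source = [], 0
            for i in range(n):
                if cuts and i == cuts[0]:   # cross over at this cutting index
                    source = 1 - source
                    cuts.pop(0)
                child.append((dna_a if source == 0 else dna_b)[i])
            return child

        print(hybrid([3, 5, 10, 13, 0], [1, 2, 3, 4, 5], num_cuts=2))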
  • a HybridApp 1816 sends a command to the EvolutionMGR 1806 to begin the process of hybridization.
  • the process is optionally performed until the user approves of the hybrid agent or aborts the process.
  • the EvolutionMGR 1806 starts hybridization by sending a command to obtain target DNA (arrow 2.1.1) from the CreatureProxy 1804 , with a number of cross-overs (hybridizations) to be performed.
  • a cutting_index is maintained to indicate when to do a cross-over between the values of the two DNAs.
  • the hybrid agent is returned, and if the user approves, then the current agent is replaced with the hybrid agent, as described above with regard to the mutant process. In the end, the history of the agent at the agent_history 1814 is updated.
  • Hybridization may be performed with agent DNA that is sent from a source external to the mobile information device, for example in an SMS message, through infrared, Bluetooth or the Internet, or any other source.
  • this process is illustrated with regard to receiving such hybrid DNA through an SMS message.
  • the SMS message preferably contains the data for the DNA in a MIME type. Moreover, the system of the present invention has a hook for this MIME type, so that this type of SMS message is automatically parsed for hybridization without requiring manual intervention by the user.
  • Figure 12 shows an exemplary sequence diagram of such a process.
  • User 1 sends a request to hybridize the intelligent agent of User 1 with that of User 2 through Handset 1.
  • User 2 can optionally approve or reject the request through Handset 2. If User 2 approves, the hybrid operation is performed between the DNA from both agents on Handset 1. The result is displayed to the requesting party (User 1), who may save this hybrid as a replacement for the current agent. If the hybrid is used as the replacement, then User 2 receives a notice and saves the hybrid to the hybrid results collection on Handset 2.
  • This Example is described with regard to a plurality of representative, non-limiting, illustrative screenshots, in order to provide an optional but preferred embodiment of the system of the present invention as it interacts with the user.
  • Figure 13 shows an exemplary screenshot of the "floating agent", which is the creature or avatar (visual expression of the intelligent agent).
  • Figure 14 shows an exemplary screenshot of a menu for selecting objects for the intelligent agent's virtual world.
  • Figure 15A shows the Start Wizard application, which allows the user to configure and modify the agent settings, as well as user preferences.
  • Figure 15B-15F show exemplary screenshots of an initial setting mode for an agent after the start wizard is activated, where Figure 15B shows a screenshot of a setting mode for selecting the type of the agent; Figure 15C for selecting a color thereof; Figure 15D for selecting a name thereof; Figure 15E for selecting a personality thereof; and Figure 15F for indicating the completion of the agent setting.
  • One example of an action to be performed with the wizard is to Set Personality, to determine settings for the emotional system of the intelligent agent.
  • the user can configure the creature's personality and tendencies.
  • the user can determine the creature's settings by pressing the right arrow key to increase the level of a characteristic and the left arrow key to decrease it, for the various characteristics such as Enthusiasm, Sociability, Anti_social behavior, Temper (level of patience), Melancholy, Egoistic behavior, and so forth.
  • the user is also able to set User Preferences, for example to determine how quickly to receive help.
  • Some other non-limiting examples of these preferences include: communication (extent to which the agent communicates); entertain_user (controls agent playing with the user); entertain_self (controls agent playing alone); preserve_battery (extends battery life); and transparency_level (the level of the transparency of the creature).
  • the user also sets User Details with the start wizard, including but not limited to, user name, birthday (according to an optional embodiment of the present invention, this value is important for Hybrid SMS since it defines the "konghup" possibility between users, which is the ability to create a hybrid with a favorable astrology pattern; the konghup option is built according to suitable tables of horoscopes and dates), and gender.
  • the "konghup” also called “goong-hap” is a Korean word used to describe marital harmony as predicted by a fortuneteller, and the konghup possibility can be defined as the possibility of a favorable astrology pattern for inter-personal relationship.
  • the user can also preferably set Creature Details.
  • Figure 16 shows an exemplary menu for performing hybridization through the hybrid application as previously described.
  • Figure 17A shows an exemplary screenshot for viewing a new creature and generating again, by pressing on the Generate button, which enables the user to generate a creature randomly.
  • Figure 17B shows the resultant creature in a screenshot with a Hybrid button: pressing on this button confirms the user's creature selection and passes to the creature preview window.
  • the preview window allows the user to see the newly generated creature in three dimensions, and optionally to animate the creature by using the following options:
  • the animations that the creature can perform include but are not limited to, walking, sitting, smelling, flying, and jumping.
  • Figure 18 shows an exemplary screenshot of the hybrid history, which enables the user to review and explore the history of the creature's changes in the generations.
  • the user can see the current creature and its parents, and also the parents of the parents. Preferably, for every creature there can be at most 2 parents.
  • the creature can also be marked to indicate that a mutation has occurred.
  • Figure 19 shows an exemplary screen shot of the Gene studio, with the DNA Sequence of the current creature.
  • the gene studio also preferably gives the opportunity for the user to change and modify the agent's DNA sequence.
  • the agent's DNA sequence displayed on the gene studio screen is preferably composed of a sequence of four letters A, G, C and T.
  • the four letters represent the four bases constituting biological DNA.
  • the present invention introduces the four letters so that the user becomes more familiar with the agent DNA.
  • the hybrid history or the information as to whether a mutation is selected is stored in the agent system.
  • the learning module can determine preferences or tendencies of the user on the basis of the stored information, and the action selection system can provide an evolution event, according to the determined user preferences or tendencies, to the evolution class diagram.
  • Information as to whether the user selects the result of the performance of the provided evolution event is stored in the agent system, so that the stored information is referred to when the next evolution event is provided.
  • the intelligent agent comprises an avatar for interacting with the user, and an agent for interacting with other components on the network, such as other mobile information devices, and/or the network itself.
  • the avatar forms the user interface (or a portion thereof) and also has an appearance, which is more preferably three-dimensional. This appearance may be humanoid but may alternatively be based upon any type of character or creature, whether real or imaginary.
  • the agent then handles the communication between the avatar and the mobile information device, and/or other components on the network, and/or other avatars on other mobile information devices.
  • the intelligent agent of the present invention is targeted at creating enhanced emotional experience by applying the concept of a "Living Device".
  • This concept includes both emphases upon the uniqueness of the intelligent agent, as every living creature is unique and special in appearance and behavior, while also providing variety, such as a variety of avatar appearances to enhance the user's interaction with the living device.
  • the avatar preferably has compelling visual properties, with suitable supplementary objects and surrounding environment.
  • the intelligent agent preferably displays intelligent decision making, with unexpected behavior that indicates its self-existence and independent learning.
  • independent behavior is an important aspect of the present invention, as it has not been previously demonstrated for any type of user interface or interaction for a user and a computational device of any type, and has certainly not been used for an intelligent agent for a mobile information device.
  • the intelligent agent can also evolve with time, as do all living things, displaying visual changes. This is one of the most important "Living Device" properties.
  • the evolution step initiates an emotional response from the user of surprise and anticipation for the next evolution step.
  • Evolution is a visual change of the creature with respect to time.
  • the time frame may be set to a year, for example, as this is the lifecycle of a midrange cellular telephone in the market. During the year, periodic changes, for example quarterly, preferably occur through evolution.
  • the evolutionary path (adaptation to the environment) is a result of natural selection.
  • the natural selection can be user driven (i.e. user decides if the next generation is better), although another option is a predefined natural selection process by developing some criteria for automatic selection.
  • the intelligent agent may be implemented for functioning in two "worlds" or different environments: the telephone world and the virtual creature world.
  • the telephone (mobile information device) world enables the intelligent agent to control different functions of the telephone and to suggest various function selections to the user, as previously described.
  • the intelligent agent is able to operate on the basis of one or more telephone usage processes that are modeled for the agent to follow.
  • Another important aspect of the telephone world is emotional expressions, which can be either graphic expressions, such as breaking the screen or free drawing, or facial and text expressions, such as one or two relevant words for the specific case.
  • the virtual world is preferably a visual display and playground area, where objects other than the avatar can be inserted and the user can observe the avatar learning and interacting with them.
  • the objects that are entered into the world can be predefined, with different behaviors resulting from the learning process.
  • the user can give rewards or disincentives and be part of the learning process.
  • the intelligent agent through the appearance of the avatar may act as a type of virtual pet or companion (for example, act as a running puppy or a laughing person).
  • Some preferred aspects of the intelligent agent include but are not limited to, a 3D graphic infrastructure (with regard to the appearance of the avatar); the use of AI and machine learning mechanisms to support both adaptive and proactive behavior; the provision of gaming capabilities; the ability to enhance the usability of the mobile information device and also to provide specific user assistance; and provision of a host platform abstraction layer. Together, these features provide a robust, compelling and innovative content platform to support a plurality of AI applications all generically defined to be running on the mobile information device.
  • the avatar also preferably has a number of important visual aspects.
  • the outer clip size may optionally be up to 60 x 70 pixels, although a different resolution may be selected according to the characteristics of the screen display of the mobile information device.
  • the avatar is preferably represented as a 3D polygonal object with several colors, but in any case preferably has a plurality of different 3D visual characteristics, such as shades, textures, animation support and so forth. These capabilities may be provided through previously created visual building blocks that are stored on the mobile information device.
  • the visual appearance of the avatar is preferably composed in runtime.
  • the avatar may start "living” after a launch wizard, taking user preferences into account (user introduction to the living device).
  • the avatar may display small visual changes that represent mutations (color change / movement of some key vertices in a random step).
  • a visual evolution step is preferably performed by the addition or replacement of a building block.
  • the avatar can preferably move in all directions and rotate, and more preferably is a fully animated 3D character.
  • the avatar is preferably shown as floating over the mobile information device display with the mobile information device user interface in the background, but may also be dismissed upon a request by the user.
  • the avatar is preferably able to understand the current user's normal interaction with the mobile information device and tries to minimize forced hiding/dismissal by the user.
  • the avatar can be programmed to "move" on the screen in a more natural, physically realistic manner.
  • various types of algorithms and parameters are available which attempt to describe physically realistic behavior and movement for controlling the movement of robots. Examples of such algorithms and parameters are described in " Automatic Generation of Kinematic Models for the Conversion of Human Motion Capture Data into Humanoid Robot Motion", A. Ude et al., Proc. First IEEE-RAS Int. Conf. Humanoid Robots (Humanoids 2000), Cambridge, MA, USA, September 2000 .
  • This reference describes various human motion capture techniques, and methods for automatically translating the captured data into humanoid robot kinetic parameters. Briefly, both human and robotic motion are modeled, and the models are used for translating actual human movement data into data that can be used for controlling the motions of humanoid robots.
  • This type of reference is useful as it provides information on how to model the movement of the humanoid robot.
  • since the present invention is concerned with realistic movement of an avatar (a virtual character depicted three-dimensionally), similar models could optionally be used for the avatar as for the humanoid robot.
  • a model could also be constructed for modeling animal movements, thereby permitting more realistic movement of an animal or animal-like avatar.
  • the system can handle any given set of 3D character data generically.
  • models could also be used to permit the movement of the avatar to evolve, since different parameters of the model could be altered during the evolutionary process, thereby changing how the avatar moves.
  • Such models are also useful for describing non-deterministic movement of the avatar, and also for enabling non-deterministic movements to evolve. Such non-deterministic behavior also helps to maintain the interest of the user.
  • the intelligent agent may be constructed as described below with regard to Figures 20-23b , although it should be noted that these figures only represent one exemplary implementation and that many different implementations are possible. Again, the implementation of the intelligent agent may incorporate or rely upon the implementations described in Examples 1 and 2 above.
  • FIG 20 is a block diagram of an intelligent agent system 2700 according to the present invention.
  • a first user 2702 controls a first mobile information device 2704 , which for the purpose of this example may be implemented as a cellular telephone for illustration only and without any intention of being limiting.
  • a second user 2706 controls a second mobile information device 2708 .
  • the first mobile information device 2704 and the second mobile information device 2708 preferably communicate through a network 2710 , for example through messaging.
  • Each of the first mobile information device 2704 and the second mobile information device 2708 preferably features an intelligent agent, for interacting with their respective users 2702 and 2706 and also for interacting with the other intelligent agent. Therefore, as shown, the system 2700 enables a community of such intelligent agents to interact with each other, and/or to obtain information for their respective users through the network 2710 , for example.
  • the interactions of the users 2702 and 2706 with their respective mobile information devices 2704, 2708 preferably include the regular operation of the mobile information device, but also add the new exciting functionalities of "living mobile phone".
  • These functionalities can include the intelligent agent but also the use of an avatar for providing a user interface and also more preferably for providing an enhanced user emotional experience.
  • the intelligent agent preferably features an "aware" and intelligent software framework.
  • the inner operation of such a system preferably involves several algorithmic tools, including but not limited to AI and ML algorithms.
  • the system 2700 may involve interactions between multiple users as shown. Such interactions increase the usability and enjoyment of using the mobile information device for the end-user.
  • Figure 21 shows the intelligent agent system of Figure 20 in more detail.
  • a first intelligent agent 2800 is able to operate according to scenario data 2802 (such as the previously described knowledge base) in order to be able to take actions, learn and make decisions as to the operation of the mobile information device.
  • the learning and development process of the first intelligent agent 2800 is supported by an evolution module 2804 for evolving as previously described. If the first intelligent agent 2800 communicates with the user through an avatar, according to a preferred embodiment of the present invention, then an animation module 2806 is used to support the appearance of the avatar.
  • the first intelligent agent 2800 may also communicate through the network (not shown) with a backend server 2808 and/or another network resource such as a computer 2810 , for example for obtaining information for the user.
  • the first intelligent agent 2800 may also communicate with a second intelligent agent 2812 as shown.
  • Figure 22 shows a block diagram of an exemplary implementation of an action selection system 2900 according to the present invention, which provides the infrastructure for enabling the intelligent agent to select an action.
  • the action selection system 2900 preferably features an ActionManager 2902 (see also Figure 10 for a description), which actually executes the action.
  • a BaseAction interface 2904 provides the interface for all actions executed by the ActionManager 2902 .
  • Actions may use device and application capabilities, denoted as an AnimationManager 2906 and a SoundManager 2908 , that are necessary to perform the specific action. Each action aggregates the appropriate managers for correct execution.
  • the AnimationManager 2906 may also control a ChangeUIAction 2910 , which changes the appearance of the visual display of the user interface.
  • the AnimationManager 2906 may also control a GoAwayFromObjectAction 2912 and a GoTowardObjectAction 2914 , which enables the avatar to interact with virtual objects in the virtual world of the avatar.
  • Figures 23A and 23B show two exemplary, illustrative non-limiting screenshots of the avatar according to the present invention on the screen of the mobile information device.
  • Figure 23A shows an exemplary screenshot of the user interface for adjusting the ring tone volume through an interaction with the avatar.
  • Figure 23B shows an exemplary screenshot of the user interface for receiving a message through an interaction with the avatar.

Claims (14)

  1. An evolution system for an intelligent agent for a computational device having an operating system, said evolution system comprising:
    an avatar (802) which is a user interface (204) through which the intelligent agent is adapted to communicate with a user of the evolution system;
    at least one software application controlled by said operating system; and
    an evolution class for enabling visual or functional properties of said agent to be modified using algorithmic DNA (deoxyribonucleic acid), wherein the DNA is a vector of available genes for the visual or functional properties of the intelligent agent, a gene being a parameter with a range of possible values, and the expression of the data of the parameter being a genotype of the intelligent agent,
    dans lequel ledit agent intelligent sélectionne au hasard un gène à partir de son ADN et modifie une valeur du gène sélectionné pour mettre en oeuvre une mutation, dans lequel ladite mutation est mise en oeuvre en remplaçant l'agent intelligent par un mutant ayant le gène muté après avoir affiché le mutant à l'utilisateur et obtenu l'approbation de l'utilisateur pour le mutant.
  2. Système évolutif selon la revendication 1, dans lequel ladite classe « evolution » comprend :
    une classe « creature DNA » (1802) pour représenter une structure ADN de l'ADN ;
    une classe « creature proxy » (1804) pour fournir une interface à l'ADN et à des opérations génétiques pour des classes dudit système évolutif ;
    une classe « evolution MGR » (1806) pour gérer les évolutions dudit agent intelligent et fournir une interface à la classe « creature proxy » et des opérations génétiques de la classe « creature proxy » à ladite au moins une application logicielle ; et
    une classe « evolution engine » (1810) pour être à l'écoute des événements évolutifs générés pour indiquer qu'une certaine opération génétique doit être invoquée et exécutée sur l'ADN de l'agent intelligent.
  3. Système évolutif selon la revendication 1, dans lequel ledit vecteur contient au moins un élément parmi une tête, un corps, une main, une queue, une jambe, une couleur de tête, une couleur de corps, une couleur de main, une couleur de queue, une couleur de jambe, une échelle de tête, une échelle de corps, une échelle de main, une échelle de queue, une échelle de jambe, une dextérité, une efficacité, une interactivité et une couleur de base.
  4. Système évolutif selon la revendication 1, dans lequel ledit agent intelligent est construit en utilisant un gène inclus dans la structure ADN et une valeur dudit gène.
  5. Système évolutif selon la revendication 1, dans lequel, pour le même ADN, ledit agent intelligent exprime un trait correspondant à une combinaison de premiers blocs de base si le temps de croissance dudit agent intelligent est inférieur à un intervalle de temps prédéterminé et un trait correspondant à une combinaison de deuxièmes blocs de base si le temps de croissance est supérieur ou égal à l'intervalle de temps prédéterminé, de telle sorte que ledit agent intelligent croît automatiquement à mesure que le temps passe, dans lequel chaque bloc de base de l'agent intelligent est une propriété visuelle ou fonctionnelle de l'agent intelligent et correspond à un gène de l'agent intelligent.
  6. Système évolutif selon la revendication 5, dans lequel ledit intervalle de temps prédéterminé est d'au moins 2 semaines.
  7. Système évolutif selon la revendication 1, dans lequel ledit agent intelligent effectue une opération d'hybridation sur son ADN avec un autre ADN provenant d'un autre agent intelligent en sélectionnant au moins un point de croisement situé sur chacun des deux ADN et en sélectionnant une valeur d'ADN parmi les valeurs d'ADN existantes de chaque point de croisement.
  8. Système évolutif selon la revendication 7, dans lequel ladite opération d'hybridation est exécutée entre des gènes de genres différents.
  9. Système évolutif selon la revendication 7, dans lequel ledit agent intelligent est remplacé par un agent hybride généré par l'opération d'hybridation après affichage de l'agent hybride à l'utilisateur et obtention de l'approbation de l'utilisateur de l'agent hybride.
  10. Système évolutif selon la revendication 7, dans lequel l'autre ADN d'un autre agent intelligent est reçu dans un message SMS reçu depuis une source externe par l'intermédiaire d'au moins une liaison parmi une liaison infrarouge, Bluetooth et Internet.
  11. Système évolutif selon la revendication 1, dans lequel une combinaison des lettres A, G, C et T exprimant l'ADN est affichée sur un écran.
  12. Système évolutif selon la revendication 1, dans lequel l'utilisateur peut revoir l'historique des modifications d'une créature représentée par ledit avatar.
  13. Système évolutif selon la revendication 1 ou 2, dans lequel ledit avatar est commandé de manière à agir comme un animal domestique virtuel ou un compagnon virtuel.
  14. Système évolutif selon la revendication 1 ou 2, dans lequel ledit avatar est représenté par un objet polygonal en trois dimensions avec plusieurs couleurs.
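To make the genetic operations recited in claims 1 and 7 easier to follow, here is a minimal, non-authoritative Java sketch of a gene-vector DNA with a mutation step (random gene, new value, replacement only after user approval) and a one-point crossover. The gene layout, value range, and approval callback are all assumptions made for illustration; the claims do not prescribe this code.

    import java.util.Arrays;
    import java.util.Random;
    import java.util.function.Predicate;

    // Illustrative DNA: a vector of genes, each gene a parameter within an
    // assumed range of possible values (claim 1). Not the claimed implementation.
    class CreatureDna {
        static final int GENE_RANGE = 16;   // assumed range of gene values
        private static final Random RNG = new Random();
        final int[] genes;                  // e.g. head, body color, tail scale, ...

        CreatureDna(int[] genes) { this.genes = genes.clone(); }

        // Claim 1: randomly select a gene and modify its value; the mutant
        // replaces the agent only after being shown to, and approved by, the user.
        CreatureDna mutate(Predicate<CreatureDna> userApproves) {
            int[] mutated = genes.clone();
            mutated[RNG.nextInt(mutated.length)] = RNG.nextInt(GENE_RANGE);
            CreatureDna mutant = new CreatureDna(mutated);
            return userApproves.test(mutant) ? mutant : this;
        }

        // Claim 7: hybridize with another DNA by picking a crossover point and
        // taking gene values from one parent on each side of that point.
        CreatureDna crossover(CreatureDna other) {
            int point = RNG.nextInt(genes.length);
            int[] child = genes.clone();
            System.arraycopy(other.genes, point, child, point, genes.length - point);
            return new CreatureDna(child);
        }

        @Override public String toString() { return Arrays.toString(genes); }
    }

In this sketch the integer vector is the genotype; an expression step like that of claim 5 would map each gene to different building blocks depending on how long the agent has been growing, and the approval gate of claim 9 would apply equally to the hybrid returned by crossover.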
EP04021148.4A 2003-09-05 2004-09-06 Proactive user interface having an evolving agent Not-in-force EP1528464B1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US50066903P 2003-09-05 2003-09-05
US500669P 2003-09-05
US10/743,476 US20050054381A1 (en) 2003-09-05 2003-12-23 Proactive user interface
US743476 2003-12-23
KR2004016266 2004-03-10
KR1020040016266A KR100680190B1 (ko) 2003-09-05 2004-03-10 Proactive user interface with an evolving agent

Publications (3)

Publication Number Publication Date
EP1528464A2 EP1528464A2 (fr) 2005-05-04
EP1528464A3 EP1528464A3 (fr) 2007-01-31
EP1528464B1 true EP1528464B1 (fr) 2013-06-05

Family

ID=34228747

Family Applications (2)

Application Number Title Priority Date Filing Date
EP04001994A Withdrawn EP1522918A3 (fr) 2003-09-05 2004-01-29 Proactive user interface
EP04021148.4A Not-in-force EP1528464B1 (fr) 2003-09-05 2004-09-06 Proactive user interface having an evolving agent

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP04001994A Withdrawn EP1522918A3 (fr) 2003-09-05 2004-01-29 Proactive user interface

Country Status (13)

Country Link
US (1) US20050054381A1 (fr)
EP (2) EP1522918A3 (fr)
JP (2) JP2005085256A (fr)
KR (6) KR100720023B1 (fr)
CN (2) CN1312554C (fr)
AU (1) AU2003288790B2 (fr)
BR (1) BR0318494A (fr)
CA (1) CA2540397A1 (fr)
IL (1) IL174117A0 (fr)
MX (1) MXPA06002131A (fr)
RU (1) RU2353068C2 (fr)
UA (1) UA84439C2 (fr)
WO (1) WO2005025081A1 (fr)

Families Citing this family (511)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8645137B2 (en) 2000-03-16 2014-02-04 Apple Inc. Fast, language-independent method for user authentication by voice
US6773994B2 (en) * 2001-12-26 2004-08-10 Agere Systems Inc. CMOS vertical replacement gate (VRG) transistors
AU2003209194A1 (en) 2002-01-08 2003-07-24 Seven Networks, Inc. Secure transport for mobile communication network
US20050064916A1 (en) * 2003-09-24 2005-03-24 Interdigital Technology Corporation User cognitive electronic device
US20050147054A1 (en) * 2003-10-23 2005-07-07 Loo Rose P. Navigational bar
KR20050073126A (ko) * 2004-01-08 2005-07-13 와이더댄 주식회사 무선 인터넷에서의 개인화된 웹 페이지 제공 방법 및 시스템
WO2005074235A1 (fr) * 2004-01-30 2005-08-11 Combots Product Gmbh & Co. Kg Procede et systeme de telecommunication a l'aide de representants virtuels
US8001120B2 (en) * 2004-02-12 2011-08-16 Microsoft Corporation Recent contacts and items
JP4151728B2 (ja) * 2004-03-04 2008-09-17 日本電気株式会社 データ更新システム、データ更新方法、およびデータ更新プログラム、ならびにロボットシステム
JP2005292893A (ja) * 2004-03-31 2005-10-20 Nec Access Technica Ltd 携帯情報端末装置
US7555717B2 (en) * 2004-04-30 2009-06-30 Samsung Electronics Co., Ltd. Method for displaying screen image on mobile terminal
US20070203589A1 (en) * 2005-04-08 2007-08-30 Manyworlds, Inc. Adaptive Recombinant Process Methods
US9047388B2 (en) 2004-07-01 2015-06-02 Mindjet Llc System, method, and software application for displaying data from a web service in a visual map
US7580363B2 (en) * 2004-08-16 2009-08-25 Nokia Corporation Apparatus and method for facilitating contact selection in communication devices
KR100673162B1 (ko) * 2004-08-17 2007-01-22 에스케이 텔레콤주식회사 자율적 이동 컴퓨팅을 위한 모바일 에이전트, 이를 구비한이동단말기, 및 이를 이용한 서비스 방법
KR100612859B1 (ko) * 2004-08-26 2006-08-14 삼성전자주식회사 사용패턴에 의한 대화형 사용자 인터페이스 운영 방법 및시스템
EP1631050B1 (fr) * 2004-08-26 2007-06-13 Samsung Electronics Co., Ltd. Système mobile, procédé et logiciel pour gérer une interface utilisateur conversationnel en fonction de profils d'utilisation détectés
US7912186B2 (en) * 2004-10-20 2011-03-22 Microsoft Corporation Selectable state machine user interface system
US7551727B2 (en) * 2004-10-20 2009-06-23 Microsoft Corporation Unified messaging architecture
US7590430B1 (en) * 2004-11-01 2009-09-15 Sprint Communications Company L.P. Architecture and applications to support device-driven firmware upgrades and configurable menus
US20090018698A1 (en) * 2004-11-26 2009-01-15 Electronics And Telecommunications Research Instit Robot system based on network and execution method of that system
KR100755433B1 (ko) * 2004-12-20 2007-09-04 삼성전자주식회사 휴대단말기의 통화 관련 이벤트 처리 장치 및 방법
US7913189B2 (en) * 2005-02-21 2011-03-22 Canon Kabushiki Kaisha Information processing apparatus and control method for displaying user interface
US20060195797A1 (en) * 2005-02-25 2006-08-31 Toshiba Corporation Efficient document processing selection
US20060247851A1 (en) * 2005-03-08 2006-11-02 Morris Robert P Mobile phone having a TV remote style user interface
US20060206364A1 (en) * 2005-03-14 2006-09-14 Nokia Corporation Relationship assistant
US8438633B1 (en) 2005-04-21 2013-05-07 Seven Networks, Inc. Flexible real-time inbox access
DE102005020688B4 (de) * 2005-05-03 2008-01-03 Siemens Ag Mobiltelefon
JP4698281B2 (ja) * 2005-05-09 2011-06-08 ソニー・エリクソン・モバイルコミュニケーションズ株式会社 携帯端末、情報推奨方法及びプログラム
WO2006126205A2 (fr) * 2005-05-26 2006-11-30 Vircomzone Ltd. Systemes, utilisations et procedes d'affichage graphique
WO2006136660A1 (fr) 2005-06-21 2006-12-28 Seven Networks International Oy Maintien d'une connexion ip dans un reseau mobile
JP4681965B2 (ja) * 2005-07-19 2011-05-11 富士通東芝モバイルコミュニケーションズ株式会社 通信端末
KR100842866B1 (ko) * 2005-08-18 2008-07-02 (주)인피니티 텔레콤 학습기능을 가지는 이동통신 단말기와 이동통신 단말기를이용한 학습방법
US9152982B2 (en) 2005-08-19 2015-10-06 Nuance Communications, Inc. Method of compensating a provider for advertisements displayed on a mobile phone
US8549392B2 (en) * 2005-08-30 2013-10-01 Microsoft Corporation Customizable spreadsheet table styles
US8677377B2 (en) 2005-09-08 2014-03-18 Apple Inc. Method and apparatus for building an intelligent automated assistant
US8073700B2 (en) 2005-09-12 2011-12-06 Nuance Communications, Inc. Retrieval and presentation of network service results for mobile device using a multimodal browser
US20070061189A1 (en) * 2005-09-12 2007-03-15 Sbc Knowledge Ventures Lp Method for motivating competitors in an enterprise setting
JP2007075446A (ja) * 2005-09-15 2007-03-29 Square Enix Co Ltd ビデオゲーム処理装置、およびビデオゲーム処理プログラム
US8539374B2 (en) * 2005-09-23 2013-09-17 Disney Enterprises, Inc. Graphical user interface for electronic devices
US7477909B2 (en) * 2005-10-31 2009-01-13 Nuance Communications, Inc. System and method for conducting a search using a wireless mobile device
US7840451B2 (en) * 2005-11-07 2010-11-23 Sap Ag Identifying the most relevant computer system state information
US8401884B1 (en) * 2005-11-07 2013-03-19 Avantas L.L.C. Electronic scheduling for work shifts
US8805675B2 (en) * 2005-11-07 2014-08-12 Sap Ag Representing a computer system state to a user
US7979295B2 (en) * 2005-12-02 2011-07-12 Sap Ag Supporting user interaction with a computer system
US20070135110A1 (en) * 2005-12-08 2007-06-14 Motorola, Inc. Smart call list
US20070174235A1 (en) * 2006-01-26 2007-07-26 Michael Gordon Method of using digital characters to compile information
JP2007249818A (ja) * 2006-03-17 2007-09-27 Graduate School For The Creation Of New Photonics Industries 電子カルテシステム
US20070286395A1 (en) * 2006-05-24 2007-12-13 International Business Machines Corporation Intelligent Multimedia Dial Tone
KR100763238B1 (ko) * 2006-05-26 2007-10-04 삼성전자주식회사 모바일 디바이스를 위한 특이성 탐지 장치 및 방법
US20080034396A1 (en) * 2006-05-30 2008-02-07 Lev Zvi H System and method for video distribution and billing
WO2007139342A1 (fr) * 2006-05-30 2007-12-06 Samsung Electronics Co., Ltd. Plate-forme de lancement d'application mobile commandée par l'intérêt utilisateur et procédé de fonctionnement
KR101295155B1 (ko) 2006-06-26 2013-08-09 삼성전자주식회사 사용자의 행동 분석 결과에 따라 대기화면을 표시하는이동통신단말기 및 그 방법
WO2008008893A2 (fr) * 2006-07-12 2008-01-17 Medical Cyberworlds, Inc. Système de formation médicale informatisé
EP1895505A1 (fr) * 2006-09-04 2008-03-05 Sony Deutschland GmbH Méthode et appareil pour la détection d'atmosphère musicale
KR100834646B1 (ko) * 2006-09-05 2008-06-02 삼성전자주식회사 소프트웨어 로봇 메시지 전송 방법
US8014760B2 (en) 2006-09-06 2011-09-06 Apple Inc. Missed telephone call management for a portable multifunction device
AU2011244866B2 (en) * 2006-09-06 2015-01-29 Apple Inc. Incoming telephone call management for a portable multifunction device with touch screen display
US20080055263A1 (en) * 2006-09-06 2008-03-06 Lemay Stephen O Incoming Telephone Call Management for a Portable Multifunction Device
US9318108B2 (en) * 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
US7881990B2 (en) * 2006-11-30 2011-02-01 Intuit Inc. Automatic time tracking based on user interface events
US8731610B2 (en) * 2006-12-13 2014-05-20 Samsung Electronics Co., Ltd. Method for adaptive user interface in mobile devices
US20080161045A1 (en) * 2006-12-29 2008-07-03 Nokia Corporation Method, Apparatus and Computer Program Product for Providing a Link to Contacts on the Idle Screen
US7975242B2 (en) 2007-01-07 2011-07-05 Apple Inc. Portable multifunction device, method, and graphical user interface for conference calling
US8225227B2 (en) * 2007-01-19 2012-07-17 Microsoft Corporation Managing display of user interfaces
US8024660B1 (en) 2007-01-31 2011-09-20 Intuit Inc. Method and apparatus for variable help content and abandonment intervention based on user behavior
CN101601263B (zh) * 2007-02-06 2013-06-19 日本电气株式会社 定制蜂窝电话的设备及方法
EP2119022A4 (fr) * 2007-02-09 2010-12-01 Mobile Complete Inc Enregistrement interactif de dispositif virtuel
WO2008106196A1 (fr) * 2007-03-01 2008-09-04 Sony Computer Entertainment America Inc. Commande d'avatar de monde virtuel, interactivité et messagerie interactive de communication
GB0703974D0 (en) * 2007-03-01 2007-04-11 Sony Comp Entertainment Europe Entertainment device
US8843376B2 (en) * 2007-03-13 2014-09-23 Nuance Communications, Inc. Speech-enabled web content searching using a multimodal browser
US8977255B2 (en) 2007-04-03 2015-03-10 Apple Inc. Method and system for operating a multi-function portable electronic device using voice-activation
JP4367663B2 (ja) * 2007-04-10 2009-11-18 ソニー株式会社 画像処理装置、画像処理方法、プログラム
US7870491B1 (en) 2007-04-27 2011-01-11 Intuit Inc. System and method for user support based on user interaction histories
US20080297515A1 (en) * 2007-05-30 2008-12-04 Motorola, Inc. Method and apparatus for determining the appearance of a character display by an electronic device
US20080301556A1 (en) * 2007-05-30 2008-12-04 Motorola, Inc. Method and apparatus for displaying operational information about an electronic device
US8805425B2 (en) 2007-06-01 2014-08-12 Seven Networks, Inc. Integrated messaging
US8886259B2 (en) 2007-06-20 2014-11-11 Qualcomm Incorporated System and method for user profiling from gathering user data through interaction with a wireless communication device
US8892171B2 (en) * 2007-06-20 2014-11-18 Qualcomm Incorporated System and method for user profiling from gathering user data through interaction with a wireless communication device
US8065429B2 (en) * 2007-06-28 2011-11-22 Nokia Corporation System, apparatus and method for associating an anticipated success indication with data delivery
JP4506795B2 (ja) 2007-08-06 2010-07-21 ソニー株式会社 生体運動情報表示処理装置、生体運動情報処理システム
KR100934225B1 (ko) * 2007-09-21 2009-12-29 한국전자통신연구원 일상생활 행위 인식을 위한 주체자의 행위 분류 보정 장치및 방법, 이를 이용한 일상생활 행위 인식 시스템
US8312373B2 (en) 2007-10-18 2012-11-13 Nokia Corporation Apparatus, method, and computer program product for affecting an arrangement of selectable items
ES2541106T3 (es) * 2007-11-22 2015-07-16 Telefonaktiebolaget Lm Ericsson (Publ) Procedimiento y dispositivo para cómputo o cálculo ágil
US9002828B2 (en) 2007-12-13 2015-04-07 Seven Networks, Inc. Predictive content delivery
US10002189B2 (en) 2007-12-20 2018-06-19 Apple Inc. Method and apparatus for searching using an active ontology
US9330720B2 (en) 2008-01-03 2016-05-03 Apple Inc. Methods and apparatus for altering audio output signals
US10176827B2 (en) 2008-01-15 2019-01-08 Verint Americas Inc. Active lab
US8862657B2 (en) 2008-01-25 2014-10-14 Seven Networks, Inc. Policy based content service
US20090193338A1 (en) 2008-01-28 2009-07-30 Trevor Fiatal Reducing network and battery consumption during content delivery and playback
US7827072B1 (en) 2008-02-18 2010-11-02 United Services Automobile Association (Usaa) Method and system for interface presentation
US9659011B1 (en) * 2008-02-18 2017-05-23 United Services Automobile Association (Usaa) Method and system for interface presentation
US8042061B1 (en) 2008-02-18 2011-10-18 United Services Automobile Association Method and system for interface presentation
US8171407B2 (en) * 2008-02-21 2012-05-01 International Business Machines Corporation Rating virtual world merchandise by avatar visits
US8997018B2 (en) * 2008-03-04 2015-03-31 Synaptics Incorporated Presenting a menu
JP5159375B2 (ja) 2008-03-07 2013-03-06 インターナショナル・ビジネス・マシーンズ・コーポレーション メタバースにおけるオブジェクトの真贋判断システム、方法及びそのコンピュータ・プログラム
ATE526616T1 (de) * 2008-03-13 2011-10-15 Rational Ag Verfahren zum bereitstellen eines intelligenten mensch-maschinen-interfaces bei gargeräten
EP2101230B1 (fr) 2008-03-13 2012-11-07 Rational AG Procédé destiné à la préparation d'interfaces homme-machine intelligentes dans des appareils de cuisson
US8996376B2 (en) 2008-04-05 2015-03-31 Apple Inc. Intelligent text-to-speech conversion
KR101461949B1 (ko) * 2008-04-24 2014-11-14 엘지전자 주식회사 이동통신 단말기 및 이를 이용한 캐릭터 연동 방법
US10496753B2 (en) 2010-01-18 2019-12-03 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US20090298020A1 (en) * 2008-06-03 2009-12-03 United Parcel Service Of America, Inc. Systems and methods for improving user efficiency with handheld devices
US8787947B2 (en) 2008-06-18 2014-07-22 Seven Networks, Inc. Application discovery on mobile devices
US20100030549A1 (en) 2008-07-31 2010-02-04 Lee Michael M Mobile device having human language translation capability with positional feedback
US20100042469A1 (en) * 2008-08-18 2010-02-18 Microsoft Corporation Mobile device enhanced shopping experience
US9223469B2 (en) * 2008-08-22 2015-12-29 Intellectual Ventures Fund 83 Llc Configuring a virtual world user-interface
US8805450B2 (en) * 2008-09-05 2014-08-12 Microsoft Corp. Intelligent contact management
US20100082515A1 (en) * 2008-09-26 2010-04-01 Verizon Data Services, Llc Environmental factor based virtual communication systems and methods
US8676904B2 (en) 2008-10-02 2014-03-18 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
US8909759B2 (en) 2008-10-10 2014-12-09 Seven Networks, Inc. Bandwidth measurement
US8452456B2 (en) 2008-10-27 2013-05-28 Lennox Industries Inc. System and method of use for a user interface dashboard of a heating, ventilation and air conditioning network
US8452906B2 (en) 2008-10-27 2013-05-28 Lennox Industries, Inc. Communication protocol system and method for a distributed-architecture heating, ventilation and air conditioning network
US8255086B2 (en) 2008-10-27 2012-08-28 Lennox Industries Inc. System recovery in a heating, ventilation and air conditioning network
US9377768B2 (en) 2008-10-27 2016-06-28 Lennox Industries Inc. Memory recovery scheme and data structure in a heating, ventilation and air conditioning network
US9152155B2 (en) 2008-10-27 2015-10-06 Lennox Industries Inc. Device abstraction system and method for a distributed-architecture heating, ventilation and air conditioning system
US8892797B2 (en) 2008-10-27 2014-11-18 Lennox Industries Inc. Communication protocol system and method for a distributed-architecture heating, ventilation and air conditioning network
US8977794B2 (en) 2008-10-27 2015-03-10 Lennox Industries, Inc. Communication protocol system and method for a distributed-architecture heating, ventilation and air conditioning network
US8437878B2 (en) 2008-10-27 2013-05-07 Lennox Industries Inc. Alarm and diagnostics system and method for a distributed architecture heating, ventilation and air conditioning network
US8994539B2 (en) 2008-10-27 2015-03-31 Lennox Industries, Inc. Alarm and diagnostics system and method for a distributed-architecture heating, ventilation and air conditioning network
US8433446B2 (en) 2008-10-27 2013-04-30 Lennox Industries, Inc. Alarm and diagnostics system and method for a distributed-architecture heating, ventilation and air conditioning network
US8661165B2 (en) 2008-10-27 2014-02-25 Lennox Industries, Inc. Device abstraction system and method for a distributed architecture heating, ventilation and air conditioning system
US8600558B2 (en) 2008-10-27 2013-12-03 Lennox Industries Inc. System recovery in a heating, ventilation and air conditioning network
US8744629B2 (en) 2008-10-27 2014-06-03 Lennox Industries Inc. System and method of use for a user interface dashboard of a heating, ventilation and air conditioning network
US9268345B2 (en) 2008-10-27 2016-02-23 Lennox Industries Inc. System and method of use for a user interface dashboard of a heating, ventilation and air conditioning network
US9261888B2 (en) 2008-10-27 2016-02-16 Lennox Industries Inc. System and method of use for a user interface dashboard of a heating, ventilation and air conditioning network
US9325517B2 (en) 2008-10-27 2016-04-26 Lennox Industries Inc. Device abstraction system and method for a distributed-architecture heating, ventilation and air conditioning system
US8788100B2 (en) 2008-10-27 2014-07-22 Lennox Industries Inc. System and method for zoning a distributed-architecture heating, ventilation and air conditioning network
US8239066B2 (en) 2008-10-27 2012-08-07 Lennox Industries Inc. System and method of use for a user interface dashboard of a heating, ventilation and air conditioning network
US8874815B2 (en) 2008-10-27 2014-10-28 Lennox Industries, Inc. Communication protocol system and method for a distributed architecture heating, ventilation and air conditioning network
US8548630B2 (en) 2008-10-27 2013-10-01 Lennox Industries, Inc. Alarm and diagnostics system and method for a distributed-architecture heating, ventilation and air conditioning network
US8762666B2 (en) 2008-10-27 2014-06-24 Lennox Industries, Inc. Backup and restoration of operation control data in a heating, ventilation and air conditioning network
US8295981B2 (en) 2008-10-27 2012-10-23 Lennox Industries Inc. Device commissioning in a heating, ventilation and air conditioning network
US9632490B2 (en) 2008-10-27 2017-04-25 Lennox Industries Inc. System and method for zoning a distributed architecture heating, ventilation and air conditioning network
US8543243B2 (en) 2008-10-27 2013-09-24 Lennox Industries, Inc. System and method of use for a user interface dashboard of a heating, ventilation and air conditioning network
US8463442B2 (en) 2008-10-27 2013-06-11 Lennox Industries, Inc. Alarm and diagnostics system and method for a distributed architecture heating, ventilation and air conditioning network
US8560125B2 (en) 2008-10-27 2013-10-15 Lennox Industries Communication protocol system and method for a distributed-architecture heating, ventilation and air conditioning network
US8615326B2 (en) 2008-10-27 2013-12-24 Lennox Industries Inc. System and method of use for a user interface dashboard of a heating, ventilation and air conditioning network
US8600559B2 (en) 2008-10-27 2013-12-03 Lennox Industries Inc. Method of controlling equipment in a heating, ventilation and air conditioning network
US8352081B2 (en) 2008-10-27 2013-01-08 Lennox Industries Inc. Communication protocol system and method for a distributed-architecture heating, ventilation and air conditioning network
US8352080B2 (en) 2008-10-27 2013-01-08 Lennox Industries Inc. Communication protocol system and method for a distributed-architecture heating, ventilation and air conditioning network
US8564400B2 (en) 2008-10-27 2013-10-22 Lennox Industries, Inc. Communication protocol system and method for a distributed-architecture heating, ventilation and air conditioning network
US8655491B2 (en) 2008-10-27 2014-02-18 Lennox Industries Inc. Alarm and diagnostics system and method for a distributed architecture heating, ventilation and air conditioning network
US8855825B2 (en) 2008-10-27 2014-10-07 Lennox Industries Inc. Device abstraction system and method for a distributed-architecture heating, ventilation and air conditioning system
US8694164B2 (en) 2008-10-27 2014-04-08 Lennox Industries, Inc. Interactive user guidance interface for a heating, ventilation and air conditioning system
US9651925B2 (en) 2008-10-27 2017-05-16 Lennox Industries Inc. System and method for zoning a distributed-architecture heating, ventilation and air conditioning network
US8463443B2 (en) 2008-10-27 2013-06-11 Lennox Industries, Inc. Memory recovery scheme and data structure in a heating, ventilation and air conditioning network
US8774210B2 (en) 2008-10-27 2014-07-08 Lennox Industries, Inc. Communication protocol system and method for a distributed-architecture heating, ventilation and air conditioning network
US9678486B2 (en) 2008-10-27 2017-06-13 Lennox Industries Inc. Device abstraction system and method for a distributed-architecture heating, ventilation and air conditioning system
US8725298B2 (en) 2008-10-27 2014-05-13 Lennox Industries, Inc. Alarm and diagnostics system and method for a distributed architecture heating, ventilation and conditioning network
US8802981B2 (en) 2008-10-27 2014-08-12 Lennox Industries Inc. Flush wall mount thermostat and in-set mounting plate for a heating, ventilation and air conditioning system
US8798796B2 (en) 2008-10-27 2014-08-05 Lennox Industries Inc. General control techniques in a heating, ventilation and air conditioning network
US9432208B2 (en) 2008-10-27 2016-08-30 Lennox Industries Inc. Device abstraction system and method for a distributed architecture heating, ventilation and air conditioning system
US8437877B2 (en) 2008-10-27 2013-05-07 Lennox Industries Inc. System recovery in a heating, ventilation and air conditioning network
US8442693B2 (en) 2008-10-27 2013-05-14 Lennox Industries, Inc. System and method of use for a user interface dashboard of a heating, ventilation and air conditioning network
US8655490B2 (en) 2008-10-27 2014-02-18 Lennox Industries, Inc. System and method of use for a user interface dashboard of a heating, ventilation and air conditioning network
US9396455B2 (en) 2008-11-10 2016-07-19 Mindjet Llc System, method, and software application for enabling a user to view and interact with a visual map in an external application
US10102534B2 (en) * 2008-12-09 2018-10-16 International Business Machines Corporation System and method for virtual universe relocation through an advertising offer
US10489434B2 (en) 2008-12-12 2019-11-26 Verint Americas Inc. Leveraging concepts with information retrieval techniques and knowledge bases
US9009662B2 (en) 2008-12-18 2015-04-14 Adobe Systems Incorporated Platform sensitive application characteristics
US9009661B2 (en) * 2008-12-18 2015-04-14 Adobe Systems Incorporated Platform sensitive application characteristics
DE102008055011A1 (de) 2008-12-19 2010-07-01 Deutsche Telekom Ag Verfahren zur Steuerung einer Benutzerschnittstelle
US9635195B1 (en) * 2008-12-24 2017-04-25 The Directv Group, Inc. Customizable graphical elements for use in association with a user interface
US8209638B2 (en) * 2008-12-31 2012-06-26 Sap Ag Customization abstraction
CN101770339B (zh) * 2009-01-05 2012-12-19 深圳富泰宏精密工业有限公司 使用者行为追踪及记录系统与方法
US8401992B2 (en) * 2009-02-06 2013-03-19 IT Actual, Sdn. Bhd. Computing platform based on a hierarchy of nested data structures
US8151199B2 (en) 2009-02-09 2012-04-03 AltEgo, LLC Computational delivery system for avatar and background game content
US8539359B2 (en) * 2009-02-11 2013-09-17 Jeffrey A. Rapaport Social network driven indexing system for instantly clustering people with concurrent focus on same topic into on-topic chat rooms and/or for generating on-topic search results tailored to user preferences regarding topic
DE102009018491A1 (de) * 2009-04-21 2010-11-11 Siemens Aktiengesellschaft Einstellen eines Feld- oder Leitgerätes
US8311983B2 (en) * 2009-04-28 2012-11-13 Whp Workflow Solutions, Llc Correlated media for distributed sources
US9760573B2 (en) 2009-04-28 2017-09-12 Whp Workflow Solutions, Llc Situational awareness
US10565065B2 (en) 2009-04-28 2020-02-18 Getac Technology Corporation Data backup and transfer across multiple cloud computing providers
US10419722B2 (en) 2009-04-28 2019-09-17 Whp Workflow Solutions, Inc. Correlated media source management and response control
KR101052411B1 (ko) * 2009-05-11 2011-07-28 경희대학교 산학협력단 패턴 추론 방식으로 사용자의 상황을 예측하는 방법
US8412662B2 (en) * 2009-06-04 2013-04-02 Motorola Mobility Llc Method and system of interaction within both real and virtual worlds
US10241752B2 (en) 2011-09-30 2019-03-26 Apple Inc. Interface for a virtual digital assistant
US9858925B2 (en) 2009-06-05 2018-01-02 Apple Inc. Using context information to facilitate processing of commands in a virtual assistant
US10706373B2 (en) 2011-06-03 2020-07-07 Apple Inc. Performing actions associated with task items that represent tasks to perform
US10241644B2 (en) 2011-06-03 2019-03-26 Apple Inc. Actionable reminder entries
KR101562792B1 (ko) * 2009-06-10 2015-10-23 삼성전자주식회사 목표 예측 인터페이스 제공 장치 및 그 방법
US11520455B2 (en) * 2009-06-29 2022-12-06 International Business Machines Corporation Dioramic user interface having a user customized experience
US9431006B2 (en) 2009-07-02 2016-08-30 Apple Inc. Methods and apparatuses for automatic speech recognition
DE112010000035B4 (de) * 2009-08-03 2015-04-30 Honda Motor Co., Ltd. Roboter und Regelungs- /Steuerungssystem
US8491504B2 (en) * 2009-08-04 2013-07-23 University Of South Carolina Devices and methods for monitoring sit to stand transfers
KR101544371B1 (ko) * 2009-08-07 2015-08-17 삼성전자주식회사 사용자 상황을 반영하는 휴대 단말기 및 이의 운용 방법
KR101584058B1 (ko) * 2009-08-07 2016-01-12 삼성전자주식회사 현재 상황에 적합한 사용 환경을 제공하는 휴대 단말기 및 이의 운용 방법
JP5333068B2 (ja) * 2009-08-31 2013-11-06 ソニー株式会社 情報処理装置、表示方法及び表示プログラム
US8943094B2 (en) 2009-09-22 2015-01-27 Next It Corporation Apparatus, system, and method for natural language processing
US20110087975A1 (en) * 2009-10-13 2011-04-14 Sony Ericsson Mobile Communications Ab Method and arrangement in a data
USD648641S1 (en) 2009-10-21 2011-11-15 Lennox Industries Inc. Thin cover plate for an electronic system controller
USD648642S1 (en) 2009-10-21 2011-11-15 Lennox Industries Inc. Thin cover plate for an electronic system controller
FR2952200A1 (fr) * 2009-10-29 2011-05-06 Alcatel Lucent Dispositif et procede d'analyse automatique de l'utilisation de l'interface utilisateur d'une application
WO2011063106A1 (fr) * 2009-11-18 2011-05-26 Nellcor Puritan Bennett Llc Interface utilisateur intelligente destinée à des moniteurs médicaux
US20110143728A1 (en) * 2009-12-16 2011-06-16 Nokia Corporation Method and apparatus for recognizing acquired media for matching against a target expression
US20110153868A1 (en) * 2009-12-18 2011-06-23 Alcatel-Lucent Usa Inc. Cloud-Based Application For Low-Provisioned High-Functionality Mobile Station
US8305433B2 (en) * 2009-12-23 2012-11-06 Motorola Mobility Llc Method and device for visual compensation
US20110179303A1 (en) * 2010-01-15 2011-07-21 Microsoft Corporation Persistent application activation and timer notifications
US10553209B2 (en) 2010-01-18 2020-02-04 Apple Inc. Systems and methods for hands-free notification summaries
US10705794B2 (en) 2010-01-18 2020-07-07 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US10276170B2 (en) 2010-01-18 2019-04-30 Apple Inc. Intelligent automated assistant
US10679605B2 (en) 2010-01-18 2020-06-09 Apple Inc. Hands-free list-reading by intelligent automated assistant
DE202011111062U1 (de) 2010-01-25 2019-02-19 Newvaluexchange Ltd. Vorrichtung und System für eine Digitalkonversationsmanagementplattform
US20110202864A1 (en) * 2010-02-15 2011-08-18 Hirsch Michael B Apparatus and methods of receiving and acting on user-entered information
US8260444B2 (en) 2010-02-17 2012-09-04 Lennox Industries Inc. Auxiliary controller of a HVAC system
US8682667B2 (en) 2010-02-25 2014-03-25 Apple Inc. User profiling for selecting user specific voice input processing information
US10551992B2 (en) * 2010-03-07 2020-02-04 Brendan Edward Clark Interface transitioning and/or transformation
US9348615B1 (en) * 2010-03-07 2016-05-24 Brendan Edward Clark Interface transitioning and/or transformation
US8527530B2 (en) * 2010-03-22 2013-09-03 Sony Corporation Destination prediction using text analysis
US20110248822A1 (en) * 2010-04-09 2011-10-13 Jc Ip Llc Systems and apparatuses and methods to adaptively control controllable systems
US10786736B2 (en) 2010-05-11 2020-09-29 Sony Interactive Entertainment LLC Placement of user information in a game space
US9245177B2 (en) * 2010-06-02 2016-01-26 Microsoft Technology Licensing, Llc Limiting avatar gesture display
US8922376B2 (en) * 2010-07-09 2014-12-30 Nokia Corporation Controlling a user alert
EP3651028A1 (fr) 2010-07-26 2020-05-13 Seven Networks, LLC Coordination de la circulation de réseau mobile à travers de multiples applications
US8838783B2 (en) 2010-07-26 2014-09-16 Seven Networks, Inc. Distributed caching for resource and mobile network traffic management
US20120042263A1 (en) 2010-08-10 2012-02-16 Seymour Rapaport Social-topical adaptive networking (stan) system allowing for cooperative inter-coupling with external social networking systems and other content sources
US20120054626A1 (en) * 2010-08-30 2012-03-01 Jens Odenheimer Service level agreements-based cloud provisioning
US8504487B2 (en) * 2010-09-21 2013-08-06 Sony Computer Entertainment America Llc Evolution of a user interface based on learned idiosyncrasies and collected data of a user
US9122744B2 (en) 2010-10-11 2015-09-01 Next It Corporation System and method for providing distributed intelligent assistance
GB2484715A (en) * 2010-10-21 2012-04-25 Vodafone Ip Licensing Ltd Communication terminal with situation based configuration updating
WO2012060995A2 (fr) 2010-11-01 2012-05-10 Michael Luna Mise en cache distribuée dans un réseau sans fil d'un contenu fourni par une application mobile sur une requête de longue durée
US8484314B2 (en) 2010-11-01 2013-07-09 Seven Networks, Inc. Distributed caching in a wireless network of content delivered for a mobile application over a long-held request
US8843153B2 (en) 2010-11-01 2014-09-23 Seven Networks, Inc. Mobile traffic categorization and policy for network use optimization while preserving user experience
US8903954B2 (en) 2010-11-22 2014-12-02 Seven Networks, Inc. Optimization of resource polling intervals to satisfy mobile device requests
CN102479024A (zh) * 2010-11-24 2012-05-30 国基电子(上海)有限公司 手持装置及其用户界面构建方法
US20120162443A1 (en) * 2010-12-22 2012-06-28 International Business Machines Corporation Contextual help based on facial recognition
GB2501416B (en) 2011-01-07 2018-03-21 Seven Networks Llc System and method for reduction of mobile network traffic used for domain name system (DNS) queries
US9262612B2 (en) 2011-03-21 2016-02-16 Apple Inc. Device access using voice authentication
US8316098B2 (en) 2011-04-19 2012-11-20 Seven Networks Inc. Social caching for device resource sharing and management
EP2621144B1 (fr) 2011-04-27 2014-06-25 Seven Networks, Inc. Système et procédé de présentation de demandes pour le compte d'un dispositif mobile à partir de processus atomiques pour soulager le trafic de réseau mobile
EP2702500B1 (fr) 2011-04-27 2017-07-19 Seven Networks, LLC Détection et conservation d'un état pour répondre aux demandes d'application dans un système d'antémémoire et de serveur mandataire réparti
US8676937B2 (en) 2011-05-12 2014-03-18 Jeffrey Alan Rapaport Social-topical adaptive networking (STAN) system allowing for group based contextual transaction offers and acceptances and hot topic watchdogging
US10057736B2 (en) 2011-06-03 2018-08-21 Apple Inc. Active transport based notifications
WO2013015995A1 (fr) 2011-07-27 2013-01-31 Seven Networks, Inc. Génération et distribution automatiques d'informations de politique concernant un trafic mobile malveillant dans un réseau sans fil
US8994660B2 (en) 2011-08-29 2015-03-31 Apple Inc. Text correction processing
US8425308B2 (en) 2011-09-07 2013-04-23 International Business Machines Corporation Counter-balancing in-play video game incentives/rewards by creating a counter-incentive
US9485411B2 (en) 2011-10-28 2016-11-01 Canon Kabushiki Kaisha Display control apparatus and method for controlling display control apparatus
US8868753B2 (en) 2011-12-06 2014-10-21 Seven Networks, Inc. System of redundantly clustered machines to provide failover mechanisms for mobile traffic management and network resource conservation
US8934414B2 (en) 2011-12-06 2015-01-13 Seven Networks, Inc. Cellular or WiFi mobile traffic optimization based on public or private network destination
WO2013086455A1 (fr) 2011-12-07 2013-06-13 Seven Networks, Inc. Schémas d'intégration flexibles et dynamiques d'un système de gestion de trafic avec divers opérateurs de réseau permettant d'alléger le trafic du réseau
US9277443B2 (en) 2011-12-07 2016-03-01 Seven Networks, Llc Radio-awareness of mobile device for sending server-side control signals using a wireless network optimized transport protocol
US9244583B2 (en) * 2011-12-09 2016-01-26 Microsoft Technology Licensing, Llc Adjusting user interface screen order and composition
EP2792188B1 (fr) 2011-12-14 2019-03-20 Seven Networks, LLC Système et procédé de rapport et d'analyse d'utilisation de réseau mobile utilisant une agrégation de données dans un système d'optimisation de trafic distribué
GB2497935A (en) * 2011-12-22 2013-07-03 Ibm Predicting actions input to a user interface
US9836177B2 (en) 2011-12-30 2017-12-05 Next IT Innovation Labs, LLC Providing variable responses in a virtual-assistant environment
WO2013103988A1 (fr) * 2012-01-05 2013-07-11 Seven Networks, Inc. Détection et gestion d'interactions d'utilisateur à l'aide d'applications d'avant-plan sur un dispositif mobile dans une mise en cache distribuée
JP2015509237A (ja) * 2012-01-08 2015-03-26 テクニジョン インコーポレイテッド 動的割当て可能なユーザインターフェースのための方法およびシステム
US9348430B2 (en) * 2012-02-06 2016-05-24 Steelseries Aps Method and apparatus for transitioning in-process applications to remote devices
US10134385B2 (en) 2012-03-02 2018-11-20 Apple Inc. Systems and methods for name pronunciation
US9483461B2 (en) 2012-03-06 2016-11-01 Apple Inc. Handling speech synthesis of content for multiple languages
JP5938987B2 (ja) * 2012-03-28 2016-06-22 ソニー株式会社 情報処理装置、情報処理方法及びプログラム
US8812695B2 (en) 2012-04-09 2014-08-19 Seven Networks, Inc. Method and system for management of a virtual network connection without heartbeat messages
WO2013155208A1 (fr) 2012-04-10 2013-10-17 Seven Networks, Inc. Service client/services de centre d'appels intelligents améliorés au moyen d'une application mobile en temps réel et historique et des statistiques relatives au trafic collectées par un système de mémoire cache distribué dans un réseau mobile
US9223537B2 (en) 2012-04-18 2015-12-29 Next It Corporation Conversation user interface
US9280610B2 (en) 2012-05-14 2016-03-08 Apple Inc. Crowd sourcing information to fulfill user requests
US10417037B2 (en) 2012-05-15 2019-09-17 Apple Inc. Systems and methods for integrating third party services with a digital assistant
US20130325758A1 (en) * 2012-05-30 2013-12-05 Microsoft Corporation Tailored operating system learning experience
US9721563B2 (en) 2012-06-08 2017-08-01 Apple Inc. Name recognition system
US9495129B2 (en) 2012-06-29 2016-11-15 Apple Inc. Device, method, and user interface for voice-activated navigation and browsing of a document
JP6028429B2 (ja) * 2012-07-10 2016-11-16 富士ゼロックス株式会社 表示制御装置、サービス提供装置、及びプログラム
WO2014011216A1 (fr) 2012-07-13 2014-01-16 Seven Networks, Inc. Ajustement dynamique de bande passante pour une activité de navigation ou de lecture en continu dans un réseau sans fil sur la base d'une prédiction du comportement de l'utilisateur lors d'une interaction avec des applications mobiles
JP5954037B2 (ja) 2012-08-09 2016-07-20 沖電気工業株式会社 紙幣処理装置、及び紙幣処理方法
US20140057619A1 (en) * 2012-08-24 2014-02-27 Tencent Technology (Shenzhen) Company Limited System and method for adjusting operation modes of a mobile device
US9536049B2 (en) 2012-09-07 2017-01-03 Next It Corporation Conversational virtual healthcare assistant
KR20140033672A (ko) * 2012-09-10 2014-03-19 삼성전자주식회사 이벤트에 관련된 정보를 전송하는 방법 및 디바이스
US9547647B2 (en) 2012-09-19 2017-01-17 Apple Inc. Voice-based media searching
US20140111523A1 (en) * 2012-10-22 2014-04-24 Google Inc. Variable length animations based on user inputs
US9161258B2 (en) 2012-10-24 2015-10-13 Seven Networks, Llc Optimized and selective management of policy deployment to mobile clients in a congested network to prevent further aggravation of network congestion
US9652744B2 (en) * 2012-12-10 2017-05-16 Sap Se Smart user interface adaptation in on-demand business applications
US9307493B2 (en) 2012-12-20 2016-04-05 Seven Networks, Llc Systems and methods for application management of mobile device radio state promotion and demotion
US8690578B1 (en) 2013-01-03 2014-04-08 Mark E. Nusbaum Mobile computing weight, diet, nutrition, and exercise tracking system with enhanced feedback and data acquisition functionality
CN112130874A (zh) 2013-01-11 2020-12-25 辛纳科尔股份有限公司 背景控制面板配置选择的方法和系统
US9241314B2 (en) 2013-01-23 2016-01-19 Seven Networks, Llc Mobile device with application or context aware fast dormancy
US8874761B2 (en) 2013-01-25 2014-10-28 Seven Networks, Inc. Signaling optimization in a wireless network for traffic utilizing proprietary and non-proprietary protocols
JP2016508007A (ja) 2013-02-07 2016-03-10 アップル インコーポレイテッド デジタルアシスタントのためのボイストリガ
US9326185B2 (en) 2013-03-11 2016-04-26 Seven Networks, Llc Mobile network congestion recognition for optimization of mobile traffic
US9764468B2 (en) 2013-03-15 2017-09-19 Brain Corporation Adaptive predictor apparatus and methods
US10445115B2 (en) 2013-04-18 2019-10-15 Verint Americas Inc. Virtual assistant focused user interfaces
US9223413B2 (en) * 2013-04-30 2015-12-29 Honeywell International Inc. Next action page key for system generated messages
WO2014178306A1 (fr) * 2013-04-30 2014-11-06 グリー株式会社 Procede de fourniture d'informations d'affichage, programme de fourniture d'informations d'affichage et dispositif serveur
US9242372B2 (en) 2013-05-31 2016-01-26 Brain Corporation Adaptive robotic interface apparatus and methods
WO2014197334A2 (fr) 2013-06-07 2014-12-11 Apple Inc. Système et procédé destinés à une prononciation de mots spécifiée par l'utilisateur dans la synthèse et la reconnaissance de la parole
US9582608B2 (en) 2013-06-07 2017-02-28 Apple Inc. Unified ranking with entropy-weighted information for phrase-based semantic auto-completion
WO2014197336A1 (fr) 2013-06-07 2014-12-11 Apple Inc. Système et procédé pour détecter des erreurs dans des interactions avec un assistant numérique utilisant la voix
WO2014197335A1 (fr) 2013-06-08 2014-12-11 Apple Inc. Interprétation et action sur des commandes qui impliquent un partage d'informations avec des dispositifs distants
CN110442699A (zh) 2013-06-09 2019-11-12 苹果公司 操作数字助理的方法、计算机可读介质、电子设备和系统
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
US9314924B1 (en) 2013-06-14 2016-04-19 Brain Corporation Predictive robotic controller apparatus and methods
US9792546B2 (en) 2013-06-14 2017-10-17 Brain Corporation Hierarchical robotic controller apparatus and methods
US9065765B2 (en) 2013-07-22 2015-06-23 Seven Networks, Inc. Proxy server associated with a mobile carrier for enhancing mobile traffic management in a mobile network
WO2015016723A1 (fr) * 2013-08-02 2015-02-05 Auckland Uniservices Limited Système d'animation neuro-comportemental
US9204288B2 (en) 2013-09-25 2015-12-01 At&T Mobility Ii Llc Intelligent adaptation of address books
US10175845B2 (en) * 2013-10-16 2019-01-08 3M Innovative Properties Company Organizing digital notes on a user interface
US9597797B2 (en) 2013-11-01 2017-03-21 Brain Corporation Apparatus and methods for haptic training of robots
US20150127593A1 (en) * 2013-11-06 2015-05-07 Forever Identity, Inc. Platform to Acquire and Represent Human Behavior and Physical Traits to Achieve Digital Eternity
US10296160B2 (en) 2013-12-06 2019-05-21 Apple Inc. Method for extracting salient dialog usage from live data
US20150162000A1 (en) * 2013-12-10 2015-06-11 Harman International Industries, Incorporated Context aware, proactive digital assistant
US9823811B2 (en) 2013-12-31 2017-11-21 Next It Corporation Virtual assistant team identification
US9358685B2 (en) * 2014-02-03 2016-06-07 Brain Corporation Apparatus and methods for control of robot actions based on corrective user inputs
US9984516B2 (en) 2014-02-11 2018-05-29 Gentex Corporation Systems and methods for adding a trainable transceiver to a vehicle
US20150248193A1 (en) * 2014-02-28 2015-09-03 Fuhu Holdings, Inc. Customized user interface for mobile computers
DE102015206263A1 (de) * 2014-04-10 2015-10-15 Ford Global Technologies, Llc Anwendungsvorhersage für kontextbezogene schnittstellen
CN106463044B (zh) 2014-04-18 2020-03-20 金泰克斯公司 可训练收发器和移动通信设备系统及方法
RU2580434C2 (ru) * 2014-05-22 2016-04-10 Общество С Ограниченной Ответственностью "Яндекс" Сервер и способ обработки электронных сообщений (варианты)
US9842101B2 (en) 2014-05-30 2017-12-12 Apple Inc. Predictive conversion of language input
US9760559B2 (en) 2014-05-30 2017-09-12 Apple Inc. Predictive text input
US10078631B2 (en) 2014-05-30 2018-09-18 Apple Inc. Entropy-guided text prediction using combined word and character n-gram language models
US9633004B2 (en) 2014-05-30 2017-04-25 Apple Inc. Better resolution when referencing to concepts
US9966065B2 (en) 2014-05-30 2018-05-08 Apple Inc. Multi-command single utterance input method
US9785630B2 (en) 2014-05-30 2017-10-10 Apple Inc. Text prediction using combined word N-gram and unigram language models
US10170123B2 (en) 2014-05-30 2019-01-01 Apple Inc. Intelligent assistant for home automation
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US9430463B2 (en) 2014-05-30 2016-08-30 Apple Inc. Exemplar-based natural language processing
US9807559B2 (en) 2014-06-25 2017-10-31 Microsoft Technology Licensing, Llc Leveraging user signals for improved interactions with digital personal assistant
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
US10659851B2 (en) 2014-06-30 2020-05-19 Apple Inc. Real-time digital assistant knowledge updates
US9959560B1 (en) 2014-08-26 2018-05-01 Intuit Inc. System and method for customizing a user experience based on automatically weighted criteria
US10446141B2 (en) 2014-08-28 2019-10-15 Apple Inc. Automatic speech recognition based on user feedback
US20160071517A1 (en) 2014-09-09 2016-03-10 Next It Corporation Evaluating Conversation Data based on Risk Factors
US11354755B2 (en) 2014-09-11 2022-06-07 Intuit Inc. Methods systems and articles of manufacture for using a predictive model to determine tax topics which are relevant to a taxpayer in preparing an electronic tax return
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US10789041B2 (en) 2014-09-12 2020-09-29 Apple Inc. Dynamic thresholds for always listening speech trigger
KR102319983B1 (ko) 2014-09-24 2021-11-01 삼성전자주식회사 정보제공 방법 및 그 방법을 처리하는 전자장치
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US9646609B2 (en) 2014-09-30 2017-05-09 Apple Inc. Caching apparatus for serving phonetic pronunciations
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US9886432B2 (en) 2014-09-30 2018-02-06 Apple Inc. Parsimonious handling of word inflection via categorical stem + suffix N-gram language models
US9630318B2 (en) 2014-10-02 2017-04-25 Brain Corporation Feature detection apparatus and methods for training of robotic navigation
US10096072B1 (en) 2014-10-31 2018-10-09 Intuit Inc. Method and system for reducing the presentation of less-relevant questions to users in an electronic tax return preparation interview process
US10255641B1 (en) 2014-10-31 2019-04-09 Intuit Inc. Predictive model based identification of potential errors in electronic tax return
KR101594946B1 (ko) * 2014-11-21 2016-02-17 스튜디오씨드코리아 주식회사 프로토타이핑 툴을 제공하기 위한 방법, 시스템 및 비일시성의 컴퓨터 판독 가능한 기록 매체
US10552013B2 (en) 2014-12-02 2020-02-04 Apple Inc. Data detection
US20160180352A1 (en) * 2014-12-17 2016-06-23 Qing Chen System Detecting and Mitigating Frustration of Software User
US10628894B1 (en) 2015-01-28 2020-04-21 Intuit Inc. Method and system for providing personalized responses to questions received from a user of an electronic tax return preparation system
US9717387B1 (en) 2015-02-26 2017-08-01 Brain Corporation Apparatus and methods for programming and training of robotic household appliances
US10152299B2 (en) 2015-03-06 2018-12-11 Apple Inc. Reducing response latency of intelligent automated assistants
US9865280B2 (en) 2015-03-06 2018-01-09 Apple Inc. Structured dictation using intelligent automated assistants
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US10567477B2 (en) 2015-03-08 2020-02-18 Apple Inc. Virtual assistant continuity
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US9899019B2 (en) 2015-03-18 2018-02-20 Apple Inc. Systems and methods for structured stem and suffix language models
US10065502B2 (en) * 2015-04-14 2018-09-04 Ford Global Technologies, Llc Adaptive vehicle interface system
US9842105B2 (en) 2015-04-16 2017-12-12 Apple Inc. Parsimonious continuous-space phrase representations for natural language processing
US10176534B1 (en) 2015-04-20 2019-01-08 Intuit Inc. Method and system for providing an analytics model architecture to reduce abandonment of tax return preparation sessions by potential customers
US10255258B2 (en) * 2015-04-23 2019-04-09 Avoka Technologies Pty Ltd Modifying an electronic form using metrics obtained from measuring user effort
US10740853B1 (en) 2015-04-28 2020-08-11 Intuit Inc. Systems for allocating resources based on electronic tax return preparation program user characteristics
US10460227B2 (en) 2015-05-15 2019-10-29 Apple Inc. Virtual assistant in a communication session
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US10504509B2 (en) 2015-05-27 2019-12-10 Google Llc Providing suggested voice-based action queries
US10127220B2 (en) 2015-06-04 2018-11-13 Apple Inc. Language identification from short strings
US10101822B2 (en) 2015-06-05 2018-10-16 Apple Inc. Language input correction
US9578173B2 (en) 2015-06-05 2017-02-21 Apple Inc. Virtual assistant aided communication with 3rd party service in a communication session
US11025565B2 (en) 2015-06-07 2021-06-01 Apple Inc. Personalized prediction of responses for instant messaging
US10255907B2 (en) 2015-06-07 2019-04-09 Apple Inc. Automatic accent detection using acoustic models
US10186254B2 (en) 2015-06-07 2019-01-22 Apple Inc. Context-based endpoint detection
US10296168B2 (en) * 2015-06-25 2019-05-21 Northrop Grumman Systems Corporation Apparatus and method for a multi-step selection interface
US20160378747A1 (en) 2015-06-29 2016-12-29 Apple Inc. Virtual assistant for media playback
KR101684454B1 (ko) * 2015-07-02 2016-12-08 주식회사 엘지씨엔에스 하이브리드 애플리케이션 및 이의 이벤트 처리 방법
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
US9665567B2 (en) 2015-09-21 2017-05-30 International Business Machines Corporation Suggesting emoji characters based on current contextual emotional state of user
CN106547582A (zh) * 2015-09-22 2017-03-29 阿里巴巴集团控股有限公司 一种预处理方法及装置
US9697820B2 (en) 2015-09-24 2017-07-04 Apple Inc. Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks
US11010550B2 (en) 2015-09-29 2021-05-18 Apple Inc. Unified language modeling framework for word prediction, auto-completion and auto-correction
US10366158B2 (en) 2015-09-29 2019-07-30 Apple Inc. Efficient word encoding for recurrent neural network language models
US11587559B2 (en) 2015-09-30 2023-02-21 Apple Inc. Intelligent device identification
US10740854B1 (en) 2015-10-28 2020-08-11 Intuit Inc. Web browsing and machine learning systems for acquiring tax data during electronic tax return preparation
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
CN105262905A (zh) * 2015-11-20 2016-01-20 小米科技有限责任公司 管理联系人的方法及装置
US10884718B2 (en) 2015-12-01 2021-01-05 Koninklijke Philips N.V. Device for use in improving a user interaction with a user interface application
US10471594B2 (en) * 2015-12-01 2019-11-12 Kindred Systems Inc. Systems, devices, and methods for the distribution and collection of multimodal data associated with robots
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
US10937109B1 (en) 2016-01-08 2021-03-02 Intuit Inc. Method and technique to calculate and provide confidence score for predicted tax due/refund
DE102017000063B4 (de) * 2016-01-14 2019-10-31 Fanuc Corporation Robotereinrichtung mit Lernfunktion
EP3206170A1 (fr) 2016-02-09 2017-08-16 Wipro Limited Système et procédés de création de l'automatisation de processus robotique à la demande
US10140356B2 (en) 2016-03-11 2018-11-27 Wipro Limited Methods and systems for generation and transmission of electronic information using real-time and historical data
US10446143B2 (en) 2016-03-14 2019-10-15 Apple Inc. Identification of voice inputs providing credentials
CN107229965B (zh) * 2016-03-25 2021-10-22 陕西微阅信息技术有限公司 智能机器人的拟人系统和模拟遗忘效果的方法
US10158593B2 (en) 2016-04-08 2018-12-18 Microsoft Technology Licensing, Llc Proactive intelligent personal assistant
US10757048B2 (en) 2016-04-08 2020-08-25 Microsoft Technology Licensing, Llc Intelligent personal assistant as a contact
WO2017181103A1 (fr) * 2016-04-14 2017-10-19 Motiv8 Technologies, Inc. Système de changement de comportement
US10410295B1 (en) 2016-05-25 2019-09-10 Intuit Inc. Methods, systems and computer program products for obtaining tax data
US9934775B2 (en) 2016-05-26 2018-04-03 Apple Inc. Unit-selection text-to-speech synthesis based on predicted concatenation parameters
US9972304B2 (en) 2016-06-03 2018-05-15 Apple Inc. Privacy preserving distributed evaluation framework for embedded personalized systems
US11227589B2 (en) 2016-06-06 2022-01-18 Apple Inc. Intelligent list reading
US10249300B2 (en) 2016-06-06 2019-04-02 Apple Inc. Intelligent list reading
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
DK179309B1 (en) 2016-06-09 2018-04-23 Apple Inc Intelligent automated assistant in a home environment
US10586535B2 (en) 2016-06-10 2020-03-10 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10192552B2 (en) 2016-06-10 2019-01-29 Apple Inc. Digital assistant providing whispered speech
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
US10490187B2 (en) 2016-06-10 2019-11-26 Apple Inc. Digital assistant providing automated status report
US10509862B2 (en) 2016-06-10 2019-12-17 Apple Inc. Dynamic phrase expansion of language input
DK201670540A1 (en) 2016-06-11 2018-01-08 Apple Inc Application integration with a digital assistant
DK179049B1 (en) 2016-06-11 2017-09-18 Apple Inc Data driven natural language event detection and classification
DK179415B1 (en) 2016-06-11 2018-06-14 Apple Inc Intelligent device arbitration and control
DK179343B1 (en) 2016-06-11 2018-05-14 Apple Inc Intelligent task discovery
CN106201833A (zh) * 2016-06-30 2016-12-07 北京小米移动软件有限公司 WiFi信号图标的展示方法、装置和移动终端
KR102577584B1 (ko) * 2016-08-16 2023-09-12 삼성전자주식회사 기계 번역 방법 및 장치
US10474753B2 (en) 2016-09-07 2019-11-12 Apple Inc. Language identification using recurrent neural networks
US10043516B2 (en) 2016-09-23 2018-08-07 Apple Inc. Intelligent automated assistant
US10268335B2 (en) * 2016-09-29 2019-04-23 Flipboard, Inc. Custom onboarding process for application functionality
DE102016118888A1 (de) * 2016-10-05 2018-04-05 Dr. Ing. H.C. F. Porsche Aktiengesellschaft Verfahren und Vorrichtung zur Steuerung eines Fahrzeugs
US10552742B2 (en) 2016-10-14 2020-02-04 Google Llc Proactive virtual assistant
US10802839B2 (en) * 2016-11-21 2020-10-13 Vmware, Inc. User interface customization based on user tendencies
US11281993B2 (en) 2016-12-05 2022-03-22 Apple Inc. Model and ensemble compression for metric learning
US10593346B2 (en) 2016-12-22 2020-03-17 Apple Inc. Rank-reduced token representation for automatic speech recognition
US11204787B2 (en) * 2017-01-09 2021-12-21 Apple Inc. Application integration with a digital assistant
US10963774B2 (en) * 2017-01-09 2021-03-30 Microsoft Technology Licensing, Llc Systems and methods for artificial intelligence interface generation, evolution, and/or adjustment
US20180247554A1 (en) * 2017-02-27 2018-08-30 Speech Kingdom Llc System and method for treatment of individuals on the autism spectrum by using interactive multimedia
US10636418B2 (en) 2017-03-22 2020-04-28 Google Llc Proactive incorporation of unsolicited content into human-to-computer dialogs
US9865260B1 (en) 2017-05-03 2018-01-09 Google Llc Proactive incorporation of unsolicited content into human-to-computer dialogs
DK201770383A1 (en) 2017-05-09 2018-12-14 Apple Inc. USER INTERFACE FOR CORRECTING RECOGNITION ERRORS
US10417266B2 (en) 2017-05-09 2019-09-17 Apple Inc. Context-aware ranking of intelligent response suggestions
US10395654B2 (en) 2017-05-11 2019-08-27 Apple Inc. Text normalization based on a data-driven learning network
US10726832B2 (en) 2017-05-11 2020-07-28 Apple Inc. Maintaining privacy of personal information
DK201770439A1 (en) 2017-05-11 2018-12-13 Apple Inc. Offline personal assistant
DK179745B1 (en) 2017-05-12 2019-05-01 Apple Inc. SYNCHRONIZATION AND TASK DELEGATION OF A DIGITAL ASSISTANT
US11301477B2 (en) 2017-05-12 2022-04-12 Apple Inc. Feedback analysis of a digital assistant
DK201770429A1 (en) 2017-05-12 2018-12-14 Apple Inc. LOW-LATENCY INTELLIGENT AUTOMATED ASSISTANT
DK179496B1 (en) 2017-05-12 2019-01-15 Apple Inc. USER-SPECIFIC Acoustic Models
DK201770431A1 (en) 2017-05-15 2018-12-20 Apple Inc. Optimizing dialogue policy decisions for digital assistants using implicit feedback
DK201770432A1 (en) 2017-05-15 2018-12-21 Apple Inc. Hierarchical belief states for digital assistants
DK179549B1 (en) 2017-05-16 2019-02-12 Apple Inc. FAR-FIELD EXTENSION FOR DIGITAL ASSISTANT SERVICES
US10303715B2 (en) 2017-05-16 2019-05-28 Apple Inc. Intelligent automated assistant for media exploration
US10311144B2 (en) 2017-05-16 2019-06-04 Apple Inc. Emoji word sense disambiguation
US10403278B2 (en) 2017-05-16 2019-09-03 Apple Inc. Methods and systems for phonetic matching in digital assistant services
US10657328B2 (en) 2017-06-02 2020-05-19 Apple Inc. Multi-task recurrent neural network architecture for efficient morphology handling in neural language modeling
US10812539B2 (en) * 2017-06-09 2020-10-20 International Business Machines Corporation Enhanced group communications with external participants
US10742435B2 (en) 2017-06-29 2020-08-11 Google Llc Proactive provision of new content to group chat participants
JP6218057B1 (ja) * 2017-07-14 2017-10-25 JE International Co., Ltd. Automatic response server device, terminal device, response system, response method, and program
US10409132B2 (en) 2017-08-30 2019-09-10 International Business Machines Corporation Dynamically changing vehicle interior
KR102499379B1 (ko) 2017-09-18 2023-02-13 Samsung Electronics Co., Ltd. Electronic device and method for obtaining feedback information thereof
US10445429B2 (en) 2017-09-21 2019-10-15 Apple Inc. Natural language understanding using vocabularies with compressed serialized tries
US10755051B2 (en) 2017-09-29 2020-08-25 Apple Inc. Rule-based natural language processing
CN111386553A (zh) * 2017-11-29 2020-07-07 Snap Inc. Graphics rendering for electronic messaging applications
US10636424B2 (en) 2017-11-30 2020-04-28 Apple Inc. Multi-turn canned dialog
US10765948B2 (en) 2017-12-22 2020-09-08 Activision Publishing, Inc. Video game content aggregation, normalization, and publication systems and methods
US10733982B2 (en) 2018-01-08 2020-08-04 Apple Inc. Multi-directional dialog
US11568236B2 (en) 2018-01-25 2023-01-31 The Research Foundation For The State University Of New York Framework and methods of diverse exploration for fast and safe policy improvement
US10671283B2 (en) * 2018-01-31 2020-06-02 Salesforce.Com, Inc. Systems, methods, and apparatuses for implementing intelligently suggested keyboard shortcuts for web console applications
US10733375B2 (en) 2018-01-31 2020-08-04 Apple Inc. Knowledge-based framework for improving natural language understanding
US10789959B2 (en) 2018-03-02 2020-09-29 Apple Inc. Training speaker recognition models for digital assistants
US10592604B2 (en) 2018-03-12 2020-03-17 Apple Inc. Inverse text normalization for automatic speech recognition
US10818288B2 (en) 2018-03-26 2020-10-27 Apple Inc. Natural assistant interaction
US10909331B2 (en) 2018-03-30 2021-02-02 Apple Inc. Implicit identification of translation payload with neural machine translation
US10474329B2 (en) 2018-04-09 2019-11-12 Capital One Services, Llc Selective generation and display of interfaces of a website or program
US11922283B2 (en) 2018-04-20 2024-03-05 H2O.Ai Inc. Model interpretation
US11386342B2 (en) * 2018-04-20 2022-07-12 H2O.Ai Inc. Model interpretation
US10928918B2 (en) 2018-05-07 2021-02-23 Apple Inc. Raise to speak
US11145294B2 (en) 2018-05-07 2021-10-12 Apple Inc. Intelligent automated assistant for delivering content from user experiences
US10984780B2 (en) 2018-05-21 2021-04-20 Apple Inc. Global semantic word embeddings using bi-directional recurrent neural networks
US11386266B2 (en) 2018-06-01 2022-07-12 Apple Inc. Text correction
DK180639B1 (en) 2018-06-01 2021-11-04 Apple Inc ATTENTION AWARE VIRTUAL ASSISTANT DISMISSAL
DK179822B1 (da) 2018-06-01 2019-07-12 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
US10892996B2 (en) 2018-06-01 2021-01-12 Apple Inc. Variable latency device coordination
DK201870355A1 (en) 2018-06-01 2019-12-16 Apple Inc. VIRTUAL ASSISTANT OPERATION IN MULTI-DEVICE ENVIRONMENTS
US10496705B1 (en) 2018-06-03 2019-12-03 Apple Inc. Accelerated task performance
WO2020018592A1 (fr) 2018-07-17 2020-01-23 Methodical Mind, Llc. Graphical user interface system
JP7099126B2 (ja) * 2018-07-25 2022-07-12 Seiko Epson Corporation Display control device and display control program
JP2020017839A (ja) 2018-07-25 2020-01-30 Seiko Epson Corporation Scanning system, scanning program, and machine learning device
CN112534449A (zh) * 2018-07-27 2021-03-19 Sony Corporation Information processing system, information processing method, and recording medium
JP2021182172A (ja) * 2018-07-31 Sony Group Corporation Information processing device, information processing method, and program
US11568175B2 (en) 2018-09-07 2023-01-31 Verint Americas Inc. Dynamic intent classification based on environment variables
US11010561B2 (en) 2018-09-27 2021-05-18 Apple Inc. Sentiment prediction from textual data
US10839159B2 (en) 2018-09-28 2020-11-17 Apple Inc. Named entity normalization in a spoken dialog system
US11170166B2 (en) 2018-09-28 2021-11-09 Apple Inc. Neural typographical error modeling via generative adversarial networks
US11462215B2 (en) 2018-09-28 2022-10-04 Apple Inc. Multi-modal inputs for voice commands
US10846105B2 (en) * 2018-09-29 2020-11-24 ILAN Yehuda Granot User interface advisor
US11196863B2 (en) 2018-10-24 2021-12-07 Verint Americas Inc. Method and system for virtual assistant conversations
US11475898B2 (en) 2018-10-26 2022-10-18 Apple Inc. Low-latency multi-speaker speech recognition
KR102639695B1 (ko) * 2018-11-02 2024-02-23 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
US11003999B1 (en) 2018-11-09 2021-05-11 Bottomline Technologies, Inc. Customized automated account opening decisioning using machine learning
US11597394B2 (en) 2018-12-17 2023-03-07 Sri International Explaining behavior by autonomous devices
US11638059B2 (en) 2019-01-04 2023-04-25 Apple Inc. Content playback on multiple devices
KR102305177B1 (ko) * 2019-01-22 2021-09-27 TVSTORM Co., Ltd. Platform for collecting information on AI entities and method of collecting information through the same
US11048385B2 (en) * 2019-02-14 2021-06-29 Toshiba Tec Kabushiki Kaisha Self-order processing system and control processing method
US11301780B2 (en) 2019-02-15 2022-04-12 Samsung Electronics Co., Ltd. Method and electronic device for machine learning based prediction of subsequent user interface layouts
US11409990B1 (en) 2019-03-01 2022-08-09 Bottomline Technologies (De) Inc. Machine learning archive mechanism using immutable storage
US11348573B2 (en) 2019-03-18 2022-05-31 Apple Inc. Multimodality in digital assistant systems
USD956087S1 (en) 2019-04-23 2022-06-28 Bottomline Technologies, Inc Display screen or portion thereof with a payment transaction graphical user interface
US11307752B2 (en) 2019-05-06 2022-04-19 Apple Inc. User configurable task triggers
US11423908B2 (en) 2019-05-06 2022-08-23 Apple Inc. Interpreting spoken requests
US11475884B2 (en) 2019-05-06 2022-10-18 Apple Inc. Reducing digital assistant latency when a language is incorrectly determined
DK201970509A1 (en) 2019-05-06 2021-01-15 Apple Inc Spoken notifications
US11140099B2 (en) 2019-05-21 2021-10-05 Apple Inc. Providing message response suggestions
DK180129B1 (en) 2019-05-31 2020-06-02 Apple Inc. USER ACTIVITY SHORTCUT SUGGESTIONS
US11289073B2 (en) 2019-05-31 2022-03-29 Apple Inc. Device text to speech
US11496600B2 (en) 2019-05-31 2022-11-08 Apple Inc. Remote execution of machine-learned models
US11360641B2 (en) 2019-06-01 2022-06-14 Apple Inc. Increasing the relevance of new available information
US11687807B1 (en) 2019-06-26 2023-06-27 Bottomline Technologies, Inc. Outcome creation based upon synthesis of history
US11436501B1 (en) 2019-08-09 2022-09-06 Bottomline Technologies, Inc. Personalization of a user interface using machine learning
US11747952B1 (en) 2019-08-09 2023-09-05 Bottomline Technologies Inc. Specialization of a user interface using machine learning
US11188923B2 (en) * 2019-08-29 2021-11-30 Bank Of America Corporation Real-time knowledge-based widget prioritization and display
KR20210036167A (ko) * 2019-09-25 2021-04-02 Hewlett-Packard Development Company, L.P. Test automation of an application
US11488406B2 (en) 2019-09-25 2022-11-01 Apple Inc. Text detection using global geometry estimators
KR102298070B1 (ko) * 2019-10-02 2021-09-07 Choi Dae-cheol Mobile device-based active artificial intelligence video character system
US11712627B2 (en) 2019-11-08 2023-08-01 Activision Publishing, Inc. System and method for providing conditional access to virtual gaming items
KR102349589B1 (ko) * 2020-01-13 2022-01-11 Korea Advanced Institute of Science and Technology Method and apparatus for determining a user's interaction preference type with an artificial intelligence agent using a smart terminal
US11099719B1 (en) * 2020-02-25 2021-08-24 International Business Machines Corporation Monitoring user interactions with a device to automatically select and configure content displayed to a user
US11386487B2 (en) 2020-04-30 2022-07-12 Bottomline Technologies, Inc. System for providing scores to customers based on financial data
US11929079B2 (en) 2020-10-27 2024-03-12 Samsung Electronics Co., Ltd Electronic device for managing user model and operating method thereof
US11663395B2 (en) 2020-11-12 2023-05-30 Accenture Global Solutions Limited Automated customization of user interface
USD1009055S1 (en) 2020-12-01 2023-12-26 Bottomline Technologies, Inc. Display screen with graphical user interface
KR20220131721A (ko) * 2021-03-22 2022-09-29 Samsung Electronics Co., Ltd. Electronic device for executing a routine based on content and operating method of the electronic device
US11893399B2 (en) 2021-03-22 2024-02-06 Samsung Electronics Co., Ltd. Electronic device for executing routine based on content and operating method of the electronic device
US20230179675A1 (en) * 2021-12-08 2023-06-08 Samsung Electronics Co., Ltd. Electronic device and method for operating thereof
WO2023212162A1 (fr) * 2022-04-28 2023-11-02 Theai, Inc. Artificial intelligence character models with goal-oriented behavior
US20230410191A1 (en) * 2022-06-17 2023-12-21 Truist Bank Chatbot experience to execute banking functions
DE102022118722A1 (de) 2022-07-26 2024-02-01 Cariad Se Adaptation device configured to adapt an operation of a vehicle control device, method, and vehicle
WO2024049415A1 (fr) * 2022-08-30 2024-03-07 Google Llc Smart asset suggestions based on both a previous phrase and whole asset performance

Family Cites Families (111)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5535321A (en) * 1991-02-14 1996-07-09 International Business Machines Corporation Method and apparatus for variable complexity user interface in a data processing system
US7242988B1 (en) * 1991-12-23 2007-07-10 Linda Irene Hoffberg Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
US6400996B1 (en) * 1999-02-01 2002-06-04 Steven M. Hoffberg Adaptive pattern recognition based control system and method
US6418424B1 (en) * 1991-12-23 2002-07-09 Steven M. Hoffberg Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
US5875108A (en) * 1991-12-23 1999-02-23 Hoffberg; Steven M. Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
US7006881B1 (en) * 1991-12-23 2006-02-28 Steven Hoffberg Media recording device with remote graphic user interface
US5388198A (en) * 1992-04-16 1995-02-07 Symantec Corporation Proactive presentation of automating features to a computer user
JPH0612401A (ja) * 1992-06-26 1994-01-21 Fuji Xerox Co Ltd Emotion simulation device
JPH07160462A (ja) * 1993-12-06 1995-06-23 Nissan Motor Co Ltd Screen display control device
JP3127084B2 (ja) * 1994-08-11 2001-01-22 Sharp Corporation Electronic secretary system
US5633484A (en) * 1994-12-26 1997-05-27 Motorola, Inc. Method and apparatus for personal attribute selection and management using a preference memory
US5726688A (en) * 1995-09-29 1998-03-10 Ncr Corporation Predictive, adaptive computer interface
US5821936A (en) * 1995-11-20 1998-10-13 Siemens Business Communication Systems, Inc. Interface method and system for sequencing display menu items
US5880731A (en) 1995-12-14 1999-03-09 Microsoft Corporation Use of avatars with automatic gesturing and bounded interaction in on-line chat session
EP0794647A1 (fr) * 1996-03-06 1997-09-10 Koninklijke Philips Electronics N.V. Screen telephone and menu management method for a screen telephone
US5676138A (en) * 1996-03-15 1997-10-14 Zawilinski; Kenneth Michael Emotional response analyzer system with multimedia display
US6804726B1 (en) * 1996-05-22 2004-10-12 Geovector Corporation Method and apparatus for controlling electrical devices in response to sensed conditions
US5727129A (en) * 1996-06-04 1998-03-10 International Business Machines Corporation Network system for profiling and actively facilitating user activities
US6021403A (en) * 1996-07-19 2000-02-01 Microsoft Corporation Intelligent user assistance facility
US5884029A (en) * 1996-11-14 1999-03-16 International Business Machines Corporation User interaction with intelligent virtual objects, avatars, which interact with other avatars controlled by different users
US5966126A (en) * 1996-12-23 1999-10-12 Szabo; Andrew J. Graphic user interface for database system
US6118888A (en) * 1997-02-28 2000-09-12 Kabushiki Kaisha Toshiba Multi-modal interface apparatus and method
JP3159242B2 (ja) * 1997-03-13 2001-04-23 NEC Corporation Emotion generation device and method
JP3393789B2 (ja) * 1997-05-20 2003-04-07 International Business Machines Corporation Information processing terminal
US6260192B1 (en) * 1997-06-02 2001-07-10 Sony Corporation Filtering system based on pattern of usage
US6292480B1 (en) * 1997-06-09 2001-09-18 Nortel Networks Limited Electronic communications manager
JPH11259446A (ja) * 1998-03-12 1999-09-24 Aqueous Research K.K. Agent device
JP3286575B2 (ja) * 1997-09-04 2002-05-27 Enix Corporation Video game device and recording medium storing a computer program
JP4158213B2 (ja) * 1997-10-23 2008-10-01 Casio Computer Co., Ltd. Communication system and method of controlling the system
KR19990047854A (ko) * 1997-12-05 1999-07-05 Jeong Seon-jong Intelligent user interface method for information retrieval based on metadata
KR100249859B1 (ko) * 1997-12-19 2000-03-15 Lee Kye-chul Test apparatus for the intelligent network application protocol of a next-generation intelligent network intelligent information provision system
EP1073941A2 (fr) * 1998-04-16 2001-02-07 Choice Logic Corporation Methods and apparatus for evaluating collective choices
US6483523B1 (en) * 1998-05-08 2002-11-19 Institute For Information Industry Personalized interface browser and its browsing method
US6121968A (en) * 1998-06-17 2000-09-19 Microsoft Corporation Adaptive menus
KR100306708B1 (ko) * 1998-10-09 2001-10-19 Oh Gil-rok Intelligent information provision system and call processing method of the system
US6845370B2 (en) * 1998-11-12 2005-01-18 Accenture Llp Advanced information gathering for targeted activities
JP4465560B2 (ja) 1998-11-20 2010-05-19 Sony Corporation Information display control device and information display control method for an information display control device
JP2000155750A (ja) * 1998-11-24 2000-06-06 Omron Corp Behavior generation device, behavior generation method, and recording medium storing a behavior generation program
US6963937B1 (en) * 1998-12-17 2005-11-08 International Business Machines Corporation Method and apparatus for providing configurability and customization of adaptive user-input filtration
US6842877B2 (en) * 1998-12-18 2005-01-11 Tangis Corporation Contextual responses based on automated learning techniques
US6317718B1 (en) * 1999-02-26 2001-11-13 Accenture Properties (2) B.V. System, method and article of manufacture for location-based filtering for shopping agent in the physical world
GB2348520B (en) * 1999-03-31 2003-11-12 Ibm Assisting user selection of graphical user interface elements
US6408187B1 (en) * 1999-05-14 2002-06-18 Sun Microsystems, Inc. Method and apparatus for determining the behavior of a communications device based upon environmental conditions
KR20010011752A (ko) * 1999-07-30 2001-02-15 Kim Jin-chan Apparatus for operating an intelligent information provision system on the Internet
JP2001084072A (ja) * 1999-09-09 2001-03-30 Fujitsu Ltd Help display device for guiding the next operation
GB2354609B (en) * 1999-09-25 2003-07-16 Ibm Method and system for predicting transactions
KR100648231B1 (ko) * 1999-10-19 2006-11-24 Samsung Electronics Co., Ltd. Portable computer having a pointing device using an auxiliary liquid crystal display panel with a touch screen, and method therefor
US6828992B1 (en) * 1999-11-04 2004-12-07 Koninklijke Philips Electronics N.V. User interface with dynamic menu option organization
US20050086239A1 (en) * 1999-11-16 2005-04-21 Eric Swann System or method for analyzing information organized in a configurable manner
KR100602332B1 (ko) * 1999-12-18 2006-07-14 KT Corporation Apparatus and method for using avatars in a communication system
US6826540B1 (en) * 1999-12-29 2004-11-30 Virtual Personalities, Inc. Virtual human interface for conducting surveys
JP2001203811A (ja) * 2000-01-19 2001-07-27 Index K.K. Mobile communication system
JP2003522569A (ja) * 2000-02-11 2003-07-29 Malone, Dean Gerald Anthony Gaming method and gaming apparatus
WO2001069380A2 (fr) * 2000-03-14 2001-09-20 Edapta, Inc. System and method for enabling dynamically adaptable user interfaces for electronic devices
US20040006473A1 (en) * 2002-07-02 2004-01-08 Sbc Technology Resources, Inc. Method and system for automated categorization of statements
EP1314102B1 (fr) * 2000-04-02 2009-06-03 Tangis Corporation Thematic response to a computer user's context, in particular by means of a portable personal computer
US7228327B2 (en) * 2000-05-08 2007-06-05 Hoshiko Llc Method and apparatus for delivering content via information retrieval devices
KR20010111127 (ko) * 2000-06-08 2001-12-17 Park Kyu-jin Human-shaped clock capable of interactive conversation via a communication interface, data provision system, and Internet business method using the same
KR100383391B1 (ko) * 2000-06-28 2003-05-12 Kim Ji-han Voice recognition service system and method
US8495679B2 (en) * 2000-06-30 2013-07-23 Thomson Licensing Method and apparatus for delivery of television programs and targeted de-coupled advertising
WO2002010892A2 (fr) * 2000-07-28 2002-02-07 Symbian Limited Computing device with improved user interface for menus
JP2002073233A (ja) * 2000-08-29 2002-03-12 Pineapple Company K.K. Processing method, processing system, processing device, processing support device, and recording medium
KR100426280B1 (ko) * 2000-09-27 2004-04-08 Gomid Co., Ltd. Network-based user behavior pattern analysis system and method
US20030046401A1 (en) * 2000-10-16 2003-03-06 Abbott Kenneth H. Dynamically determining appropriate computer user interfaces
US6731307B1 (en) * 2000-10-30 2004-05-04 Koninklijke Philips Electronics N.V. User interface/entertainment device that simulates personal interaction and responds to user's mental state and/or personality
US20020133347A1 (en) * 2000-12-29 2002-09-19 Eberhard Schoneburg Method and apparatus for natural language dialog interface
JP2002215278A (ja) * 2001-01-16 2002-07-31 Mitsubishi Electric Corp User interface generation device and user interface generation method
US7158913B2 (en) * 2001-01-31 2007-01-02 Mobigence, Inc. Automatic activation of touch sensitive screen in a hand held computing device
US7089499B2 (en) * 2001-02-28 2006-08-08 International Business Machines Corporation Personalizing user interfaces across operating systems
US6701144B2 (en) * 2001-03-05 2004-03-02 Qualcomm Incorporated System for automatically configuring features on a mobile telephone based on geographic location
JP3672023B2 (ja) * 2001-04-23 2005-07-13 NEC Corporation Program recommendation system and program recommendation method
EP1256875A1 (fr) * 2001-05-10 2002-11-13 Nokia Corporation Method and device for context-dependent user input prediction
US7313621B2 (en) * 2001-05-15 2007-12-25 Sony Corporation Personalized interface with adaptive content presentation
US20020180786A1 (en) * 2001-06-04 2002-12-05 Robert Tanner Graphical user interface with embedded artificial intelligence
US20050193335A1 (en) * 2001-06-22 2005-09-01 International Business Machines Corporation Method and system for personalized content conditioning
US20030040850A1 (en) * 2001-08-07 2003-02-27 Amir Najmi Intelligent adaptive optimization of display navigation and data sharing
US20030030666A1 (en) * 2001-08-07 2003-02-13 Amir Najmi Intelligent adaptive navigation optimization
JP3545370B2 (ja) * 2001-08-17 2004-07-21 Japan Vistec Co., Ltd. Character control system for television
US20030093419A1 (en) * 2001-08-17 2003-05-15 Srinivas Bangalore System and method for querying information using a flexible multi-modal interface
KR20030021525 (ko) * 2001-09-06 2003-03-15 Yoo Ju-sung Independent three-dimensional character interface unique to the user
US7725554B2 (en) * 2001-09-28 2010-05-25 Quanta Computer, Inc. Network object delivery system for personal computing device
US7437344B2 (en) * 2001-10-01 2008-10-14 L'oreal S.A. Use of artificial intelligence in providing beauty advice
US6761442B2 (en) * 2001-11-02 2004-07-13 International United Technology Co., Ltd. Ink container with improved ink flow
KR100580617B1 (ko) * 2001-11-05 2006-05-16 Samsung Electronics Co., Ltd. Object growth control system and method
US6912386B1 (en) * 2001-11-13 2005-06-28 Nokia Corporation Method for controlling operation of a mobile device by detecting usage situations
US20030090515A1 (en) * 2001-11-13 2003-05-15 Sony Corporation And Sony Electronics Inc. Simplified user interface by adaptation based on usage history
US7457735B2 (en) * 2001-11-14 2008-11-25 Bentley Systems, Incorporated Method and system for automatic water distribution model calibration
US20030147369A1 (en) * 2001-12-24 2003-08-07 Singh Ram Naresh Secure wireless transfer of data between different computing devices
US7136909B2 (en) * 2001-12-28 2006-11-14 Motorola, Inc. Multimodal communication method and apparatus with multimodal profile
AU2003202148A1 (en) * 2002-01-04 2003-07-30 Ktfreetel Co., Ltd. Method and device for providing one button-service in mobile terminal
US20030128236A1 (en) * 2002-01-10 2003-07-10 Chen Meng Chang Method and system for a self-adaptive personal view agent
KR100580618B1 (ko) * 2002-01-23 2006-05-16 Samsung Electronics Co., Ltd. Apparatus and method for recognizing user emotion through short-term monitoring of physiological signals
EP1478982B1 (fr) * 2002-02-27 2014-11-05 Y Indeed Consulting L.L.C. System and method for facilitating the customization of multimedia content
US7203909B1 (en) * 2002-04-04 2007-04-10 Microsoft Corporation System and methods for constructing personalized context-sensitive portal pages or views by analyzing patterns of users' information access activities
US6731323B2 (en) * 2002-04-10 2004-05-04 International Business Machines Corporation Media-enhanced greetings and/or responses in communication systems
US7512906B1 (en) * 2002-06-04 2009-03-31 Rockwell Automation Technologies, Inc. System and methodology providing adaptive interface in an industrial controller environment
KR100465797B1 (ko) * 2002-07-12 2005-01-13 Samsung Electronics Co., Ltd. Portable computer and control method thereof
US7401295B2 (en) * 2002-08-15 2008-07-15 Simulearn, Inc. Computer-based learning system
US7668885B2 (en) * 2002-09-25 2010-02-23 MindAgent, LLC System for timely delivery of personalized aggregations of, including currently-generated, knowledge
KR20040048548 (ko) * 2002-12-03 2004-06-10 Kim Sang-soo User-customized search method and system using an intelligent database and a search editing program
US7874983B2 (en) * 2003-01-27 2011-01-25 Motorola Mobility, Inc. Determination of emotional and physiological states of a recipient of a communication
US7443971B2 (en) * 2003-05-05 2008-10-28 Microsoft Corporation Computer system with do not disturb system and method
US7409639B2 (en) * 2003-06-19 2008-08-05 Accenture Global Services Gmbh Intelligent collaborative media
KR100576933B1 (ko) * 2003-10-13 2006-05-10 Electronics and Telecommunications Research Institute Apparatus and method for providing location-based information using an intelligent web agent
US7454608B2 (en) * 2003-10-31 2008-11-18 International Business Machines Corporation Resource configuration in multi-modal distributed computing systems
US20050108406A1 (en) * 2003-11-07 2005-05-19 Dynalab Inc. System and method for dynamically generating a customized menu page
US7983920B2 (en) * 2003-11-18 2011-07-19 Microsoft Corporation Adaptive computing environment
US20050131856A1 (en) * 2003-12-15 2005-06-16 O'dea Paul J. Method and system for adaptive user interfacing with an imaging system
US20050266866A1 (en) * 2004-05-26 2005-12-01 Motorola, Inc. Feature finding assistant on a user interface
US20060165092A1 (en) * 2004-12-23 2006-07-27 Agovo Communications, Inc. Out-of-band signaling system, method and computer program product
US7539654B2 (en) * 2005-01-21 2009-05-26 International Business Machines Corporation User interaction management using an ongoing estimate of user interaction skills

Also Published As

Publication number Publication date
KR20050025222A (ko) 2005-03-14
KR100680190B1 (ko) 2007-02-08
EP1528464A2 (fr) 2005-05-04
CN1619470A (zh) 2005-05-25
BR0318494A (pt) 2006-09-12
KR20060101447A (ko) 2006-09-25
MXPA06002131A (es) 2006-05-31
JP2005085256A (ja) 2005-03-31
KR100721518B1 (ko) 2007-05-23
IL174117A0 (en) 2006-08-01
AU2003288790A1 (en) 2005-03-29
KR100703531B1 (ko) 2007-04-03
WO2005025081A1 (fr) 2005-03-17
CA2540397A1 (fr) 2005-03-17
KR20060101449A (ko) 2006-09-25
US20050054381A1 (en) 2005-03-10
RU2353068C2 (ru) 2009-04-20
KR20060110247A (ko) 2006-10-24
KR20060101448A (ko) 2006-09-25
EP1522918A2 (fr) 2005-04-13
KR100720023B1 (ko) 2007-05-18
CN1312554C (zh) 2007-04-25
KR20050025220A (ko) 2005-03-14
EP1528464A3 (fr) 2007-01-31
RU2006110932A (ru) 2007-10-20
CN1652063A (zh) 2005-08-10
JP2005100390A (ja) 2005-04-14
EP1522918A3 (fr) 2007-04-04
UA84439C2 (ru) 2008-10-27
KR100642432B1 (ko) 2006-11-10
KR100724930B1 (ko) 2007-06-04
AU2003288790B2 (en) 2009-02-19

Similar Documents

Publication Publication Date Title
EP1528464B1 (fr) Proactive user interface having an evolving agent
US8990688B2 (en) Proactive user interface including evolving agent
US7725419B2 (en) Proactive user interface including emotional agent
EP1522920B1 (fr) Proactive user interface with an emotional agent
CA2536233C (fr) Proactive user interface including an evolving agent
US7711778B2 (en) Method for transmitting software robot message

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20040906

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL HR LT LV MK

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL HR LT LV MK

17Q First examination report despatched

Effective date: 20070511

AKX Designation fees paid

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SAMSUNG ELECTRONICS CO., LTD.

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 616012

Country of ref document: AT

Kind code of ref document: T

Effective date: 20130615

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602004042331

Country of ref document: DE

Effective date: 20130801

REG Reference to a national code

Ref country code: NL

Ref legal event code: T3

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 616012

Country of ref document: AT

Kind code of ref document: T

Effective date: 20130605

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130906

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130916

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130905

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20131007

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

26N No opposition filed

Effective date: 20140306

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602004042331

Country of ref document: DE

Effective date: 20140306

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20140530

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130930

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130906

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130605

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130906

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20040906

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20170822

Year of fee payment: 14

Ref country code: GB

Payment date: 20170823

Year of fee payment: 14

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20180821

Year of fee payment: 15

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602004042331

Country of ref document: DE

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20180906

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190402

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180906

REG Reference to a national code

Ref country code: NL

Ref legal event code: MM

Effective date: 20191001

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191001