WO2008004813A1 - System and method for managing a character entity in a mobile terminal - Google Patents

System and method for managing a character entity in a mobile terminal

Info

Publication number
WO2008004813A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
character
character entity
characteristic data
communication device
Prior art date
Application number
PCT/KR2007/003252
Other languages
French (fr)
Inventor
Min Hwa Lee
Original Assignee
Min Hwa Lee
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Min Hwa Lee filed Critical Min Hwa Lee
Publication of WO2008004813A1 publication Critical patent/WO2008004813A1/en

Classifications

    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04M TELEPHONIC COMMUNICATION
          • H04M 1/00 Substation equipment, e.g. for use by subscribers
            • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
              • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
                • H04M 1/72403 User interfaces with means for local support of applications that increase the functionality
                  • H04M 1/72427 User interfaces with means for supporting games or graphical animations
          • H04M 2201/00 Electronic components, circuits, software, systems or apparatus used in telephone systems
            • H04M 2201/34 Microprocessors
            • H04M 2201/36 Memories
            • H04M 2201/38 Displays
    • A HUMAN NECESSITIES
      • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
            • A61B 5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
            • A61B 5/01 Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
            • A61B 5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
              • A61B 5/024 Detecting, measuring or recording pulse rate or heart rate
              • A61B 5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
            • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
              • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
            • A61B 5/12 Audiometering
              • A61B 5/121 Audiometering evaluating hearing capacity
            • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
              • A61B 5/165 Evaluating the state of mind, e.g. depression, anxiety
            • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
              • A61B 5/316 Modalities, i.e. specific diagnostic methods
                • A61B 5/318 Heart-related electrical modalities, e.g. electrocardiography [ECG]
            • A61B 5/40 Detecting, measuring or recording for evaluating the nervous system
              • A61B 5/4005 Evaluating the sensory system
                • A61B 5/4017 Evaluating sense of taste
    • G PHYSICS
      • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
        • G16H HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
          • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities, or for the management or operation of medical equipment or devices
            • G16H 40/60 ICT for the operation of medical equipment or devices
              • G16H 40/63 ICT for the operation of medical equipment or devices for local operation

Definitions

  • the present disclosure relates to managing a character entity in a mobile terminal, and more particularly to creating and updating a character entity representative of a user of a mobile terminal.
  • Mobile terminals are widely used to store contact information about others to facilitate identification of persons whose information has been stored in the mobile terminal.
  • Mobile terminals can also be used to store users' personal information such as their names, phone numbers, etc.
  • Some mobile terminals allow the users to express themselves by an "avatar." For example, a user may use a mobile terminal to select an avatar as a representation of the user.
  • the avatar is a digital representation of the user in the mobile terminal's virtual environment and is typically a virtual character representation of the user displayed on the mobile terminal.
  • the mobile terminal's avatar can be sent to the other terminals as a representation of the sender.
  • Conventional avatars used in mobile terminals are relatively simple characters that are typically manually selected by users.
  • conventional avatars used in mobile terminals are typically character figures (e.g., an animal, a cartoon, a three-dimensional picture of the user, etc.) selected according to user preferences. Users may select avatars such as character figures (e.g., a smiling face, sad face, angry face, etc.) with a desired expression or behavior to reflect their current state or personality. The selected avatar is then displayed on the mobile terminal and communicated to others.
  • the present disclosure is directed to a mobile terminal capable of automatically generating and updating a character entity representing a state of the user based on user characteristic data such as biological signals and personal information of the user.
  • the mobile terminal may provide a physical representation of the character entity in a plurality of forms such as avatar, color, sound, etc.
  • the character entity of the mobile terminal can be changed according to changes in the user characteristics such as biological signals or personal information.
  • a mobile terminal includes an input interface and a memory.
  • the input interface receives user characteristic data of a user and the memory stores the received user characteristic data.
  • the mobile terminal is configured to generate a character entity representing a state of the user based on the received user characteristic data.
  • a system for synthesizing a character entity includes an input device and a server.
  • the input device is configured to receive character entities from at least two mobile terminals.
  • the server is configured to synthesize a new character entity based on the received character entities.
  • the server may send the synthesized character entity to the mobile terminals or another terminal for storage or further processing.
  • a system for synthesizing a character entity includes a memory, an input device, and a processing device.
  • the memory stores a character entity and the input device is configured to receive another character entity from a mobile terminal.
  • the processing device is configured to synthesize a new character entity based on the stored character entity and the received character entity.
  • a method for generating a character entity in a mobile terminal includes receiving a first set of user characteristic data of a user and generating a character entity based on the received first set of user characteristic data with the character entity representing a state of the user.
  • the character entity is stored in a memory of the mobile terminal.
  • a method for synthesizing a character entity includes uploading at least two character entities from at least two mobile terminals to a server connected to the mobile terminals. A new character entity is synthesized based on the character entities uploaded to the server.
  • a mobile terminal capable of displaying a plurality of colors includes a memory, and a processing module.
  • the memory receives and stores the user characteristic data of the user.
  • the processing module is configured to receive and analyze the user characteristic data from the memory, and in response to the received user characteristic data, assigns a color to at least a portion of the mobile terminal.
  • a method for changing colors in a mobile terminal capable of displaying a plurality of colors includes receiving user characteristic data of a user and assigning a color to at least a portion of the mobile terminal in response to the received user characteristic data. The assigned color is displayed on the portion of the mobile terminal.
  • FIG. 1 shows a mobile terminal in accordance with one embodiment of the present disclosure
  • FIG. 2 shows a detailed block diagram of information stored in a memory and used to generate a character entity in accordance with one embodiment of the present disclosure
  • FIG. 3 shows a mobile terminal that can display colors based on user characteristics in accordance with another embodiment of the present disclosure
  • Fig. 4 describes a mobile terminal that can display colors based on user characteristics in accordance with still another embodiment of the present disclosure
  • FIG. 5 illustrates a method of creating a character entity in a mobile terminal in accordance with one embodiment of the present disclosure
  • FIG. 6 depicts a method of updating a character entity in a mobile terminal in accordance with one embodiment of the present disclosure
  • Fig. 7 illustrates a system for synthesizing character entities in accordance with one embodiment of the present disclosure.
  • Fig. 8 shows a method of synthesizing character entities in accordance with one embodiment of the present disclosure.
  • the present disclosure provides a system and a method for generating and modifying a character entity representing the state of a user on a mobile terminal.
  • the character entity characterizes the user's state based on biological data acquired from the user and/or other user characteristics (e.g., personal information such as schedule, a pattern of telephonic communication, messages exchanged with other users, etc.).
  • the mobile terminal may provide a physical representation of the character entity such as an avatar, color, sound, etc.
  • the mobile terminal may transfer the avatar to or from another mobile terminal or a server computer.
  • the mobile terminal is made of materials such as electro-luminescent materials that can change colors according to biological data or other user characteristics.
  • an avatar is typically a graphical or virtual character representation such as a smiley face or a figure of a person.
  • an "avatar" is not so limited and may also include representation by means of animation, sound, color, vibration or any combination thereof.
  • FIG. 1 illustrates an exemplary mobile terminal 110. As shown, the mobile terminal 110 includes an input interface 113 for receiving biological signals ("bio-signals") of a user from a bio-sensor 130 through a communication link 140.
  • the bio-sensor 130 includes one or more sensors for detecting bio-signals of a user, which will be described later in detail.
  • the mobile terminal 110 may also receive bio-signals in the form of biological data from a remote bio-sensor, a server, or other mobile terminals through a wired or wireless communication link.
  • a memory 116 is coupled to receive and store the bio- signals from the input interface 113.
  • the memory 116 may also store other user characteristics such as user's schedule, a pattern of telephonic communication, messages exchanged with other users, etc.
  • the mobile terminal 110 keeps track of the user's schedule, call patterns (e.g., call destination/source, number of such calls, number of unanswered calls, message destination/source, number of such messages, etc.) and stores the data in the memory 116.
  • the user characteristics data (e.g., user's bio-signal and/or personal information) is processed in a character module 114 coupled to the memory 116.
  • the character module 114 may be implemented using a processor (e.g., CPU), an ASIC, or any suitable processing unit employed in a mobile terminal such as a mobile phone, a PDA, etc.
  • based on the received data, the character module 114 generates a character entity (e.g., avatar, color, sound, vibration, etc.) that reflects the user's state for display or output on an LCD display unit 117 or other components of the mobile terminal. For example, if the user's biological data indicates a high pulse rate, the character entity may be depicted with a pulsating heart.
  • if the user's schedule indicates a full schedule for a given day, the character entity may be shown as a running figure with a briefcase. Further, if the user's data indicates an unhealthy state, the character entity may be displayed as being bedridden. Additionally or alternatively, the character entities may be assigned a color representing the user's state. In the case of an unhealthy user state, for example, the character module 114 may assign a red color to the character entity for display.
  • the term "color" means not only chrominance (e.g., red, blue, green), but also luminance (e.g., grey, black, white).
  • the character module 114 also modifies or updates the character entity as new user characteristics data are received. If the user's new biological data indicates an improvement in user's health status, for example, the character module 114 processes the information to modify the character entity to reflect the improvement. In the case of the bedridden character, the character entity may be shown in its healthy state without the bed.
  • the mobile terminal 110 may communicate the character entity and user characteristics data with external devices such as other mobile terminals, servers, or computers via a data communication unit 112.
  • the data communication unit 112 converts the character entity and user characteristics data for communication via an antenna 111 with the external devices. Such transfer of data may be performed, for example, through a cellular network, conventional mobile Internet, or direct communication (e.g., Bluetooth) between two terminals.
  • the mobile terminal 110 may also include an I/O interface 115 for interfacing with a portable memory card 120 to store or retrieve the character entity and/or user characteristics data.
  • the mobile terminal 110 may be any type of personal communication device having mobile communication capability, such as a mobile phone, a PDA (personal digital assistant), or a notebook computer.
  • the external case for the mobile terminal is made of an organic electro-luminescence (EL) or inorganic EL material that emits and changes color or light intensity in response to user characteristics data.
  • Fig. 2 illustrates one embodiment of memory 116, depicting various information and modules used to generate a character entity 200.
  • the character entity 200 includes personal information 210, biological information 220, an emotion model 230, a physical model 240, and a behavior engine 250.
  • the emotion model 230 represents the current emotional state of a user while the physical model 240 represents the physical state of the user.
  • the behavior engine 250 models the behavior of the character entity based on attributes from the emotion and physical models 230 and 240.
  • the character entity 200 is stored temporarily in the memory 116 of the mobile terminal 110 for access by the character module 114 in processing the character entity 200 in real-time.
  • the personal information 210 is maintained and updated by the character module 114.
  • the personal information 210 includes user information such as an address book (e.g., names or a group of names, corresponding telephone numbers, email addresses and homepage addresses), a schedule (e.g., titles of meetings, meeting places, starting and ending times, attendees, relevant memos) and a pattern of telephonic communication (e.g., counter-parties' names and telephone numbers, starting time and duration of telephone calls, short messages exchanged with other users, number of calls or messages to and from others, number of missed calls or messages, etc.).
  • the personal information 210 gathered by the character module 114 is provided to the emotion model 230 and the behavior engine 250, which perform estimation of emotion and behavior based on the information.
  • although not described herein, any methodology or theory developed in mental and behavioral psychology may be applied to implement the emotion model 230 and the behavior engine 250.
  • the emotion model 230 and the physical model 240 are created and updated by using biological information 220 obtained through the bio-sensor 130.
  • the physical model 240 represents a virtual physiological model of the user by modeling various organs of the user based on the biological information 220 (e.g., biological signals, data, etc.).
  • the attributes stored in the physical model 240 are used to geometrically model an avatar, which represents the character entity 200 in a virtual world.
  • the personal information is used for the emotional model and the behavior engine and the biological information is used for the emotional and physical models in the above example, each model can be created or updated using other types of data as well (e.g., in response to the personal information such as a number of incoming calls, the physical model may model a tired state of the user.)
  • the mobile terminal 110 receives the biological information 220 of the user through the bio-sensor 130.
  • the bio-sensor 130 may be embedded in the mobile terminal 110 or communicate with the mobile terminal 110 through the communication link 140. In an alternative configuration, the bio-sensor 130 may be coupled to the mobile terminal 110 through a hard-wired connection, although the communication link 140 is preferably implemented as a wireless connection in most cases.
  • the bio-sensor 130 includes an auditory sensor 131 such as a microphone, a visual sensor 132 such as a camera, a taste sensor 133 and an olfactory sensor 134, e.g., manufactured using nanotechnology, or a motion sensor 135 such as a MEMS (micro-electro-mechanical systems) accelerometer.
  • the biological information 220 of the user may be obtained by measuring bio-signals through a thermometer 136, a pulse sensor 137 or an ECG (electrocardiogram) sensor 138.
  • although Fig. 1 illustrates a number of different bio-sensors that can be employed in the mobile terminal 110, the bio-sensor 130 may employ any number of sensors or any other types of sensors that can detect the user's physical signs.
  • the biological information obtained by the bio-sensor 130 is transferred to the memory 116 through the input interface 113. Accessing the biological information and personal information stored in the memory 116, the character module 114 processes the information to create and update the emotion model 230 and the physical model 240 of the character entity 200 as described above.
  • the character module 114 analyzes the voice data of the user, which is obtained through the auditory sensor 131, to generate information on the health state of the user.
  • the character module 114 may use any well-known algorithm to analyze the voice to model the health state of the user. According to well-known theories of oriental medicine and phonetics, the health status of a person can be estimated by analyzing the person's voice.
  • the taste sensor 133 and the olfactory sensor 134, which may be implemented using a laminate metal oxide sensor, nano bio-sensor, etc., are used for detecting the taste or scent of wine, food, spoiled food, counterfeit liquors, dangerous materials, etc.
  • the character module 114 compares the taste and scent information obtained by the sensors 133 and 134 with predetermined values to update corresponding attributes of the physical model 240, which determines behavior of the character entity 200 for the source of the information.
  • the character entity 200 including the emotion and physical models 230 and 240 and the behavior engine 250 stored in the memory 116 may be transferred to a portable memory card 120.
  • the portable memory card 120 may be detachable from the mobile terminal 110 and attached to another mobile terminal. As such, the user can transfer and maintain the character entity 200 in a new mobile terminal without having to recreate a character entity.
  • the character entity 200 may further include graphical or animation data, in which the emotion and physical models 230 and 240 are depicted in the form of an avatar.
  • the avatar can be updated or modified in response to changes in the emotion and physical models 230 and 240 over time as well as outputs of the behavior engine 250.
  • the character module 114 uses the graphical or animation data included in the character entity 200 to visualize the avatar on the LCD display unit 117.
  • Fig. 3 shows an embodiment in which the mobile terminal 110 can display colors based on user characteristics.
  • the mobile terminal 110 includes a case 118 made of electro-luminescent (EL) material such as organic or inorganic EL material.
  • the case 118 made of such material can change the color or intensity of emitted light according to an input signal (e.g., a voltage or current signal).
  • although the case 118 is described as being made of EL material, only a specified portion or portions of the case of the mobile terminal may instead be made of EL material to display colors representing the user's state.
  • the character module 114 generates a color input signal for the case 118 based on the attributes in the emotion model 230, the physical model 240, and the behavior engine 250.
  • the character entity 200 may include color information for display on the case 118.
  • the character module 114 analyzes the user characteristics data and generates a color signal for input to the case 118. If the user characteristics data indicates an unhealthy state, a color signal corresponding to a red color may be generated and provided to the case 118. For example, if the user's voice (which may be input through the bio-sensor 130 or an internal microphone on the mobile terminal 110) indicates an unhealthy state, the character module 114 outputs a color signal indicating a red color.
  • in response to the color signal, the case 118 displays the red color on the mobile terminal 110. Subsequently, if the user's voice indicates a healthy state, a color signal indicating a blue color may be provided to the case 118 for display. By displaying and changing colors according to user characteristics data, the user's state is easily and automatically represented on the mobile terminal.
  • Fig. 4 shows an embodiment in which mobile terminal 400 can display colors based on user characteristics.
  • the mobile terminal 400 includes a case 410 made of EL material.
  • the case 410 is made of a material that can change the color or intensity of emitted light according to an input signal.
  • although the case 410 is described as being made of EL material, only a specified portion or portions of the case or mobile terminal may instead be made of EL material to display colors representing the user's state.
  • the processing module 420 analyzes various bio-signals input through a sensor such as a bio-sensor 430 or an internal microphone 440 and generates an output signal indicating, e.g., the current emotional or health state of the user. Such an output signal of the processing module 420 is used as a color signal for input to the case 410, indicating different colors depending on the emotional or health state of the user. In response to the color signal, the case 410 displays a certain color corresponding to the color signal on the mobile terminal 400.
  • Fig. 5 illustrates a flowchart of one embodiment of a method for generating a character entity in the mobile terminal 110.
  • the mobile terminal 110 gathers and stores user characteristics data such as personal information 210 and/or biological information 220 from the bio-sensor 130.
  • the personal information 210 includes user information such as address book, schedule, a pattern of telephonic communication, etc., which is stored in the memory 116.
  • the biological information 220 includes bio-signals and data indicating the current emotional and physical state of the user, including the voice of the user input through the auditory sensor 131, images taken by the visual sensor 132, taste and scent data obtained by the taste sensor 133 and the olfactory sensor 134, etc., as discussed above.
  • based on the user characteristics data, the mobile terminal 110 creates a character entity 200 in operation 520. Specifically, the character module 114 processes the personal information 210 and/or biological information 220 from the memory 116 to generate the character entity 200 including the emotion and physical models 230 and 240 and the behavior engine 250 representing the state of the user.
  • after generating the character entity 200, the mobile terminal 110 stores the created character entity 200 in the portable memory card 120 in operation 530.
  • the character entity 200 can be transferred to another mobile terminal, computer, or a server connected to the mobile terminal 110 through wired/wireless connections. As such, the user of the mobile terminal 110 can easily transfer and recreate the character entity 200 representing his or her state on another device.
  • Fig. 6 illustrates a flowchart showing one embodiment of a method for updating or modifying a character entity in the mobile terminal 110.
  • the mobile terminal 110 determines if any instructions are received to update the character entity 200 in operation 610.
  • Character entity 200 may be updated periodically or aperiodically as set by the user, or updated automatically whenever new user characteristics data are received.
  • the mobile terminal 110 loads the character entity 200 from the portable memory card 120 into the memory 116 in operation 620. Although the character entity 200 is described as being loaded from and stored to the portable memory card 120, the character entity 200 may instead be updated in the memory 116 without involving the portable memory card 120.
  • the mobile terminal 110 analyzes new user characteristics data
  • the mobile terminal 110 updates or modifies the character entity 200 based on the new information.
  • the character module 114 processes the personal information 210 and/or biological information 220 to update the emotion and physical models 230 and 240 and the behavior engine 250 included in the character entity 200.
  • the character entity 200 preferably includes color information for updating or modifying the color on the case 118 or a portion of the case.
  • the mobile terminal 110 stores the updated character entity 200 from the memory 116 to the portable memory card 120 in operation 650.
  • the generated character entities may be synthesized to generate a new character entity.
  • Fig. 7 illustrates one embodiment of a system for synthesizing a plurality of character entities from a plurality of mobile terminals 712 to 716.
  • a server 730 receives character entities from the mobile terminals 712 and 714 through a wired/wireless network 720.
  • the network 720 may be the Internet, campus/enterprise intranet, a wide area network (WAN), a local area network (LAN), or any other type of network or inter-network.
  • this idea can be applied to networks that use any of a variety of communication techniques, including datagram based networks (e.g., the Internet), connection based (e.g., X.25) networks, virtual circuit based (e.g., Asynchronous Transfer Mode (ATM)) networks, etc.
  • datagram based networks e.g., the Internet
  • connection based networks e.g., X.25
  • virtual circuit based e.g., Asynchronous Transfer Mode (ATM)
  • the server 730 receives the character entities and synthesizes the character entities to create a new character entity.
  • the server 730 transmits the new character entity to a recipient such as the mobile terminal 716 through the network 720.
  • the information on the recipient terminal 716, such as its address, may be provided to the server 730 by at least one of the mobile terminals 712 and 714.
  • although the server 730 is described as serving as an agent for synthesizing the character entities sent from the mobile terminals 712 and 714, the process for synthesizing the character entities may be executed in any one of the mobile terminals 712 to 716.
  • the mobile terminal 712 may receive the character entity from the mobile terminal 714, e.g., through a wireless connection such as infrared or Bluetooth, and then synthesize the character entity of the mobile terminal 714 with the one stored in its memory to produce a new character entity. After synthesizing the new character entity, the mobile terminal 712 stores the newly created character entity in its memory or sends it to another mobile terminal.
  • Fig. 8 depicts a flowchart showing one embodiment of a method for synthesizing character entities.
  • in operation 810, at least two mobile terminals 712 and 714 upload their character entities to a server 730.
  • upon receiving the character entities, the server 730 synthesizes the character entities to create a new character entity in operation 820.
  • the server 730 preferably uses a genetic combination method.
  • attributes in the emotion and physical models 230 and 240 of the character entity 200 may be treated as genes, each of which has a value indicating a single characteristic of the character entity 200, e.g., ranging from a minimum value to a maximum value.
  • the genetic combination method computes a weighted sum of genes obtained from at least two character entities to generate genes of a new character entity.
  • the weighted values may be adjusted by a user using a graphical user interface designed for character entity customization. Alternatively, the weighted values may be determined using a pseudorandom number generator. By employing such a genetic combination method, a variation of a previously generated character entity may be generated; a code sketch of this combination follows this list.
  • upon generating a new character entity, the server 730 transmits the created character entity to a recipient mobile terminal in operation 830.
  • the process for synthesizing the character entities may be executed in one of the mobile terminals 712 and 714 having the original character entities or the terminal 716, which may store the newly created character entity in its memory or transfer it to another mobile terminal.
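
To make the genetic combination concrete, the following is a minimal Python sketch of the weighted-sum scheme described above. It is an assumed implementation, not text from the patent: the gene names, the [0, 1] value range, and the normalization of the pseudorandom weights are illustrative choices.

    import random
    from typing import Dict, List, Optional

    Genes = Dict[str, float]   # gene name -> value within [lo, hi]

    def synthesize(parents: List[Genes],
                   weights: Optional[List[float]] = None,
                   lo: float = 0.0, hi: float = 1.0) -> Genes:
        """Combine parent character entities gene-by-gene with a weighted sum."""
        if weights is None:
            # Weights may instead come from a customization UI; here they are
            # drawn from a pseudorandom number generator and normalized.
            raw = [random.random() for _ in parents]
            total = sum(raw)
            weights = [r / total for r in raw]
        child: Genes = {}
        for name in parents[0]:
            value = sum(w * p[name] for w, p in zip(weights, parents))
            child[name] = max(lo, min(hi, value))   # clamp to the gene's range
        return child

    # Example: the server combines two uploaded character entities.
    child = synthesize([{"health": 0.9, "arousal": 0.2},
                        {"health": 0.4, "arousal": 0.8}])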

Abstract

A system and a method for creating and updating a character entity representing emotional and physical characteristics of a user of a mobile terminal are disclosed. The character entity characterizes a user's emotional and physical state based on personal information of the user (e.g., schedule, a pattern of telephonic communication, messages exchanged with other users) and/or biological data acquired from the user. The character entity may be saved, e.g., in a portable memory card, and transferred from a mobile terminal to another mobile terminal. The character entity may be visualized in a mobile terminal through an avatar or as a color displayed on the mobile terminal.

Description

SYSTEM AND METHOD FOR MANAGING A CHARACTER ENTITY IN A MOBILE TERMINAL
Technical Field
[1] The present disclosure relates to managing a character entity in a mobile terminal, and more particularly to creating and updating a character entity representative of a user of a mobile terminal.
[2]
[3] BACKGROUND
[4] Advances in wireless communications and processing power of mobile terminals
(e.g., mobile phones, PDAs, etc.) have allowed users to communicate a wide variety of information such as voice, music, pictures, video, etc. Mobile terminals are widely used to store contact information about others to facilitate identification of persons whose information has been stored in the mobile terminal. In a mobile phone, for example, when a call is received from a person whose contact information is stored, the person's information (e.g., phone number, name, etc.) is displayed on the mobile phone's display. Mobile terminals can also be used to store users' personal information such as their names, phone numbers, etc.
[5] Some mobile terminals allow the users to express themselves by an "avatar." For example, a user may use a mobile terminal to select an avatar as a representation of the user. The avatar is a digital representation of the user in the mobile terminal's virtual environment and is typically a virtual character representation of the user displayed on the mobile terminal. When communicating with other mobile terminals, the mobile terminal's avatar can be sent to the other terminals as a representation of the sender.
[6] Conventional avatars used in mobile terminals, however, are relatively simple characters that are typically manually selected by users. For example, conventional avatars used in mobile terminals are typically character figures (e.g., an animal, a cartoon, a three-dimensional picture of the user, etc.) selected according to user preferences. Users may select avatars such as character figures (e.g., a smiling face, sad face, angry face, etc.) with a desired expression or behavior to reflect their current state or personality. The selected avatar is then displayed on the mobile terminal and communicated to others.
[7] Conventional mobile terminals also allow users to express themselves through the use of colors. For example, some mobile terminals provide inserts of a variety of colors. Users change the color of the terminals by changing the inserts according to their preferences.
[8] These conventional techniques, however, typically require users to manually select the avatars or colors to express or represent themselves. Since the avatars and colors are manually selected by the user according to the user's preferences, they may not accurately and realistically represent the user's state or personality at a given time. In addition, whenever the user's state or preference changes, the avatars and colors of the mobile terminals must be manually reconfigured.
[9] Thus, there is a need for a mobile terminal that can provide a realistic representation of a user's state or characteristics at a given time. Further, it would be desirable for the mobile terminal to automatically create and modify such a representation in response to a change in the user's state or personal characteristics.
[10]
[11] SUMMARY
[12] The present disclosure is directed to a mobile terminal capable of automatically generating and updating a character entity representing a state of the user based on user characteristic data such as biological signals and personal information of the user. The mobile terminal may provide a physical representation of the character entity in a plurality of forms such as avatar, color, sound, etc. The character entity of the mobile terminal can be changed according to changes in the user characteristics such as biological signals or personal information.
[13] In one embodiment, a mobile terminal includes an input interface and a memory. The input interface receives user characteristic data of a user and the memory stores the received user characteristic data. The mobile terminal is configured to generate a character entity representing a state of the user based on the received user characteristic data.
[14] In another embodiment, a system for synthesizing a character entity includes an input device and a server. The input device is configured to receive character entities from at least two mobile terminals. The server is configured to synthesize a new character entity based on the received character entities. The server may send the synthesized character entity to the mobile terminals or another terminal for storage or further processing.
[15] In still another embodiment, a system for synthesizing a character entity includes a memory, an input device, and a processing device. The memory stores a character entity and the input device is configured to receive another character entity from a mobile terminal. The processing device is configured to synthesize a new character entity based on the stored character entity and the received character entity.
[16] In yet another embodiment, a method for generating a character entity in a mobile terminal includes receiving a first set of user characteristic data of a user and generating a character entity based on the received first set of user characteristic data with the character entity representing a state of the user. The character entity is stored in a memory of the mobile terminal.
[17] In another embodiment, a method for synthesizing a character entity includes uploading at least two character entities from at least two mobile terminals to a server connected to the mobile terminals. A new character entity is synthesized based on the character entities uploaded to the server.
[18] In yet another embodiment, a mobile terminal capable of displaying a plurality of colors includes a memory, and a processing module. The memory receives and stores the user characteristic data of the user. The processing module is configured to receive and analyze the user characteristic data from the memory, and in response to the received user characteristic data, assigns a color to at least a portion of the mobile terminal.
[19] In a further embodiment, a method for changing colors in a mobile terminal capable of displaying a plurality of colors includes receiving user characteristic data of a user and assigning a color to at least a portion of the mobile terminal in response to the received user characteristic data. The assigned color is displayed on the portion of the mobile terminal.
[20]
Brief Description of the Drawings
[21] The disclosure may best be understood by reference to the following description taken in conjunction with the following figures:
[22] Fig. 1 shows a mobile terminal in accordance with one embodiment of the present disclosure;
[23] Fig. 2 shows a detailed block diagram of information stored in a memory and used to generate a character entity in accordance with one embodiment of the present disclosure;
[24] Fig. 3 shows a mobile terminal that can display colors based on user characteristics in accordance with another embodiment of the present disclosure;
[25] Fig. 4 describes a mobile terminal that can display colors based on user characteristics in accordance with still another embodiment of the present disclosure;
[26] Fig. 5 illustrates a method of creating a character entity in a mobile terminal in accordance with one embodiment of the present disclosure;
[27] Fig. 6 depicts a method of updating a character entity in a mobile terminal in accordance with one embodiment of the present disclosure;
[28] Fig. 7 illustrates a system for synthesizing character entities in accordance with one embodiment of the present disclosure; and
[29] Fig. 8 shows a method of synthesizing character entities in accordance with one embodiment of the present disclosure.
[30]
[31] DETAILED DESCRIPTION
[32] In the following description, numerous specific details are set forth. It will be apparent, however, that these embodiments may be practiced without some or all of these specific details. In other instances, well known process steps or elements have not been described in detail in order not to unnecessarily obscure the disclosure.
[33] The present disclosure provides a system and a method for generating and modifying a character entity representing the state of a user on a mobile terminal. The character entity characterizes the user's state based on biological data acquired from the user and/or other user characteristics (e.g., personal information such as schedule, a pattern of telephonic communication, messages exchanged with other users, etc.). Once the character entity is created, the mobile terminal may provide a physical representation of the character entity such as an avatar, color, sound, etc. The mobile terminal may transfer the avatar to or from another mobile terminal or a server computer. In one embodiment, the mobile terminal (e.g., mobile phone) is made of materials such as electro-luminescent materials that can change colors according to biological data or other user characteristics. Thus, the disclosed embodiments automatically provide a realistic characterization of the user's state on a mobile terminal. An avatar is typically a graphical or virtual character representation such as a smiley face or a figure of a person. However, as used herein, an "avatar" is not so limited and may also include representation by means of animation, sound, color, vibration or any combination thereof.
[34] Fig. 1 illustrates an exemplary mobile terminal 110. As shown, the mobile terminal
110 includes an input interface 113 for receiving biological signals ("bio-signals") of a user from a bio-sensor 130 through a communication link 140. The bio-sensor 130 includes one or more sensors for detecting bio-signals of a user, which will be described later in detail. The mobile terminal 110 may also receive bio-signals in the form of biological data from a remote bio-sensor, a server, or other mobile terminals through a wired or wireless communication link.
[35] In the mobile terminal 110, a memory 116 is coupled to receive and store the bio- signals from the input interface 113. In addition, the memory 116 may also store other user characteristics such as user's schedule, a pattern of telephonic communication, messages exchanged with other users, etc. For example, the mobile terminal 110 keeps track of the user's schedule, call patterns (e.g., call destination/source, number of such calls, number of unanswered calls, message destination/source, number of such messages, etc.) and stores the data in the memory 116.
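As a rough illustration (not from the patent) of the call-pattern data the terminal might accumulate, the sketch below logs call records and derives one simple statistic; the record fields and helper names are assumptions.

    from dataclasses import dataclass
    import time

    @dataclass
    class CallRecord:
        counterparty: str      # name or number of the other party
        direction: str         # "incoming" or "outgoing"
        answered: bool
        start_time: float      # epoch seconds
        duration_s: float

    call_log: list = []

    def log_call(counterparty: str, direction: str,
                 answered: bool, duration_s: float) -> None:
        """Record one call event in the terminal's memory."""
        call_log.append(CallRecord(counterparty, direction, answered,
                                   time.time(), duration_s))

    def unanswered_count() -> int:
        """Number of unanswered calls, one input to the user's modeled state."""
        return sum(1 for c in call_log if not c.answered)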
[36] The user characteristics data (e.g., user's bio-signal and/or personal information) is processed in a character module 114 coupled to the memory 116. The character module 114 may be implemented using a processor (e.g., CPU), an ASIC, or any suitable processing unit employed in a mobile terminal such as a mobile phone, a PDA, etc. Based on the received data, the character module 114 generates a character entity (e.g., avatar, color, sound, vibration, etc.) that reflects the user's state for display or output on an LCD display unit 117 or other components of the mobile terminal. For example, if the user's biological data indicates a high pulse rate, the character entity may be depicted with a pulsating heart. Similarly, if the user's schedule indicates a full schedule for a given day, the character entity may be shown as a running figure with a briefcase. Further, if the user's data indicates an unhealthy state, the character entity may be displayed as being bedridden. Additionally or alternatively, the character entities may be assigned a color representing the user's state. In the case of an unhealthy user state, for example, the character module 114 may assign a red color to the character entity for display. As used herein, the term "color" means not only chrominance (e.g., red, blue, green), but also luminance (e.g., grey, black, white).
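The following minimal sketch illustrates one way a character module might map such data to a displayable character state, using the pulse-rate, full-schedule, and unhealthy-state examples above; the thresholds, field names, and CharacterState type are assumptions, not the patent's method.

    from dataclasses import dataclass

    @dataclass
    class CharacterState:
        figure: str   # e.g., "pulsating_heart", "running_with_briefcase"
        color: str    # e.g., "red" for an unhealthy state

    def derive_character_state(pulse_rate_bpm: float,
                               appointments_today: int,
                               health_score: float) -> CharacterState:
        """Map raw user characteristic data to a displayable character state.

        health_score is assumed to be a 0.0 (unhealthy) .. 1.0 (healthy)
        estimate produced elsewhere (e.g., from bio-signal analysis).
        """
        if health_score < 0.3:
            return CharacterState(figure="bedridden", color="red")
        if pulse_rate_bpm > 100:
            return CharacterState(figure="pulsating_heart", color="orange")
        if appointments_today >= 5:
            return CharacterState(figure="running_with_briefcase", color="yellow")
        return CharacterState(figure="relaxed", color="blue")   # healthy default

    # Example: a busy but healthy user.
    state = derive_character_state(pulse_rate_bpm=72, appointments_today=6,
                                   health_score=0.9)
    print(state)   # CharacterState(figure='running_with_briefcase', color='yellow')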
[37] Once generated, the character module 114 also modifies or updates the character entity as new user characteristics data are received. If the user's new biological data indicates an improvement in user's health status, for example, the character module 114 processes the information to modify the character entity to reflect the improvement. In the case of the bedridden character, the character entity may be shown in its healthy state without the bed.
[38] The mobile terminal 110 may communicate the character entity and user characteristics data with external devices such as other mobile terminals, servers, or computers via a data communication unit 112. The data communication unit 112 converts the character entity and user characteristics data for communication via an antenna 111 with the external devices. Such transfer of data may be performed, for example, through a cellular network, conventional mobile Internet, or direct communication (e.g., Bluetooth) between two terminals. The mobile terminal 110 may also include an I/O interface 115 for interfacing with a portable memory card 120 to store or retrieve the character entity and/or user characteristics data.
[39] The mobile terminal 110 may be any type of personal communication device having mobile communication capability, such as a mobile phone, a PDA (personal digital assistant), or a notebook computer. In one embodiment, the external case for the mobile terminal is made of an organic electro-luminescence (EL) or inorganic EL material that emits and changes color or light intensity in response to user characteristics data.
[40] Fig. 2 illustrates one embodiment of memory 116, depicting various information and modules used to generate a character entity 200. The character entity 200 includes personal information 210, biological information 220, an emotion model 230, a physical model 240, and a behavior engine 250. The emotion model 230 represents the current emotional state of a user while the physical model 240 represents the physical state of the user. The behavior engine 250 models the behavior of the character entity based on attributes from the emotion and physical models 230 and 240. As shown in Fig. 2, the character entity 200 is stored temporarily in the memory 116 of the mobile terminal 110 for access by the character module 114 in processing the character entity 200 in real-time.
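A schematic sketch of how the character entity 200 and its components could be laid out as data structures follows; the patent names the components but not their fields, so the concrete attributes below are assumptions.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class PersonalInfo:                 # personal information 210
        address_book: Dict[str, str] = field(default_factory=dict)  # name -> number
        schedule: List[str] = field(default_factory=list)           # meeting titles
        call_log: List[str] = field(default_factory=list)           # call pattern

    @dataclass
    class BiologicalInfo:               # biological information 220
        pulse_rate_bpm: float = 0.0
        body_temperature_c: float = 36.5
        ecg_samples: List[float] = field(default_factory=list)

    @dataclass
    class EmotionModel:                 # emotion model 230: current emotional state
        valence: float = 0.0            # -1.0 (negative) .. 1.0 (positive)
        arousal: float = 0.0            # 0.0 (calm) .. 1.0 (excited)

    @dataclass
    class PhysicalModel:                # physical model 240: physical state
        health: float = 1.0             # 0.0 (unhealthy) .. 1.0 (healthy)
        fatigue: float = 0.0

    @dataclass
    class CharacterEntity:              # character entity 200
        personal: PersonalInfo
        biological: BiologicalInfo
        emotion: EmotionModel
        physical: PhysicalModel

        def behavior(self) -> str:
            """Behavior engine 250: derive a behavior from model attributes."""
            if self.physical.health < 0.3:
                return "rest_in_bed"
            if self.emotion.arousal > 0.7:
                return "run"
            return "idle"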
[41] The personal information 210 is maintained and updated by the character module
114. The personal information 210 includes user information such as an address book (e.g., names or a group of names, corresponding telephone numbers, email addresses and homepage addresses), a schedule (e.g., titles of meetings, meeting places, starting and ending times, attendees, relevant memos) and a pattern of telephonic communication (e.g., counter-parties' names and telephone numbers, starting time and duration of telephone calls, short messages exchanged with other users, number of calls or messages to and from others, number of missed calls or messages, etc.). The personal information 210 gathered by the character module 114 is provided to the emotion model 230 and the behavior engine 250, which perform estimation of emotion and behavior based on the information. Although not described herein, any methodology or theory developed in mental and behavioral psychology may be applied to implement the emotion model 230 and the behavior engine 250.
[42] The emotion model 230 and the physical model 240 are created and updated using the biological information 220 obtained through the bio-sensor 130. Particularly, the physical model 240 represents a virtual physiological model of the user by modeling various organs of the user based on the biological information 220 (e.g., biological signals, data, etc.). The attributes stored in the physical model 240 are used to geometrically model an avatar, which represents the character entity 200 in a virtual world. Although the personal information is used for the emotion model and the behavior engine and the biological information is used for the emotion and physical models in the above example, each model can be created or updated using other types of data as well (e.g., in response to personal information such as a number of incoming calls, the physical model may model a tired state of the user).
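As a hedged illustration of feeding biological information into the emotion and physical models, the sketch below maps two bio-signals to model attributes; the heuristics (heart rate to arousal, fever to reduced health) are assumed stand-ins for whatever psychological or physiological models an implementation would use.

    def update_models(emotion: dict, physical: dict,
                      pulse_rate_bpm: float, body_temperature_c: float) -> None:
        """Update emotion/physical attributes from new bio-signal readings."""
        # A pulse above a 60 bpm resting baseline is read as higher arousal.
        emotion["arousal"] = min(1.0, max(0.0, (pulse_rate_bpm - 60.0) / 80.0))
        # A temperature above 37.5 C lowers the modeled health attribute.
        fever = max(0.0, body_temperature_c - 37.5)
        physical["health"] = max(0.0, min(1.0, 1.0 - 0.4 * fever))

    # Example: a fast pulse and mild fever raise arousal and lower health.
    emotion, physical = {"valence": 0.0, "arousal": 0.0}, {"health": 1.0}
    update_models(emotion, physical, pulse_rate_bpm=110.0, body_temperature_c=38.2)
    # emotion["arousal"] is now 0.625; physical["health"] is about 0.72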
[43] Turning back to Fig. 1, the mobile terminal 110 receives the biological information
220 of the user through the bio-sensor 130. The bio-sensor 130 may be embedded in the mobile terminal 110 or communicate with the mobile terminal 110 through the communication link 140. In an alternative configuration, the bio-sensor 130 may be coupled to the mobile terminal 110 through a hard-wired connection, although the communication link 140 is preferably implemented as a wireless connection in most cases.
[44] The bio-sensor 130 includes an auditory sensor 131 such as a microphone, a visual sensor 132 such as a camera, a taste sensor 133 and an olfactory sensor 134, e.g., manufactured using nanotechnology, or a motion sensor 135 such as a MEMS (micro-electro-mechanical systems) accelerometer. Also, the biological information 220 of the user may be obtained by measuring bio-signals through a thermometer 136, a pulse sensor 137 or an ECG (electrocardiogram) sensor 138. Although Fig. 1 illustrates a number of different bio-sensors that can be employed in the mobile terminal 110, the bio-sensor 130 may employ any number of sensors or any other types of sensors that can detect the user's physical signs.
[45] The biological information obtained by the bio-sensor 130 is transferred to the memory 116 through the input interface 113. Accessing the biological information and personal information stored in the memory 116, the character module 114 processes the information to create and update the emotion model 230 and the physical model 240 of the character entity 200 as described above.
[46] For example, the character module 114 analyzes the voice data of the user, which is obtained through the auditory sensor 131, to generate information on the health state of the user. In this case, the character module 114 may use any well-known algorithm to analyze the voice to model the health state of the user. According to well-known theories of oriental medicine and phonetics, the health status of a person can be estimated by analyzing the person's voice.
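The patent does not specify the voice-analysis algorithm, so the sketch below uses two simple stand-in features, short-time energy and zero-crossing rate, to produce a rough health score from voice samples; a real implementation would substitute a proper speech-analysis method.

    from typing import Sequence

    def voice_health_score(samples: Sequence[float]) -> float:
        """Crude 0.0 (weak/hoarse) .. 1.0 (strong/clear) score from voice samples."""
        if len(samples) < 2:
            return 0.5                     # not enough data: neutral score
        n = len(samples)
        energy = sum(s * s for s in samples) / n
        # Zero-crossing rate: a rough proxy for voicing quality/noisiness.
        zcr = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0) / (n - 1)
        energy_score = min(1.0, energy / 0.1)    # assumed full-scale energy
        zcr_penalty = abs(zcr - 0.1) * 2.0       # assumed "healthy" ZCR near 0.1
        return max(0.0, min(1.0, energy_score - zcr_penalty))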
[47] Further, the taste sensor 133 and the olfactory sensor 134, which may be implemented using a laminate metal oxide sensor, nano bio-sensor, etc., are used for detecting the taste or scent of wine, food, spoiled food, counterfeit liquors, dangerous materials, etc. The character module 114 compares the taste and scent information obtained by the sensors 133 and 134 with predetermined values to update corresponding attributes of the physical model 240, which determines the behavior of the character entity 200 for the source of the information.
[48] The character entity 200 including the emotion and physical models 230 and 240 and the behavior engine 250 stored in the memory 116 may be transferred to a portable memory card 120. The portable memory card 120 may be detachable from the mobile terminal 110 and attached to another mobile terminal. As such, the user can transfer and maintain the character entity 200 in a new mobile terminal without having to recreate a character entity.
[49] It should be appreciated that the character entity 200 may further include graphical or animation data, in which the emotion and physical models 230 and 240 are depicted in the form of an avatar. The avatar can be updated or modified in response to changes in the emotion and physical models 230 and 240 over time as well as outputs of the behavior engine 250. The character module 114 uses the graphical or animation data included in the character entity 200 to visualize the avatar on the LCD display unit 117.
[50] Fig. 3 shows an embodiment in which the mobile terminal 110 can display colors based on user characteristics. In this embodiment, the mobile terminal 110 includes a case 118 made of electro-luminescent (EL) material such as organic or inorganic EL material. As is well known in the art, a case made of such material can change the color or intensity of emitted light according to an input signal (e.g., a voltage or current signal). Although the case 118 is described as being made of EL material, only a specified portion or portions of the case of the mobile terminal may instead be made of EL material to display colors representing the user's state.
[51] The character module 114 generates a color input signal for the case 118 based on the attributes in the emotion model 230, the physical model 240, and the behavior engine 250. The character entity 200 may include color information for display on the case 118. In one embodiment, the character module 114 analyzes the user characteristics data and generates a color signal for input to the case 118. If the user characteristics data indicates an unhealthy state, a color signal corresponding to a red color may be generated and provided to the case 118. For example, if the user's voice (which may be input through the bio-sensor 130 or an internal microphone on the mobile terminal 110) indicates an unhealthy state, the character module 114 outputs a color signal indicating a red color. In response to the color signal, the case 118 displays the red color on the mobile terminal 110. Subsequently, if the user's voice indicates a healthy state, a color signal indicating a blue color may be provided to the case 118 for display. By displaying and changing colors according to user characteristics data, the user's state is easily and automatically represented on the mobile terminal.
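A minimal sketch of the red-to-blue color mapping described above follows; the RGB encoding and the set_case_color driver, which would convert the triple into the EL input signal, are assumptions.

    def state_to_color(health_score: float) -> tuple:
        """Interpolate from red (0.0, unhealthy) to blue (1.0, healthy)."""
        h = max(0.0, min(1.0, health_score))
        return (int(255 * (1.0 - h)), 0, int(255 * h))

    def update_case_color(health_score: float, set_case_color) -> None:
        # set_case_color is a hypothetical driver converting an RGB triple
        # into the voltage/current input signal for the EL case material.
        set_case_color(state_to_color(health_score))

    # Example: an unhealthy reading drives the case toward red.
    update_case_color(0.2, set_case_color=print)   # prints (204, 0, 51)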
[52] Fig. 4 shows another embodiment in which a mobile terminal 400 can display colors based on user characteristics. In this embodiment, the mobile terminal 400 includes a case 410 made of EL material. As discussed above, the case 410 is made of a material that can change the color or intensity of the emitted light according to an input signal. Although the case 410 is described as being made of EL material, only a specified portion, or several portions, of the case or mobile terminal may be made of EL material to display colors representing the user's state.
[53] The processing module 420 analyzes various bio-signals input through a sensor such as a bio-sensor 430 or an internal microphone 440 and generates an output signal indicating, e.g., the current emotional or health state of the user. This output signal of the processing module 420 is used as a color signal for input to the case 410, indicating different colors depending on the emotional or health state of the user. In response to the color signal, the case 410 displays the corresponding color on the mobile terminal 400.
[54] Fig. 5 illustrates a flowchart of one embodiment of a method for generating a character entity in the mobile terminal 110. In operation 510, the mobile terminal 110 gathers and stores user characteristics data such as personal information 210 and/or biological information 220 from the bio-sensor 130. As mentioned above with reference to Fig. 2, the personal information 210 includes user information such as an address book, a schedule, a pattern of telephonic communication, etc., which is stored in the memory 116. Further, the biological information 220 includes bio-signals and data indicating the current emotional and physical state of the user, such as the voice of the user inputted through the auditory sensor 131, images taken by the visual sensor 132, and taste and scent data obtained by the taste sensor 133 and the olfactory sensor 134, as discussed above.
[55] Based on the user characteristics data, the mobile terminal 110 creates a character entity 200 in operation 520. Specifically, the character module 114 processes the personal information 210 and/or biological information 220 from the memory 116 to generate the character entity 200 including the emotion and physical models 230 and 240 and the behavior engine 250 representing the state of the user.
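One plausible, purely illustrative in-memory layout for the character entity 200 and the creation step of operation 520 is sketched below; every field name and derivation rule is invented, as the description leaves the data structures open.

```python
from dataclasses import dataclass, field

@dataclass
class CharacterEntity:
    """Illustrative container for the character entity 200: emotion model
    230, physical model 240, and a minimal stand-in behavior engine 250."""
    emotion_model: dict = field(default_factory=lambda: {"joy": 0.5, "calm": 0.5})
    physical_model: dict = field(default_factory=lambda: {"vitality": 0.5})
    color: tuple = (0, 0, 255)  # optional color information (see paragraph [59])

    def behave(self) -> str:
        # Stand-in behavior engine: derive a behavior from model attributes.
        if self.physical_model.get("vitality", 0.5) < 0.3:
            return "rest"
        return "greet" if self.emotion_model.get("joy", 0.5) > 0.5 else "idle"

def create_character_entity(personal_info: dict, bio_info: dict) -> CharacterEntity:
    """Operation 520: derive initial model attributes from user
    characteristics data. The derivation rules here are invented."""
    entity = CharacterEntity()
    entity.emotion_model["joy"] = bio_info.get("mood_score", 0.5)
    entity.physical_model["vitality"] = bio_info.get("health_score", 0.5)
    return entity
```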
[56] After generating the character entity 200, the mobile terminal 110 stores the created character entity 200 in the portable memory card 120 in operation 530. The character entity 200 can be transferred to another mobile terminal, computer, or a server connected to the mobile terminal 110 through wired/wireless connections. As such, the user of the mobile terminal 110 can easily transfer and recreate the character entity 200 representing his or her state on another device.
[57] Fig. 6 illustrates a flowchart showing one embodiment of a method for updating or modifying a character entity in the mobile terminal 110. After a character entity has been generated and stored on the mobile terminal 110, the mobile terminal 110 determines in operation 610 whether any instructions to update the character entity 200 have been received. The character entity 200 may be updated periodically or aperiodically as set by the user, or updated automatically whenever new user characteristics data are received.
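The update triggers named here — user-scheduled periodic updates, aperiodic manual updates, and automatic updates on arrival of new data — reduce to a simple predicate such as the following sketch; the parameter names are assumptions.

```python
import time

def should_update(last_update: float, period_s, new_data_arrived: bool,
                  auto_update: bool) -> bool:
    """Operation 610: decide whether the character entity 200 needs updating.

    period_s: user-set update period in seconds, or None when the user
    has chosen aperiodic (manual-only) updates."""
    if auto_update and new_data_arrived:
        return True  # update whenever new user characteristics data arrive
    if period_s is not None and time.time() - last_update >= period_s:
        return True  # user-scheduled periodic update is due
    return False
```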
[58] If the character entity 200 needs to be updated, the mobile terminal 110 loads the character entity 200 from the portable memory card 120 into the memory 116 in operation 620. Although the character entity 200 is described as being loaded from and stored back to the portable memory card 120, it may instead be updated directly in the memory 116 without involving the portable memory card 120.
[59] In operation 630, the mobile terminal 110 analyzes new user characteristics data (e.g., personal information 210 and/or biological information 220 from the bio-sensor 130). Then, in operation 640, the mobile terminal 110 updates or modifies the character entity 200 based on the new information. In particular, the character module 114 processes the personal information 210 and/or biological information 220 to update the emotion and physical models 230 and 240 and the behavior engine 250 included in the character entity 200. In this operation, the character entity 200 preferably includes color information for updating or modifying the color of the case 118 or a portion of the case. The mobile terminal 110 stores the updated character entity 200 from the memory 116 to the portable memory card 120 in operation 650.
[60] One embodiment of a system and a method for synthesizing a plurality of character entities obtained from a plurality of mobile terminals will be explained in detail with reference to Figs. 7 and 8.
[61] The generated character entities may be synthesized to generate a new character entity. Fig. 7 illustrates one embodiment of a system for synthesizing a plurality of character entities from a plurality of mobile terminals 712 to 716. As shown in Fig. 7, a server 730 receives character entities from the mobile terminals 712 and 714 through a wired/wireless network 720. The network 720 may be the Internet, a campus/enterprise intranet, a wide area network (WAN), a local area network (LAN), or any other type of network or inter-network. In addition, this approach can be applied to networks that use any of a variety of communication techniques, including datagram-based networks (e.g., the Internet), connection-based networks (e.g., X.25), virtual-circuit-based networks (e.g., Asynchronous Transfer Mode (ATM)), etc.
[62] The server 730 receives the character entities and synthesizes them to create a new character entity. The server 730 then transmits the new character entity to a recipient such as the mobile terminal 716 through the network 720. Information on the recipient terminal 716, such as its address, may be provided to the server 730 by at least one of the mobile terminals 712 and 714.
[63] Although the server 730 is described as serving as an agent for synthesizing the character entities sent from the mobile terminals 712 and 714, the process for synthesizing the character entities may instead be executed in any one of the mobile terminals 712 to 716. In such a case, for example, the mobile terminal 712 may receive the character entity from the mobile terminal 714, e.g., through a wireless connection such as infrared or Bluetooth, and then synthesize the character entity of the mobile terminal 714 with the one stored in its own memory to produce a new character entity. After synthesizing the new character entity, the mobile terminal 712 stores it in its memory or sends it to another mobile terminal.
[64] Fig. 8 depicts a flowchart showing one embodiment of a method for synthesizing character entities. In operation 810, at least two mobile terminals 712 and 714 upload their character entities to a server 730. Upon receiving the character entities, the server 730 synthesizes the character entities to create a new character entity in operation 820.
[65] In synthesizing the character entities, the server 730 preferably uses a genetic combination method. Under this method, attributes in the emotion and physical models 230 and 240 of the character entity 200 may be treated as genes, each of which has a value indicating a single characteristic of the character entity 200, e.g., ranging from a minimum value to a maximum value. The genetic combination method computes a weighted sum of the genes obtained from at least two character entities to generate the genes of a new character entity. In this case, the weights may be adjusted by the user through a graphical user interface designed for character entity customization. Alternatively, the weights may be determined using a pseudorandom number generator. By employing such a genetic combination method, a variation of a previously generated character entity may be generated.
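Under the assumption that genes are bounded floats stored in a dictionary, the weighted-sum combination reduces to a few lines; the synthesize name, the [0, 1] gene range, and the clamping are illustrative choices, not fixed by the description.

```python
import random

def synthesize(parent_a: dict, parent_b: dict, weight=None) -> dict:
    """Operation 820: genetic combination of two character entities' models.

    Each shared attribute is treated as a gene; the child gene is a weighted
    sum of the parents' genes, clamped to the [0, 1] range assumed for this
    sketch. If no user-adjusted weight is given, a pseudorandom weight is
    drawn, as the description allows."""
    w = random.random() if weight is None else weight
    child = {}
    for gene in parent_a.keys() & parent_b.keys():
        child[gene] = max(0.0, min(1.0, w * parent_a[gene] + (1.0 - w) * parent_b[gene]))
    return child

# Example: combine two emotion models with a user-chosen weight of 0.6.
offspring = synthesize({"joy": 0.8, "calm": 0.2}, {"joy": 0.3, "calm": 0.9}, weight=0.6)
# offspring == {"joy": 0.6, "calm": 0.48}
```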
[66] Upon generating a new character entity, the server 730 transmits the created character entity to a recipient mobile terminal in operation 830. As mentioned with reference to Fig. 7, the process for synthesizing the character entities may alternatively be executed in one of the mobile terminals 712 and 714 holding the original character entities, or in the terminal 716, which may store the newly created character entity in its memory or transfer it to another mobile terminal.

Claims
[1] A personal communication device comprising: an input interface configured to receive user characteristic data of a user; and a memory configured to store the received user characteristic data; wherein the personal communication device is configured to generate a character entity corresponding to a state of the user, and wherein said generated character entity is based at least in part upon the received user characteristic data.
[2] The personal communication device of Claim 1, further comprising: a display unit configured to display a representation of the character entity.
[3] The personal communication device of Claim 1, wherein the user characteristic data include bio-signals of the user.
[4] The personal communication device of Claim 3, further comprising: a bio-sensor, wherein the input interface receives the bio-signals from the bio-sensor.
[5] The personal communication device of Claim 3, wherein the input interface receives bio-signals from a bio-sensor externally connected to the personal communication device.
[6] The personal communication device of Claim 3, wherein the bio-signals are selected from a group comprising voice signals, video signals, picture signals, taste signals, scent signals, motion signals, temperature signals, pulse signals, and ECG signals.
[7] The personal communication device of Claim 1, wherein the memory is a portable memory card.
[8] The personal communication device of Claim 1, wherein the character entity corresponds to an avatar.
[9] The personal communication device of Claim 8, wherein the personal communication device modifies the character entity based on a change in the user characteristic data, thereby modifying the avatar.
[10] The personal communication device of Claim 1, wherein the character entity includes information corresponding to one or more colors.
[11] The personal communication device of Claim 1, wherein at least a portion of the personal communication device is configured to change color based on a change in the user characteristic data.
[12] The personal communication device of Claim 1, further comprising a case configured to change color in response to a change in the user characteristic data.
[13] The personal communication device of Claim 12, wherein at least a portion of the case is made of an electro-luminescent material capable of changing colors.
[14] The personal communication device of Claim 1, wherein the user characteristic data include personal information, wherein the character entity is based at least in part upon the personal information.
[15] The personal communication device of Claim 14, wherein the personal information includes at least one of an address book, a schedule, and a pattern of telephonic communication.
[16] The personal communication device of Claim 1, wherein the personal communication device is a mobile phone.
[17] The personal communication device of Claim 1, wherein the personal communication device is a personal digital assistant (PDA).
[18] A system for synthesizing a character entity, comprising: an input device configured to receive character entities from at least two mobile terminals; and a server configured to synthesize a new character entity based on the received character entities.
[19] A system for synthesizing a character entity, comprising: a memory configured to store a character entity; an input device configured to receive another character entity from a mobile terminal; and a processing device configured to synthesize a new character entity based on the stored character entity and the received character entity.
[20] The system of Claim 18 or 19, wherein the server transmits the synthesized character entity to a recipient.
[21] The system of Claim 18 or 19, wherein each of the character entities comprises information representing a state of a corresponding user.
[22] The system of Claim 21, wherein each of the character entities is modeled based on the information representing said state of said corresponding user.
[23] A method for generating a character entity in a personal communication device, comprising: receiving a first set of user characteristic data of a user; generating a character entity corresponding to a state of the user, wherein said generated character entity is based at least in part on the received first set of user characteristic data; and storing the character entity in a memory of the personal communication device.
[24] The method of Claim 23, further comprising: receiving a second set of user characteristic data of the user; and updating the character entity based at least in part on the second set of user characteristic data.
[25] The method of Claim 23, wherein the operation of generating the character entity further includes creating a database including information corresponding to a current state of the user based on the received user characteristic data.
[26] The method of Claim 25, wherein the operation of generating the character entity further includes modeling the character entity based on the database.
[27] The method of Claim 23, wherein the character entity corresponds to an avatar.
[28] The method of Claim 24, wherein the character entity corresponds to an avatar that is capable of being modified based on a change in the user characteristic data.
[29] The method of Claim 23, further comprising: including color information as part of the character entity.
[30] The method of Claim 29, further comprising: changing the color information based on a change in the user characteristic data.
[31] The method of Claim 23, wherein the user characteristic data include at least one of bio-signals and personal information of the user.
[32] A method for synthesizing a character entity, comprising: uploading at least two character entities from at least two mobile terminals to a server connected to the mobile terminals; and synthesizing a new character entity based on the uploaded character entities.
[33] The method of Claim 32, further comprising: transmitting the synthesized character entity to a recipient designated by at least one of the mobile terminals.
[34] A mobile terminal capable of displaying a plurality of colors, comprising: a memory configured to receive and store user characteristic data of a user; and a processing module configured to receive and analyze the user characteristic data from the memory, wherein the processing module assigns a color to at least a portion of the mobile terminal in response to the received user characteristic data.
[35] The mobile terminal of Claim 34, further comprising a case configured to display a plurality of colors, wherein at least a portion of the case displays the assigned color.
[36] The mobile terminal of Claim 35, wherein the case is made of an electro-luminescent material.
[37] The mobile terminal of Claim 34, wherein the processing module changes the assigned color based on a change in the user characteristic data.
[38] The mobile terminal of Claim 34, wherein the user characteristic data include at least one of bio-signals and personal information of the user.
[39] A method for changing colors in a mobile terminal capable of displaying a plurality of colors, comprising: receiving user characteristic data of a user; assigning a color in response to the received user characteristic data; and displaying the assigned color on at least a portion of the mobile terminal.
[40] The method of Claim 39, wherein the mobile terminal comprises a case configured to display the plurality of colors, wherein at least a portion of the case displays the assigned color.
[41] The method of Claim 40, wherein the case is made of an electro-luminescent material.
[42] The method of Claim 39, further comprising: changing the assigned color based on a change in the user characteristic data from the user.
[43] The method of Claim 39, wherein the user characteristic data include at least one of bio-signals and personal information of the user.
PCT/KR2007/003252 2006-07-05 2007-07-04 System and method for managing a character entity in a mobile terminal WO2008004813A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020060062915A KR20080004196A (en) 2006-07-05 2006-07-05 System and method for managing a character entity in a mobile terminal
KR10-2006-0062915 2006-07-05

Publications (1)

Publication Number Publication Date
WO2008004813A1 true WO2008004813A1 (en) 2008-01-10

Family

ID=38894741

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2007/003252 WO2008004813A1 (en) 2006-07-05 2007-07-04 System and method for managing a character entity in a mobile terminal

Country Status (2)

Country Link
KR (1) KR20080004196A (en)
WO (1) WO2008004813A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9665563B2 (en) 2009-05-28 2017-05-30 Samsung Electronics Co., Ltd. Animation system and methods for generating animation based on text-based data and user information
KR101890717B1 (en) 2010-07-20 2018-08-23 삼성전자주식회사 Apparatus and method for operating virtual world using vital information

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20000055455A (en) * 1999-02-06 2000-09-05 전희국 Method of storing user data in cellular phones using short message service
KR20010003904A (en) * 1999-06-26 2001-01-15 조정남 Method for managing personal information using short message service in CDMA network
KR20010035423A (en) * 2001-02-13 2001-05-07 이가형 A letter transmitting and saves method of the mobile computing device
US20030014278A1 (en) * 2001-07-13 2003-01-16 Lg Electronics Inc. Method for managing personal information in a mobile communication system
US20030033533A1 (en) * 2001-08-10 2003-02-13 Meisel William S. Use of identification codes in the handling and management of communications

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014089515A1 (en) * 2012-12-07 2014-06-12 Intel Corporation Physiological cue processing
US9640218B2 (en) 2012-12-07 2017-05-02 Intel Corporation Physiological cue processing
CN105615901A (en) * 2014-11-06 2016-06-01 中国移动通信集团公司 Emotion monitoring method and system
WO2021023479A1 (en) * 2019-08-06 2021-02-11 Volkswagen Aktiengesellschaft Method for measuring health data of a vehicle occupant in a motor vehicle

Also Published As

Publication number Publication date
KR20080004196A (en) 2008-01-09

Similar Documents

Publication Publication Date Title
US11327556B2 (en) Information processing system, client terminal, information processing method, and recording medium
JP5497015B2 (en) Method and system for automatically updating avatar status to indicate user status
JP6462386B2 (en) Program, communication terminal and display method
RU2293445C2 (en) Method and device for imitation of upbringing in mobile terminal
US20210104087A1 (en) Avatar style transformation using neural networks
CN108874114B (en) Method and device for realizing emotion expression of virtual object, computer equipment and storage medium
EP3702914A2 (en) Mobile virtual and augmented reality system
JP6461630B2 (en) COMMUNICATION SYSTEM, COMMUNICATION DEVICE, PROGRAM, AND DISPLAY METHOD
CN108885498A (en) Electronic device and the in an electronic method of offer information
US20150155006A1 (en) Method, system, and computer-readable memory for rhythm visualization
CN107320114A (en) Shooting processing method, system and its equipment detected based on brain wave
CN107479781A (en) A kind of update method and terminal of application icon color
CN110300951A (en) Media item attachment system
EP2690847A1 (en) Virtual assistant for a telecommunication system
US11960792B2 (en) Communication assistance program, communication assistance method, communication assistance system, terminal device, and non-verbal expression program
WO2008004813A1 (en) System and method for managing a character entity in a mobile terminal
CN108697935A (en) Incarnation in virtual environment
CN110019743A (en) Information processing unit and the computer-readable medium for storing program
CN108141490A (en) For handling the electronic equipment of image and its control method
EP4147425A1 (en) Messaging system with a carousel of related entities
TW201003512A (en) User interface, device and method for displaying a stable screen view
CN109448069A (en) A kind of template generation method and mobile terminal
CN112870697B (en) Interaction method, device, equipment and medium based on virtual relation maintenance program
WO2016128862A1 (en) Sequence of contexts wearable
KR20210032159A (en) Human emotional expression tools on the basis of wireless communication system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07768600

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC OF 200409

122 Ep: pct application non-entry in european phase

Ref document number: 07768600

Country of ref document: EP

Kind code of ref document: A1