WO2007069361A1 - Information processing terminal - Google Patents

Information processing terminal

Info

Publication number
WO2007069361A1
WO2007069361A1 (PCT/JP2006/314521)
Authority
WO
WIPO (PCT)
Prior art keywords
call
emotion
processing terminal
information processing
information
Prior art date
Application number
PCT/JP2006/314521
Other languages
French (fr)
Japanese (ja)
Inventor
Ryouta Yoshida
Takaaki Nishi
Tetsurou Sugimoto
Tomoko Obama
Hideaki Matsuo
Junichi Oshiro
Original Assignee
Matsushita Electric Industrial Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co., Ltd. filed Critical Matsushita Electric Industrial Co., Ltd.
Publication of WO2007069361A1 publication Critical patent/WO2007069361A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/68Details of telephonic subscriber devices with means for recording information, e.g. telephone number during a conversation

Definitions

  • The present invention relates to an information processing terminal capable of identifying, from the character data described in an e-mail and from the voice during a call, the emotion of the e-mail's creator and the emotion of the speaker during the call.
  • Recent mobile phones, PDAs (Personal Digital Assistants), and other mobile devices can display call histories and e-mail histories (for example, Patent Document 1): an incoming call history (items arranged in chronological order by incoming time, holding at least the time of the incoming call and, where the caller's phone number and caller can be identified, the caller's name); an outgoing call history (items arranged in chronological order by outgoing time, holding at least the time of the call and, where the destination phone number and destination can be identified, the destination's name); a mail reception history (items arranged in chronological order by reception time, holding at least the time the mail was received and, where the sender's e-mail address and sender can be identified, the sender's name and the mail body); and a mail transmission history (items arranged in chronological order by transmission time, holding at least the transmission time and, where the recipient's e-mail address and recipient can be identified, the recipient's name and the mail body).
  • Patent Document 1: Japanese Patent Laid-Open No. 11-275209
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide an information processing terminal that can help the user recall the contents of past calls and of e-mail text without forcing the user to spend time and effort.
  • An information processing terminal of the present invention includes an input means for inputting emotion specifying information composed of at least character data or voice, an emotion specifying means for specifying an emotion based on the emotion specifying information input by the input means, and a display means for displaying information related to the emotion specified by the emotion specifying means.
  • The information processing terminal of the present invention also includes mail transmission/reception means for transmitting or receiving electronic mail, wherein the input means inputs the character data described in the electronic mail transmitted or received by the mail transmission/reception means, the emotion specifying means specifies an emotion based on that character data, and the display means displays information about the emotion corresponding to the electronic mail transmitted or received by the mail transmission/reception means.
  • The display means includes one that displays, for each e-mail in the order in which e-mails were transmitted or received by the mail transmission/reception means, at least the transmission or reception time, the destination or source, and information about the emotion.
  • The information processing terminal of the present invention includes one in which the display means displays, for each date, information related to the emotion corresponding to the e-mails transmitted or received on that date by the mail transmission/reception means.
  • The display means also includes one that displays, for each target person registered in the telephone directory function, information on the emotion corresponding to e-mails transmitted to, or received from, that target person by the mail transmission/reception means.
  • By notifying the mobile phone user of the user's own emotions and of the emotions of the other people with whom the user has communicated, an environment is provided in which the mobile phone user can more easily recall past communications.
  • The display means includes one that displays, for each of a plurality of e-mails transmitted or received by the mail transmission/reception means, any of the information related to the specified emotion.
  • With this configuration, the terminal user can confirm at a glance the emotion specified for each of a plurality of e-mails.
  • The information processing terminal includes a call means for making calls, wherein the input means inputs the call voice from the call means, the emotion specifying means specifies an emotion based on that call voice, and the display means includes one that displays information related to the emotion of the called party, the calling party, or both, for calls made by the call means.
  • The display means includes one that displays, in the order in which calls were made or received, information about the other party and the emotion.
  • The information processing terminal includes one in which the display means displays, for each date, information related to the emotion corresponding to calls made or received by the call means on that date.
  • The display means includes one that displays, for each target person registered in the telephone directory function, information on the emotion of that target person in calls made by the call means.
  • By notifying the mobile phone user of the user's own emotions and of the emotions of the other people with whom the user has communicated, an environment is provided in which the mobile phone user can more easily recall past communications.
  • The display means includes one that displays at least one piece of information related to the emotion specified for each of a plurality of calls made or received by the call means.
  • With this configuration, the terminal user can confirm at a glance the emotions specified for each of a plurality of calls.
  • The information processing terminal of the present invention includes a call storage means that stores the call voice from the call means, and a call reproduction means that reproduces the call voice stored in the call storage means.
  • The call reproduction means reproduces, among the call voices stored in the call storage means, the call voice from which the displayed information related to the emotion was specified.
  • The call storage means stores the portion of the call voice in which the emotion specified by the emotion specifying means appeared, and the call reproduction means reproduces at least that stored portion.
  • The information processing terminal includes one in which the call reproduction means reproduces that portion with a playback start time set a predetermined time before the start of the portion.
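The playback rule in the claim above (begin a predetermined time before the portion in which the emotion appeared) amounts to a small clamped subtraction. The following is an illustrative sketch only, not part of the patent; the function name and the 5-second pre-roll are assumptions:

```python
def playback_start(segment_start_s: float, pre_roll_s: float = 5.0) -> float:
    """Return the playback start time for an emotional segment.

    Playback begins a predetermined time (pre_roll_s) before the segment
    so the listener hears the lead-up to the emotion; the result is
    clamped so playback never starts before the recording itself.
    """
    return max(0.0, segment_start_s - pre_roll_s)
```

For example, a segment starting at 12 s would play from 7 s, while a segment starting at 3 s would play from the beginning of the recording.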
  • FIG. 1 is a configuration diagram of a mobile phone according to a first embodiment of the present invention.
  • FIG. 3 shows information stored in the emotion identification device in the mobile phone according to the first embodiment of the present invention.
  • FIG. 4 is a processing flow for displaying a call history by the mobile phone according to the first embodiment of the present invention.
  • FIG. 5 is a display example of a call history by the mobile phone according to the first embodiment of the present invention.
  • Figure 5 (a) shows the outgoing call history.
  • Figure 5 (b) shows the incoming call history.
  • Fig. 5(c) shows an example of an emotion mark displayed in the outgoing or incoming call history.
  • FIG. 6 shows a display example of a schedule book by the mobile phone according to the first embodiment of the present invention.
  • FIG. 7 shows an example of a telephone directory display by the mobile phone according to the first embodiment of the present invention.
  • FIG. 8 shows a sorting example for each emotion by the mobile phone according to the first embodiment of the present invention.
  • FIG. 9 is another configuration diagram of the mobile phone according to the first embodiment of the present invention.
  • FIG. 10 shows another example of the information list stored in the emotion identification device in the mobile phone according to the first embodiment of the present invention.
  • FIG. 11 shows a processing flow for displaying the mail reception history by the mobile phone of the first embodiment of the present invention.
  • FIG. 13 shows information stored in the emotion identification device in the mobile phone according to the second embodiment of the present invention.
  • FIG. 14 is a processing flow for displaying the call history of the mobile phone according to the second embodiment of the present invention.
  • FIG. 15 is a display example of call history by the mobile phone according to the second embodiment of the present invention.
  • FIG. 16 shows a display example of the schedule book by the mobile phone according to the second embodiment of the present invention.
  • FIG. 17 is a display example of a telephone directory by the mobile phone according to the first embodiment of the present invention.
  • FIG. 18 is a processing flow for recording voice of a call by the mobile phone according to the second embodiment of the present invention (Example 1).
  • FIG. 19 is a processing flow for recording a call voice by the mobile phone according to the second embodiment of the present invention (Example 2).
  • FIG. 1 shows a configuration diagram of the mobile phone according to the first embodiment of the present invention.
  • the mobile phone according to the first embodiment of the present invention includes a call device 10, an emotion identification device 20, a PIM (Personal Information Manager) application group 30, a display control unit 40, and a display unit 50.
  • A configuration having the various functions of a recent mobile communication terminal such as a mobile phone or PDA (Personal Digital Assistant) may also be used.
  • the call device 10 includes a communication unit 101, an audio signal output unit 102, a call speaker 103, and a call microphone 104.
  • the communication unit 101 performs mobile wireless communication with a mobile phone base station, and realizes a voice call by transmitting and receiving a voice signal between the mobile phone user of the present invention and another telephone user.
  • The audio signal output unit 102 outputs to the emotion identification device 20 the audio signal received from the other telephone through the communication unit 101, the audio signal of the mobile phone user picked up by the call microphone 104, or both. The audio signal output unit 102 also outputs the voice received from the other telephone to the call speaker 103, and outputs the call voice of the mobile phone user picked up by the call microphone 104 to the communication unit 101.
  • the emotion identification device 20 includes an emotion estimation unit 201, an emotion accumulation unit 202, an emotion identification unit 203, and an emotion information storage unit 204.
  • The emotion estimation unit 201 estimates the emotions of the other telephone user and of the mobile phone user from the volume, waveform, and pitch of the voice, or from the phonemes contained in the received and transmitted voice signals input from the call device 10 (such an emotion estimation method is proposed in, for example, International Publication No. WO00/62279).
  • The emotion estimation unit 201 expresses the degree of each emotion as a value of 0, 1, or 2 for each factor of the emotion information composed of love, joy, anger, sadness, and neutral (normal), as shown in the accompanying figure.
  • The emotion estimation unit 201 does not necessarily have to estimate both the emotion of the other telephone user and that of the mobile phone user. If it is set in advance to estimate the emotion of only one of them, only the corresponding signal, either the received voice signal or the transmitted voice signal, needs to be input from the audio signal output unit 102 of the call device 10.
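The estimation output described above, a degree of 0 to 2 per factor at each sampling point, can be pictured as a small data structure. This is an illustrative sketch only; the factor names follow the love/joy/anger/sadness/neutral set named above, while the layout and function names are assumptions:

```python
FACTORS = ("love", "joy", "anger", "sadness", "neutral")

def make_sample(**degrees: int) -> dict:
    """Build one estimation sample: degree 0-2 per factor, 0 if omitted."""
    sample = {f: 0 for f in FACTORS}
    for factor, degree in degrees.items():
        if factor not in FACTORS:
            raise ValueError(f"unknown factor: {factor}")
        if degree not in (0, 1, 2):
            raise ValueError("degree must be 0, 1, or 2")
        sample[factor] = degree
    return sample

# A call is then a time-ordered list of such samples, one list per
# voice signal (received and/or transmitted).
call_samples = [make_sample(joy=2), make_sample(joy=1, neutral=1)]
```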
  • The emotion accumulation unit 202 accumulates the numerical value for each factor input from the emotion estimation unit 201, in correspondence with the input time or order, separately for the received voice signal and the transmitted voice signal.
  • When the emotion accumulation unit 202 receives from the emotion estimation unit 201 a series of numerical values for each factor (the series here refers to the values from the start of input from the emotion estimation unit 201 to the emotion accumulation unit 202 until the end of input), the values for each factor in the series are accumulated as one set of data (hereinafter, such a set is referred to as the data for one call). When the emotion accumulation unit 202 accumulates the values separately for the received voice signal and the transmitted voice signal, one set of data for one call is produced for each of the two signals.
  • The emotion identification unit 203 reads the data for one call from the emotion accumulation unit 202, analyzes the numerical value for each factor, identifies one characteristic factor from the read values, and outputs the emotion represented by that one factor to the emotion information storage unit 204.
  • By taking the factor with the largest value in the data for one call as the characteristic emotion, the emotion identification unit 203 can identify the emotion with emphasis on the content that made the strongest impression during the call. Another method is to take as characteristic the factor whose values, summed from the start of the call to its end, are largest, identifying the emotion with emphasis on the content of the entire call. Alternatively, by selecting as the characteristic emotion the factor with the largest value immediately before the end of the call, the emotion can be identified with emphasis on the afterglow of the conversation.
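The three identification strategies just described (largest single value, largest sum over the whole call, largest value immediately before the end) can be sketched as follows, assuming each sample maps a factor name to its degree. All names are illustrative, not the patent's own:

```python
def strongest_moment(samples):
    """Factor whose single largest degree appears anywhere in the call
    (emphasizes the content that made the strongest impression)."""
    return max(
        ((f, max(s[f] for s in samples)) for f in samples[0]),
        key=lambda fv: fv[1],
    )[0]

def whole_call(samples):
    """Factor with the largest degree summed from call start to call end
    (emphasizes the content of the entire call)."""
    return max(samples[0], key=lambda f: sum(s[f] for s in samples))

def reverberation(samples):
    """Factor with the largest degree in the last sample before hang-up
    (emphasizes the afterglow of the conversation)."""
    last = samples[-1]
    return max(last, key=last.get)

# Example: joy peaks early, anger lingers until the end of the call.
samples = [{"joy": 2, "anger": 0}, {"joy": 1, "anger": 1}, {"joy": 0, "anger": 1}]
```

The three functions can disagree on the same call, which is exactly the design choice the paragraph above describes: `samples` here yields "joy" for the first two strategies but "anger" for the afterglow strategy.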
  • When the emotion identification unit 203 identifies one characteristic factor from each of the data for one call for the received voice signal and for the transmitted voice signal, one factor is output for each of the two signals.
  • The emotion information storage unit 204 receives the emotion represented by one factor from the emotion identification unit 203, and further receives from the communication unit 101 of the call device 10 information related to the call: whether the call was outgoing or incoming, the call start time and call end time, and identification information of the other telephone (for example, the other party's telephone number). The emotion information storage unit 204 stores the emotion input from the emotion identification unit 203 and the various information input from the communication unit 101 in association with each other; when the emotion identification unit 203 identifies one characteristic factor for each of the received and transmitted voice signals, it also stores which signal, received or transmitted, each emotion was identified from. That is, the emotion information storage unit 204 stores both the emotion identified from the received voice signal (i.e., the other party's emotion) and the emotion identified from the transmitted voice signal (i.e., the emotion of the mobile phone user of the present invention). If it is set in advance to estimate the emotion of only one of the other telephone user and the mobile phone user (as in the data for one call c), the other emotion is not stored (in FIG. 3, this is shown as "—").
  • The PIM application group 30 is composed of multiple applications for managing personal information and making that information available to the mobile phone user. Examples of PIM applications include a call history application 301 for displaying the call history, a scheduler application 302 for supporting the mobile phone user's schedule management, and a phone book application 303 for registering various personal information.
  • The display control unit 40 activates and executes one of the applications in the PIM application group, extracts the data necessary for the processing of the executed application from the emotion information storage unit 204, and causes the display unit 50 to display various information. Display examples of the display unit 50 when the call history application, scheduler application, or phone book application is started and executed will be described later.
  • The emotion identification device 20 accepts an input operation from the mobile phone user via operation keys (not shown) provided on the mobile phone of the present invention, and determines whose emotion should be specified: 1. the other party, 2. the mobile phone user himself or herself, or both (step S401).
  • When the call device 10 starts a call (step S402, YES), it outputs the received voice signal and the transmitted voice signal to the emotion identification device 20 (step S403).
  • For the target voice signal among the voice signals input from the call device 10, the emotion identification device 20 continuously estimates, at predetermined time intervals, the degree of each factor of the emotion information composed of love, joy, anger, sadness, and neutral (normal) (step S404).
  • The emotion identification device 20 analyzes the numerical values for each factor estimated during the call and identifies one characteristic factor from those values. The emotion represented by that one factor is stored as the emotion of the other party or of the mobile phone user for this call (step S406; at this time, as shown in FIG. 3, the emotion is stored together with the information related to the call).
  • When the display control unit 40 accepts, from the mobile phone user, an input operation via the operation keys provided on the mobile phone of the present invention and starts the call history application 301 (step S407, YES), it reads out, according to the program code of the call history application 301, the emotion specified for each call stored in the emotion identification device 20 together with the information related to the call, and displays that information on the display unit 50 in a predetermined display format (step S408).
  • FIG. 5 shows a display example of a call history by the mobile phone according to the first embodiment of the present invention.
  • Fig. 5(a) is the outgoing call history generated based on the information stored in the emotion identification device 20 shown in Fig. 3.
  • Fig. 5(b) is the incoming call history generated based on the same information.
  • Fig. 5(c) is an example of the marks indicating emotion displayed in the outgoing or incoming call history.
  • To display the outgoing call history shown in Fig. 5(a), the display control unit 40 extracts, from the information list stored in the emotion identification device 20 shown in Fig. 3, the data for one call whose item "outgoing/incoming" is "outgoing" (corresponding to the data for one call a) and whose item "signal source" is "received voice", i.e., the other party set in step S401 as the target of emotion identification. From that data, the items "call start time", "destination telephone number", and "emotion" are extracted and displayed in the corresponding "date and time", "name", and "emotion" columns of the call history in Fig. 5(a).
  • In this example, the emotion of the other party is displayed. Alternatively, the mobile phone user's own emotion may be displayed, or the terminal may be configured to display both the other party's emotion and the mobile phone user's own emotion at the same time.
  • To display the mobile phone user's own emotion, the emotion of the data whose item "signal source" is "transmitted voice" is extracted from the information list stored in the emotion identification device 20 shown in Fig. 3 and displayed.
  • To display the incoming call history shown in Fig. 5(b), the display control unit 40 likewise extracts, from the information list stored in the emotion identification device 20 shown in Fig. 3, the data for one call whose item "outgoing/incoming" is "incoming" (corresponding to the data for one call b and c) and whose item "signal source" is "received voice", i.e., the other party set in step S401 as the target of emotion identification. From that data, the items "call start time", "destination telephone number", and "emotion" are extracted and displayed in the corresponding "reception date", "name", and "emotion" columns of the incoming call history in Fig. 5(b). Note that the name "Jiro Matsushita" is displayed in the "name" field of the incoming call history in Fig. 5(b) instead of the telephone number "09000001111" for the same reason as described above.
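The extraction just described, filtering the stored list by the "outgoing/incoming" and "signal source" items and mapping the remaining items onto the history columns, can be sketched as follows. The record keys are illustrative English renderings of the items named above, not the patent's own data format:

```python
def call_history(records, direction, source="received voice"):
    """Filter stored per-call records for one history screen.

    direction: "outgoing" for the outgoing history (Fig. 5(a)),
               "incoming" for the incoming history (Fig. 5(b)).
    Each resulting row keeps the date/time, the other party's
    name, and the identified emotion, in the stored order.
    """
    return [
        {"date": r["call_start"], "name": r["party"], "emotion": r["emotion"]}
        for r in records
        if r["direction"] == direction and r["signal_source"] == source
    ]

# Example records loosely modeled on the Fig. 3 information list.
records = [
    {"direction": "outgoing", "signal_source": "received voice",
     "call_start": "2005-09-09 10:00", "party": "Taro Matsushita",
     "emotion": "sadness"},
    {"direction": "incoming", "signal_source": "received voice",
     "call_start": "2005-09-10 09:00", "party": "Jiro Matsushita",
     "emotion": "love"},
]
```

Passing `source="transmitted voice"` instead would select the mobile phone user's own emotion, per the alternative configuration described above.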
  • In this way, the call history is displayed on the mobile phone according to the first embodiment of the present invention.
  • FIG. 6 shows a display example of the schedule book by the mobile phone according to the first embodiment of the present invention.
  • Fig. 6(a) is the calendar displayed by the schedule book.
  • Fig. 6(b) is a display example showing an emotion for each date in the calendar.
  • Fig. 6(c) is a display example showing the transition of emotion on a specific date.
  • When the display control unit 40 accepts, from the mobile phone user, an input operation via the operation keys provided on the mobile phone of the present invention and starts the scheduler application 302 (step S407, YES), it first displays the calendar shown in Fig. 6(a). Furthermore, when the display control unit 40 receives from the mobile phone user an input operation instructing it to display the mobile phone user's own emotion for each date, it extracts, from the information list stored in the emotion identification device 20 shown in Fig. 3, the data for one call whose item "signal source" is "transmitted voice", i.e., the mobile phone user's own emotion as the target of emotion identification. In that information list, the emotion of the mobile phone user identified from the call on September 9, 2005 is "joy", and the emotion identified from the call on September 10, 2005 is "normal"; accordingly, the display control unit 40 displays a mark representing joy in the date column for September 9, 2005, and a mark representing a normal feeling for September 10, 2005. The same processing is performed for the other dates, and the emotion for each date in the calendar is displayed as shown in Fig. 6(b).
  • In the example above, only one emotion was identified for each of September 9, 2005 and September 10, 2005, since there was a single call on each date, so a mark representing that one emotion can simply be displayed for each date. However, the information list may contain multiple different emotions identified from multiple calls on the same date. In such a case, the display control unit 40 causes the display unit 50 to display, as the total emotion for that date, the emotion specified by the call with the newest call date and time, the emotion specified by the call with the longest call time, or the emotion that was identified most frequently among the calls on that date. In addition, although a configuration that displays the total emotion for a given day has been described here, the total emotion may also be displayed for a predetermined period, such as a week or a month.
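The per-date summarization rules above (newest call, longest call, or most frequently identified emotion) can be sketched as follows; the record layout and function name are assumptions for illustration:

```python
from collections import Counter

def total_emotion_for_date(calls, date, rule="most_frequent"):
    """Summarize one date's calls into a single total-emotion mark.

    rule: "most_frequent" - the emotion identified in the most calls,
          "newest"        - the emotion of the newest call on the date,
          "longest"       - the emotion of the longest call on the date.
    """
    day = [c for c in calls if c["start"].startswith(date)]
    if not day:
        return None  # no calls that day: nothing to mark on the calendar
    if rule == "newest":
        return max(day, key=lambda c: c["start"])["emotion"]
    if rule == "longest":
        return max(day, key=lambda c: c["duration_s"])["emotion"]
    return Counter(c["emotion"] for c in day).most_common(1)[0][0]

# Example: three calls on one date with differing emotions.
calls = [
    {"start": "2005-09-09 10:00", "duration_s": 60, "emotion": "joy"},
    {"start": "2005-09-09 12:00", "duration_s": 300, "emotion": "sadness"},
    {"start": "2005-09-09 13:00", "duration_s": 30, "emotion": "joy"},
]
```

Summarizing by week or month, as the paragraph above allows, would only change the date-matching predicate.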
  • Alternatively, the emotion specified for each call may be weighted to set a degree for that emotion, and the total emotion may be identified based on those degrees.
  • For example, the display control unit 40 weights the emotion specified for each call so that the newer the call date and time, or the longer the call time, the larger the numerical value indicating the degree of that emotion. The threshold is calculated by the formula shown in Equation 1.
  • The values of the "joy weight", "anger weight", "sadness weight", and "love weight" are made larger for emotions that are rarely expressed in daily life (e.g., "love") and smaller for emotions that are easily expressed, and the value of the "threshold weight" is the initial value of the threshold. For example, by increasing the "weight of love", the threshold calculated when "love" is included among the emotions identified in multiple calls is raised; as a result, the degree of "love" easily exceeds this threshold while the degrees of emotions other than "love" do not, so that "love" is readily identified as the total emotion. By setting the threshold in this way, even for an owner for whom emotions such as joy are specified in most calls, the total emotion can be expressed as love when love is specified in several calls, reducing emotional bias.
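Equation 1 itself does not appear in this text, so the sketch below uses a placeholder scheme that merely mirrors the behavior described above: each emotion's accumulated degree is scaled by a per-emotion weight (larger for rarely expressed emotions such as "love"), and an emotion becomes the total emotion only if its weighted degree clears a base threshold, otherwise "normal" is reported. All weights and values here are invented for illustration and are not the patent's Equation 1:

```python
# Larger weight for emotions rarely expressed in daily life ("love"),
# smaller for easily expressed ones -- values are purely illustrative.
EMOTION_WEIGHTS = {"love": 2.0, "joy": 0.8, "anger": 1.0, "sadness": 1.2}
BASE_THRESHOLD = 3.0  # the "threshold weight" initial value (assumed)

def overall_emotion(degree_sums):
    """Pick the overall emotion from per-emotion degree sums.

    Each sum is scaled by its weight so that rarely expressed emotions
    can still win; if no weighted degree exceeds the base threshold,
    the overall emotion falls back to "normal".
    """
    weighted = {e: d * EMOTION_WEIGHTS.get(e, 1.0)
                for e, d in degree_sums.items()}
    best = max(weighted, key=weighted.get)
    return best if weighted[best] > BASE_THRESHOLD else "normal"
```

With these made-up numbers, a love sum of 2 outweighs a joy sum of 4, illustrating how the weighting reduces bias toward easily expressed emotions.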
  • When the display control unit 40 also accepts an input operation in which a specific date is selected with the operation keys (in Fig. 6(b), September 2 is selected) and the display of the mobile phone user's emotions on that date is instructed, it extracts, from the information list stored in the emotion identification device 20 shown in Fig. 3, the data for one call whose item "call start time" falls on that specific date and whose item "signal source" is "transmitted voice", i.e., the mobile phone user's own emotions as the target of emotion identification. The display control unit 40 then refers to the items "call start time" and "emotion" in the extracted data for one call and displays the mobile phone user's emotions on that date in chronological order, for example as shown in Fig. 6(c).
  • As described above, the display control unit 40 may display marks representing the emotions of the mobile phone user as shown in Figs. 6(b) and 6(c). Similarly, when the display control unit 40 receives from the mobile phone user an input operation via the operation keys instructing it to display another person's emotion for each date (the other person being specified by a telephone number entered at the time of the input operation or, if the person to be identified is already registered in the phone book, by the telephone number registered there), it extracts from the information list stored in the emotion identification device 20 the other person's emotions, i.e., the data for one call whose item "signal source" is "received voice" and whose item "destination telephone number" matches the entered telephone number.
  • FIG. 7 shows a display example of the telephone directory by the mobile phone according to the first embodiment of the present invention.
  • Fig. 7(a) is a display example of the name and telephone number of each person registered in the phone book.
  • Fig. 7(b) is a display example of the emotion for each individual registered in the phone book.
  • Fig. 7(c) is a display example of an individual's emotional call status.
  • When the display control unit 40 accepts, from the mobile phone user, an input operation via the operation keys provided on the mobile phone of the present invention and starts the phone book application 303 (step S407, YES), it displays, for example as shown in Fig. 7(a), the names of the registered persons (here, those whose names begin with a given syllabary row) and their telephone numbers for each individual.
  • When the display control unit 40 accepts from the mobile phone user an input operation via the operation keys instructing it to display the emotions of the individuals being displayed, it extracts, from the information list stored in the emotion identification device 20 shown in Fig. 3, the data for one call of each person whose emotion is to be identified, i.e., the data whose item "signal source" is "received voice" and whose item "destination telephone number" matches the telephone number registered for that individual. In the information list stored in the emotion identification device 20 shown in Fig. 3, the telephone number "09000000000" matches the telephone number of the displayed name "Taro Matsushita", and the telephone number "09000001111" matches that of "Jiro Matsushita".
  • Accordingly, the display control unit 40 displays marks representing the extracted emotions "sadness" and "love" in the corresponding "Taro Matsushita" and "Jiro Matsushita" fields.
  • These processes are performed in the same way for the other entries, and for each individual registered in the phone book, the emotion obtained from past calls with that individual is displayed as shown in Fig. 7(b).
  • Among the emotions of a given individual, the display control unit 40 can also cause the display unit 50 to display, as the total emotion of that individual, the emotion specified by the call with the newest call date and time, the emotion specified by the call with the longest call time, or the emotion that was identified most frequently.
  • The display control unit 40 may also display the appearance frequency of the emotions specified in multiple calls with an individual, as in the display example of an individual's emotional call status in Fig. 7(c). When a specific individual displayed in the phone book is selected with the operation keys ("Taro Matsushita" is selected in Fig. 7(b)), the display shows how many times each emotion was identified out of the 25 calls with the selected individual, together with its percentage of all calls.
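The frequency display of Fig. 7(c), how many times and what share of all calls each emotion was identified for one individual, reduces to a frequency count. An illustrative sketch, assuming one identified emotion per call:

```python
from collections import Counter

def emotion_frequencies(emotions):
    """Map each identified emotion to (count, percentage of all calls)."""
    total = len(emotions)
    return {e: (n, round(100 * n / total))
            for e, n in Counter(emotions).items()}
```

For example, four calls identified as joy, joy, sadness, joy would be shown as joy 3 times (75%) and sadness once (25%).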
  • Alternatively, the emotion specified for each call with the individual may be weighted to set a degree for that emotion, and the total emotion may be identified from those degrees.
•  The display control unit 40 weights the emotion identified for each call with the individual so that the newer the call date and time, or the longer the call duration, the larger the numerical value indicating the degree of that emotion (for example, the number of seconds between the call date and time and the current date and time is used to derive the degree of the emotion, or the number of seconds of the call duration is used as the degree).
•  Among the numerical degrees of emotion (which may be the value assigned to the emotion identified for each call, or the sum of those values for each emotion), the emotion whose degree is the largest value exceeding a predetermined threshold is displayed; if no weighted emotion exceeds the threshold, "normal" is displayed as the overall emotion of the individual. With this configuration, the emotion whose identified degree is particularly high across all calls with a certain individual is displayed as that individual's overall emotion, improving the accuracy of identifying the overall emotion.
  • the threshold value is calculated by the formula shown in Equation 1.
•  In this way, simply by looking at the phone book, the mobile phone user can judge what kind of feelings accompanied past calls with a specific person, which helps the user recall past call partners without spending time and effort.
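As a rough illustration of the weighted overall-emotion selection described above, the following sketch sums a per-call degree for each emotion and falls back to "normal" when no emotion clears the threshold. The record shape and the constant threshold are assumptions; the source computes the threshold with Equation 1, which is not reproduced here, and the per-call degree here is simply the call duration in seconds.

```python
from collections import defaultdict

THRESHOLD = 120  # placeholder for the value the source computes with Equation 1

def overall_emotion(calls):
    """calls: list of (emotion, call_duration_seconds) for one individual."""
    degree = defaultdict(int)
    for emotion, seconds in calls:
        degree[emotion] += seconds          # sum the degrees per emotion
    if not degree:
        return "normal"
    best = max(degree, key=degree.get)      # emotion with the largest total degree
    # display it only if it exceeds the threshold; otherwise fall back to "normal"
    return best if degree[best] > THRESHOLD else "normal"

print(overall_emotion([("sorrow", 90), ("joy", 40), ("sorrow", 50)]))  # sorrow
print(overall_emotion([("joy", 30)]))                                  # normal
```

Weighting by recency instead of duration only changes how each call's degree is computed; the selection step stays the same.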
  • FIG. 8 shows an example of sorting for each emotion by the mobile phone according to the first embodiment of the present invention.
•  When the display control unit 40 receives, via the operation keys provided on the mobile phone of the present invention, an input operation from the mobile phone user instructing it to display a list of call partners whose emotion was "sorrow" in past calls, it extracts, from the information list stored in the emotion identification device 20, the data for one call in which the item "Signal source" is "Received voice" and the item "Emotion" is "Sorrow".
•  When a callee telephone number that has already been extracted is extracted again, the display control unit 40 increments the extraction count for that number. When the display control unit 40 has extracted all the corresponding telephone numbers in the information list, as shown in FIG. 8, the personal names corresponding to the telephone numbers (a name is identified when the telephone number is already registered in the phone book application 303) and the extraction counts are displayed in descending order of extraction count.
•  The total number of calls with each extracted callee telephone number and the ratio of the extraction count to that total are also displayed. By displaying such information, the mobile phone user can get a rough idea of what the other party felt in past calls with a specific person.
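The per-emotion sort in FIG. 8 can be sketched as follows: count, per call partner, how often the target emotion was identified, and report the count with its share of all calls with that partner. The record fields follow the items quoted in the text ("Signal source", "Emotion") but their encoding here is an assumption, and the data is illustrative only.

```python
from collections import Counter

records = [  # (callee phone number, signal source, identified emotion)
    ("09000000000", "Received voice", "Sorrow"),
    ("09000000000", "Received voice", "Joy"),
    ("09000000000", "Received voice", "Sorrow"),
    ("09000001111", "Received voice", "Sorrow"),
]

target = "Sorrow"
# extraction counts: calls where the other party's emotion matched the target
hits = Counter(num for num, src, emo in records
               if src == "Received voice" and emo == target)
# total call counts per partner, for the ratio column
totals = Counter(num for num, src, emo in records if src == "Received voice")

for number, count in hits.most_common():        # descending extraction count
    print(number, count, f"{count / totals[number]:.0%}")
```

`Counter.most_common()` already yields the descending order the display requires.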
  • FIG. 9 shows another configuration diagram of the mobile phone according to the first embodiment of the present invention.
  • the mobile phone according to the first embodiment of the present invention includes a PIM (Personal Information Manager) application group 30, a display unit 50, a communication device 60, an emotion identification device 70, and a display control unit 80.
•  Components with the same reference numerals as in FIG. 1 are as described above, so their description is omitted.
•  The communication device 60 includes a mail transmission/reception unit 601 and a character information output unit 602.
•  The mail transmission/reception unit 601 performs mobile wireless communication with the mobile radio base station, receives e-mail addressed to the mail address assigned to the mobile phone of the present invention, and transmits e-mail from the mobile phone of the present invention to an arbitrary mail address.
•  The character information output unit 602 outputs to the emotion identification device 70 at least part of the character data described in an e-mail received via the mail transmission/reception unit 601 or in an e-mail to be transmitted (the character string described in the e-mail's title or body).
•  The character information output unit 602 also outputs received e-mail data to the display control unit 80 for display, and receives from the display control unit 80 the e-mail data for transmission created on the display unit 50 by operating the operation keys (not shown) provided on the mobile phone of the present invention.
  • the emotion identification device 70 includes an emotion estimation unit 701, an emotion accumulation unit 702, an emotion identification unit 703, and an emotion information storage unit 704.
•  Emotion estimation unit 701 estimates, from the character data input from communication device 60, the emotion of the mail creator who wrote the character string.
•  For example, the emotion estimation unit 701 expresses the degree of each factor of emotion information, composed of love, joy, anger, sadness, and neutral (normal), as a value of 0, 1, or 2 (0: no emotion, 1: weak emotion, 2: strong emotion). It estimates this degree for each sentence from the beginning of the input character data (a character string consisting of at least one character, marks expressing an image, and so on; such a mark is called a pictogram), and sequentially outputs each estimated value to the emotion accumulation unit 702.
  • the emotion accumulating unit 702 accumulates the numerical values for each factor input from the emotion estimating unit 701 in association with the input order.
•  When the emotion accumulation unit 702 has received a series of numerical values for each factor from the emotion estimation unit 701 (the series here refers to the inputs from the start to the end of numerical input from the emotion estimation unit 701 to the emotion accumulation unit 702), it stores the numerical values for each factor in the series as one set of data (hereinafter, one set of data is referred to as data for one mail).
•  Emotion identification unit 703 reads the data for one mail from emotion accumulation unit 702, analyzes the numerical value for each factor, identifies one characteristic factor from the read numerical values, and outputs the emotion represented by that factor to the emotion information storage unit 704.
•  By taking as the characteristic emotion the factor with the largest numerical value in the data for one mail, the emotion identification unit 703 can identify the emotion with emphasis on the most strongly impressed content in the mail. Alternatively, by taking as the characteristic emotion the factor whose values summed from the beginning to the end of the mail are largest, it can identify the emotion with emphasis on the content of the whole mail. It can also identify the emotion with emphasis on the lingering impression of the mail by taking as the characteristic emotion the factor with the largest value at the end of the mail text.
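The three identification strategies above can be sketched compactly. The encoding is a hypothetical one the source does not specify: "data for one mail" is taken to be a list of per-sentence factor scores (0, 1, or 2), front to back, and ties resolve to the factor listed first.

```python
FACTORS = ["love", "joy", "anger", "sadness", "neutral"]

mail_data = [  # one dict of factor scores per sentence, illustrative only
    {"love": 0, "joy": 2, "anger": 0, "sadness": 0, "neutral": 1},
    {"love": 0, "joy": 0, "anger": 0, "sadness": 1, "neutral": 2},
    {"love": 0, "joy": 0, "anger": 0, "sadness": 2, "neutral": 0},
]

def strongest_impression(data):
    # factor with the single largest value anywhere in the mail
    return max(FACTORS, key=lambda f: max(s[f] for s in data))

def whole_mail(data):
    # factor with the largest sum from the beginning to the end of the mail
    return max(FACTORS, key=lambda f: sum(s[f] for s in data))

def reverberation(data):
    # factor with the largest value in the final sentence
    return max(FACTORS, key=lambda f: data[-1][f])

print(strongest_impression(mail_data))  # joy
print(whole_mail(mail_data))            # sadness
print(reverberation(mail_data))         # sadness
```

The same data can yield different characteristic emotions under each strategy, which is exactly the trade-off the passage describes.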
•  Emotion information storage unit 704 receives the emotion represented by one factor from emotion identification unit 703, and further receives from the mail transmission/reception unit 601 information on the mail transmission or reception performed by the mail transmission/reception unit 601 of communication device 60 (whether the mail was sent or received, the mail transmission or reception time, and the destination or sender's mail address).
  • the emotion information storage unit 704 stores emotions input from the emotion identification unit 703 and various information input from the mail transmission / reception unit 601 in association with each other.
•  FIG. 10 shows another example of the information list stored in the emotion identification device in the mobile phone according to the first embodiment of the present invention. In FIG. 10, the emotion information storage unit 704 stores, for each piece of data for one mail, the emotion identified from that mail (that is, for an outgoing mail, the emotion of the mobile phone user of the present invention who created the mail; for a received mail, the emotion of the mail partner who created it).
•  The display control unit 80 activates and executes one of the applications in the PIM application group, extracts the data necessary for processing by the executed application from the emotion information storage unit 704, and causes the display unit 50 to display various information. Display examples of the display unit 50 when the call history application, scheduler application, or phone book application is started and executed will be described later.
•  When communication device 60 starts mail transmission or mail reception (step S1101, YES), it outputs the character data described in the received e-mail or the e-mail for transmission to emotion identification device 70 (step S1102). The emotion identification device 70 estimates, for each factor of emotion information composed of love, joy, anger, sadness, and neutral (normal), the degree of emotion for each sentence or phrase from the beginning of the character data input from communication device 60 (step S1103).
•  When the input of character data from communication device 60 is finished (step S1104, YES), the emotion identification device 70 analyzes the numerical value for each factor estimated from the beginning to the end of the series of character data, identifies one characteristic factor, and stores the emotion represented by that factor as the mail creator's emotion (step S1105; at this time, as shown in FIG. 10, information on the mail is stored in association with the identified emotion).
•  When the display control unit 80 accepts an input operation by the mobile phone user via the operation keys provided on the mobile phone of the present invention, it starts the mail history application 304 (step S1106, YES) and, in accordance with the program code of the mail history application 304, reads the emotion identified for each mail stored in the emotion identification device 70 together with the information related to the mail, and displays the information on the display unit 50 in a predetermined display format (step S1107).
•  When multiple e-mails have been received on a single date, the display control unit 80 can cause the display unit 50 to display, as the overall emotion of that date or of the individual, the emotion identified from the mail with the newest reception date and time, the emotion identified from the mail with the largest number of characters in its body, or the emotion identified the most times among those mails.
•  The display control unit 80 may also weight the emotion identified for each mail so that the newer the reception date and time, or the larger the number of characters in the mail body, the larger the numerical value indicating the degree of that emotion (for example, the number of seconds between the mail reception date and time and the date and time at which the overall emotion is displayed is converted into the degree of the emotion, or the amount of data in the mail body is used as the degree of the emotion).
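The recency weighting described above might be sketched as follows. The field names, the look-back window, and the exact conversion from age to degree are assumptions; the point is only that a newer mail contributes a larger degree to its emotion before the maximum is taken.

```python
from datetime import datetime

mails = [  # (reception date and time, identified emotion) — illustrative only
    (datetime(2006, 9, 1, 10, 0), "sadness"),
    (datetime(2006, 9, 2, 10, 0), "joy"),
]

now = datetime(2006, 9, 2, 12, 0)  # the time the overall emotion is displayed

def weighted_overall(mails, now):
    degree = {}
    horizon = 7 * 24 * 3600  # look-back window in seconds (an assumption)
    for received, emotion in mails:
        age = (now - received).total_seconds()
        weight = max(horizon - age, 0)  # newer mail -> larger degree
        degree[emotion] = degree.get(emotion, 0) + weight
    return max(degree, key=degree.get)

print(weighted_overall(mails, now))  # joy — the newer mail dominates
```

Weighting by body length instead would replace `weight` with the character count of the mail body; the aggregation step is unchanged.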
•  As described above, with the mobile phone of the first embodiment of the present invention, it is possible to help the mobile phone user recall the contents of past calls and of e-mail bodies without forcing the user to spend time and effort.
•  That is, by notifying mobile phone users of their own emotions and of the emotions of the parties they have communicated with using their mobile phones, it is possible to provide an environment in which users can easily recall past communications.
  • FIG. 12 shows a configuration diagram of the mobile phone according to the second embodiment of the present invention.
•  The mobile phone according to the second embodiment of the present invention is composed of a call device 10, an emotion identification device 20, a PIM (Personal Information Manager) application group 30, a display control unit 40, a display unit 50, a playback control unit 90, and a speaker 100.
•  The configuration of the mobile phone according to the second embodiment of the present invention is the configuration of the first embodiment shown in FIG. 1 with a voice storage unit 105 added to the call device 10, plus the playback control unit 90 and the speaker 100.
•  The voice signal output unit 102 outputs to the emotion identification device 20 the voice signal received from the other telephone via the communication unit 101, the voice signal of the mobile phone user picked up by the call microphone 104, or both. It also outputs the voice received from the other telephone to the call speaker 103, and outputs the mobile phone user's call voice picked up by the call microphone 104 to the communication unit 101. In addition, the voice signal output unit 102 outputs to the voice storage unit 105 the voice signal received from the other telephone via the communication unit 101 and the voice signal of the mobile phone user picked up by the call microphone 104.
•  When the voice storage unit 105 of the call device 10 is notified by the communication unit 101 that a call with another telephone has started and receives a voice signal from the voice signal output unit 102, it starts recording the voice signal.
•  When the input of the voice signal from the voice signal output unit 102 ends, the voice storage unit 105 ends the recording of the voice signal and notifies the communication unit 101 that the recording is completed.
•  When notifying the voice storage unit 105 that a call with another telephone has started, the communication unit 101 also supplies identification information for identifying the call (the time at which the call started or ended is used to identify the call).
•  The voice storage unit 105 records the identification information and the recorded voice signal in association with each other (the voice storage unit 105 identifies each voice signal by this identification information).
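The voice storage unit's bookkeeping reduces to a keyed store: the call start time supplied by the communication unit serves as the identification information that links a recorded signal to its call, and the playback side looks the signal up by the same key. The class shape below is an assumption, not the patent's implementation.

```python
class VoiceStorage:
    """Minimal sketch of the voice storage unit 105's association of
    identification information (call start time) with recorded audio."""

    def __init__(self):
        self._recordings = {}   # call start time -> recorded samples
        self._current = None

    def start_recording(self, call_start_time):
        # the communication unit supplies the call start time at call setup
        self._current = call_start_time
        self._recordings[call_start_time] = []

    def feed(self, samples):
        # append incoming audio to the recording in progress
        self._recordings[self._current].extend(samples)

    def stop_recording(self):
        self._current = None    # recording ends; data stays under its key

    def lookup(self, call_start_time):
        # the playback control unit identifies the signal by this key
        return self._recordings.get(call_start_time)

store = VoiceStorage()
store.start_recording("2006-09-02 10:15")
store.feed([0.1, 0.2])
store.stop_recording()
print(store.lookup("2006-09-02 10:15"))  # [0.1, 0.2]
```

Using the call start time as the key is what lets the history, schedule, and phone book displays all resolve "Recorded voice" entries back to the same stored signal.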
•  Emotion information storage unit 204 receives from emotion identification unit 203 the emotion expressed by one factor, and further receives from the communication unit 101 information on the call made by the communication unit 101 of the call device 10 (whether the call was outgoing or incoming, the call start time and call end time, and identification information of the other telephone (for example, the other party's telephone number)), together with the presence or absence of voice data in which the call device 10 recorded the call.
•  The emotion information storage unit 204 stores the emotion input from the emotion identification unit 203, the various information input from the communication unit 101, and the presence or absence of recorded voice data in association with each other.
  • FIG. 13 shows a list of information stored in the emotion identifying device in the mobile phone according to the second embodiment of the present invention.
  • the information list stored in the emotion identifying device in the mobile phone according to the second embodiment of the present invention is obtained by adding the item “recorded data presence / absence” to the information list in FIG.
•  The playback control unit 90 reads, from among the voice signals stored in the voice storage unit 105, the voice signal specified by identification information (the identification information is entered by an input operation using the operation keys provided on the mobile phone) and outputs it to the speaker 100 to produce sound.
•  When the call device 10 starts a call (step S402, YES), it outputs the received voice signal and the transmitted voice signal to the emotion identification device 20, and further starts recording the voice signals (step S1401).
•  The emotion identification device 20 continuously estimates, at predetermined time intervals, the degree of emotion for each factor of emotion information composed of affection, joy, anger, sadness, and neutral (normal) for the voice signal, among the signals input from the call device 10, whose emotion is to be identified (step S404).
•  When the call ends, the emotion identification device 20 analyzes the numerical value for each factor estimated during the call, identifies one characteristic factor, and identifies the emotion expressed by that factor as the other party's or the mobile phone user's emotion in this call (step S1402; at this time, as shown in FIG. 13, the information on the call and the presence or absence of recorded voice data are stored in association with the identified emotion). Further, when the input of the voice signal is completed, the call device 10 ends the recording of the voice signal and records the recorded data in association with identification information for identifying it (step S1403).
•  When the display control unit 40 accepts an input operation by the mobile phone user using the operation keys provided on the mobile phone of the present invention, it starts the call history application 301 (step S407, YES) and, in accordance with the program code of the call history application 301, reads the emotion identified for each call stored in the emotion identification device 20, the information related to the call, and the presence or absence of recorded voice data, and displays the information on the display unit 50 in a predetermined display format (step S1404).
  • FIG. 15 shows a display example of a call history by the mobile phone according to the second embodiment of the present invention.
•  Fig. 15(a) is an outgoing call history generated based on the information stored in the emotion identification device 20 shown in Fig. 13, and Fig. 15(b) is an incoming call history generated based on the same information.
•  To display the outgoing call history in Fig. 15(a), the display control unit 40 extracts from the information list stored in the emotion identification device 20 shown in Fig. 13 the data for one call whose item "Outgoing/incoming" is "Outgoing" (here, the data a for one call is applicable) and in which the party whose emotion is to be identified is the other party, that is, whose item "Signal source" is "Received voice". From that data it extracts the items "Call start time", "Call destination phone number", "Emotion", and "Recorded data presence/absence", and displays them at the corresponding locations of the items "Date and time of outgoing call", "Name", "Emotion", and "Recorded voice" in the outgoing call history of Fig. 15(a).
•  To display the incoming call history in Fig. 15(b), the display control unit 40 likewise extracts from the information list stored in the emotion identification device 20 shown in Fig. 13 the data for one call whose item "Outgoing/incoming" is "Incoming" (here, the data b and c for one call are applicable) and in which the party whose emotion is to be identified, as set in step S401, is the other party, that is, whose item "Signal source" is "Received voice". From that data it extracts the items "Call start time", "Call destination phone number", "Emotion", and "Recorded data presence/absence", and displays them in the corresponding sections of the items "Date and time of incoming call", "Name", "Emotion", and "Recorded voice" in the incoming call history of Fig. 15(b).
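The two history views above are the same filter applied with a different direction. A small sketch, with field names modeled on the quoted items but otherwise assumptions, and purely illustrative data:

```python
records = [  # one dict per "data for one call"
    {"direction": "Outgoing", "source": "Received voice",
     "start": "9/1 18:00", "number": "09000000000",
     "emotion": "sorrow", "recorded": True},
    {"direction": "Incoming", "source": "Received voice",
     "start": "9/2 10:15", "number": "09000001111",
     "emotion": "love", "recorded": False},
]

def history(records, direction):
    # keep only calls in the requested direction where the OTHER party's
    # emotion was identified (signal source "Received voice")
    return [(r["start"], r["number"], r["emotion"],
             "Recorded voice" if r["recorded"] else "No sound")
            for r in records
            if r["direction"] == direction and r["source"] == "Received voice"]

print(history(records, "Outgoing"))   # rows for the Fig. 15(a) view
print(history(records, "Incoming"))   # rows for the Fig. 15(b) view
```

Resolving "number" to a display name would be a further lookup against the phone book application's registrations.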
  • FIG. 16 shows a display example of the schedule book by the mobile phone according to the second embodiment of the present invention.
•  Fig. 16(a) is a calendar displayed in the schedule book, Fig. 16(b) is a display example showing an emotion for each date in the calendar, and Fig. 16(c) is a display example showing the transition of emotions for a specific date.
•  FIGS. 16(a) and 16(b) are the same as FIGS. 6(a) and 6(b) of the first embodiment, and the processing by the display control unit 40 to display them is also the same as in the first embodiment, so their description is omitted.
•  When a specific date is selected using the operation keys (in Fig. 16(b), September 2 is selected) and an input operation instructs display of the transition of the mobile phone user's emotions on that date, the display control unit 40 extracts the data for one call whose item "Call start time" in the information list stored in the emotion identification device 20 shown in FIG. 13 falls on that specific date and in which the party whose emotion is to be identified is the mobile phone user himself or herself, that is, whose item "Signal source" is "Sent voice".
•  Referring to the items "Call start time", "Emotion", and "Recorded data presence/absence" in the extracted data for one call, the display control unit 40 displays, for example as shown in Fig. 16(c), the mobile phone user's own emotions on September 2 together with "Recorded voice present", in chronological order.
•  FIG. 17 shows a display example of the phone book by the mobile phone according to the second embodiment of the present invention.
  • Fig. 17 (a) is a display example of each person's name and phone number registered in the phone book
•  Fig. 17(b) is an example of the emotion display for each individual registered in the phone book.
•  When the display control unit 40 accepts an input operation by the mobile phone user using the operation keys provided on the mobile phone of the present invention, it starts the phone book application 303 (step S407, YES) and, as shown in Fig. 17(a), displays each registered individual's name and telephone number (here, the names beginning with the first row of the syllabary). Further, when the display control unit 40 receives from the mobile phone user an input operation instructing it to display the emotions of the individuals being displayed, it extracts, from the information list stored in the emotion identification device 20 shown in FIG. 13, the data for one call in which the party whose emotion is to be identified is the other party, that is, in which the item "Signal source" is "Received voice" and the item "Called phone number" matches the phone number registered for that individual.
•  In the information list, the telephone number "09000000000" matches the registered number of the displayed name "Taro Matsushita", and the telephone number "09000001111" matches that of "Jiro Matsushita".
•  The display control unit 40 displays marks representing the extracted emotions "sorrow" and "love" in the corresponding "Taro Matsushita" and "Jiro Matsushita" fields.
•  Further, when the extracted data for one call indicates that recorded data is present, the display control unit 40 displays "Recorded voice present" in the corresponding column, as shown in Fig. 17(b).
•  This section describes the process by which the mobile phone according to the second embodiment of the present invention plays back recorded voice after displaying the presence or absence of recorded voice as described in [Display format using call history], [Display format using schedule book], and [Display format using phone book].
•  The playback control unit 90 identifies, by referring to the emotion information storage unit 204, which data for one call the selected display location was generated from; it then reads, using the item "Call start time" in that data, the voice signal to be played back from among the voice signals stored in the voice storage unit 105 (the voice storage unit 105 stores the call start time as identification information in association with the recorded voice signal), and outputs the read voice signal to the speaker 100, so that the mobile phone outputs sound up to the end of the recorded voice.
•  As described above, with the mobile phone of the second embodiment of the present invention, when the mobile phone user wants to recall the content of a past call, the phone displays not only the previously identified emotion symbolizing that content but also, when the call voice was recorded, the presence or absence of the recorded data; by playing back the recorded data, the user can confirm the content and recall the call.
•  The mobile phone according to the second embodiment of the present invention need not record every call; it may be configured so that the mobile phone user records the voice signal during a call by performing a predetermined operation.
•  In this case, when the information about the call and the presence or absence of recorded voice data are displayed in a predetermined display format and there is no recorded voice data, "No sound" is displayed at the corresponding location, as shown for example in Fig. 15(b) and Fig. 17(b).
•  FIG. 18 shows a processing flow (Example 1) in which the mobile phone according to the second embodiment of the present invention records call voice.
•  When the call device 10 starts a call (step S402, YES), it outputs the received voice signal and the transmitted voice signal to the emotion identification device 20 (step S403).
•  The emotion identification device 20 continuously estimates, at predetermined time intervals, the degree of emotion for each factor of emotion information composed of affection, joy, anger, sadness, and neutral (normal) for the voice signal, among the signals input from the call device 10, whose emotion is to be identified (step S404). In the process of step S404, if at least one of the estimated emotion information factors has a numerical value exceeding the threshold (step S1801, YES), the call device 10 starts recording the received voice signal and the transmitted voice signal (step S1802; when the emotion estimation unit 201 determines that one of the estimated emotion information factors has a numerical value exceeding the threshold, it outputs to the voice storage unit 105 a control signal that starts recording of the received voice signal and the transmitted voice signal).
  • the emotion identification device 20 repeats these processes until the call by the call device 10 is completed (step S405, YES).
•  Suppose the emotion identification device 20 continuously estimates, from the start to the end of the call, the degree of each emotion information factor composed of affection, joy, anger, sadness, and neutral (normal) for the received voice signal, as shown in Fig. 2.
•  Since the joy factor was estimated to be 2 in the interval from 10 to 15 seconds after the start of the call, the call device 10 starts recording the call voice from this interval to the end of the call (first recorded voice); since the anger factor was estimated to be 2 in the interval from 45 to 50 seconds after the start of the call, it starts recording the call voice from this interval to the end of the call (second recorded voice); and since the sadness factor was estimated to be 2 in the interval from 50 to 55 seconds after the start of the call, it starts recording the call voice from this interval to the end of the call (third recorded voice).
•  When the call ends, the emotion identification device 20 analyzes the numerical value for each factor estimated during the call, identifies one characteristic factor from those values, and stores the emotion expressed by that factor as the other party's or the mobile phone user's emotion in this call (step S1402; at this time, as shown in FIG. 13, the information about the call and the presence or absence of recorded voice data are stored in association with the identified emotion).
•  Further, the call device 10 ends the recording of each call voice, and records, in association with identification information for identifying the recorded data, the recorded voice whose recording started after the factor identified in step S1402 was estimated to be 2 (for example, the first recorded voice if the factor identified in step S1402 is joy) (step S1803).
  • the communication device 10 may delete the second and third recorded voices after step S1803.
•  Alternatively, the call device 10 may keep the second and third recorded voices as second and third candidates representing the emotion of the call, and in step S1803 record them as well in association with identification information for identifying the recorded data (in this case, when displaying the presence or absence of recorded voice as described in [Display format using call history], [Display format using schedule book], and [Display format using phone book], the recorded voices may be displayed in order of priority, or only their number may be displayed).
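The selection step of Example 1 can be sketched as follows: several recordings are started when a factor first scores 2, and after the call only the recording whose trigger factor matches the identified emotion is kept, optionally retaining the others as second and third candidates. The data shapes are assumptions for illustration.

```python
recordings = [  # (trigger factor, seconds after call start when recording began)
    ("joy", 10),
    ("anger", 45),
    ("sadness", 50),
]

def keep_matching(recordings, identified_factor, keep_candidates=False):
    # keep the recording(s) triggered by the factor identified for the call
    kept = [r for r in recordings if r[0] == identified_factor]
    if keep_candidates:
        # variant: keep the others as 2nd/3rd candidates, matching one first
        kept += [r for r in recordings if r[0] != identified_factor]
    return kept

print(keep_matching(recordings, "joy"))
# the variant keeps all three, with the matching recording given priority
print(keep_matching(recordings, "joy", keep_candidates=True))
```

The `keep_candidates` variant corresponds to the configuration where the second and third recorded voices are retained and displayed in order of priority.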
  • FIG. 19 shows a processing flow (Example 2) for recording a call voice by the mobile phone according to the second embodiment of the present invention.
•  When the call device 10 starts a call (step S402, YES), it outputs the received voice signal and the transmitted voice signal to the emotion identification device 20, and further starts recording the voice signals (step S403).
•  The emotion identification device 20 continuously estimates, at predetermined time intervals, the degree of emotion for each factor of emotion information composed of affection, joy, anger, sadness, and neutral (normal) for the voice signal, among the signals input from the call device 10, whose emotion is to be identified (step S404). In the process of step S404, if at least one of the emotion information factors has a numerical value exceeding the threshold (step S1901, YES), the emotion identification device 20 stores the factor and the time at which the factor exceeded the threshold (step S1902; when the emotion estimation unit 201 determines that a factor of the emotion information has a numerical value exceeding the threshold, the emotion accumulation unit 202 stores the factor and the time at which it exceeded the threshold).
  • the emotion identification device 20 repeats these processes until the call by the call device 10 is completed (step S405, YES).
•  Suppose the emotion identification device 20 continuously estimates, from the start to the end of the call, the degree of each emotion information factor composed of affection, joy, anger, sadness, and neutral (normal) for the received voice signal, as shown in Fig. 2.
•  Since the joy factor was estimated to be 2 in the interval from 10 to 15 seconds after the start of the call, the joy factor and the time 10 seconds after the start of the call are stored (first tag information); since the anger factor was estimated to be 2 in the interval from 45 to 50 seconds, the anger factor and the time 45 seconds after the start of the call are stored (second tag information); and since the sadness factor was estimated to be 2 in the interval from 50 to 55 seconds, the sadness factor and the time 50 seconds after the start of the call are stored (third tag information).
•  When the call ends, the emotion identification device 20 analyzes the numerical value for each factor estimated during the call, identifies one characteristic factor from those values, stores the emotion expressed by that factor as the other party's or the mobile phone user's emotion in this call, and also stores the information (tag information) about the time at which the identified factor exceeded the threshold (step S1903; at this time, the tag information is stored in association with the information related to the call, the presence or absence of recorded voice data, and the identified emotion shown in FIG. 13). Further, when the input of the voice signal is completed, the call device 10 ends the recording of the voice signal and records the recorded data in association with identification information for identifying it (step S1403).
	• Playback is started from the audio location that can be identified from the tag information. Furthermore, by starting playback of the recorded voice from a location a predetermined time before the audio location identified from the tag information, the user can avoid missing the audio passage that was the main cause of identifying the emotion of the recorded speech.
	• According to the information processing terminal of the present invention, it is possible to assist the user in recalling the contents of past calls and mail bodies without forcing the user to spend time and effort. The invention is useful in the field of information processing terminals that can identify the emotion of the creator of an e-mail or the emotion of a speaker during a call from the character strings described in the e-mail or from the voice during the call.
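The interval-based tagging described in the steps above can be sketched as follows. This is a hypothetical illustration: the 5-second interval length, the threshold of 1, and the data structures are assumptions for the sketch, not details taken from the patent.

```python
# Hypothetical sketch of the tagging step (S1902): for each estimation
# interval, any emotion factor whose estimated degree exceeds the
# threshold is stored together with the interval's start time.
THRESHOLD = 1          # assumed: degrees are 0 (none), 1 (weak), 2 (strong)
INTERVAL_SEC = 5       # assumed estimation interval

def extract_tags(per_interval_scores):
    """per_interval_scores: list of dicts mapping factor name -> degree,
    one dict per interval, in call order. Returns (factor, start_time) tags."""
    tags = []
    for i, scores in enumerate(per_interval_scores):
        for factor, degree in scores.items():
            if degree > THRESHOLD:
                tags.append((factor, i * INTERVAL_SEC))
    return tags

# Reproducing the example from the text: joy=2 at 10-15 s, anger at
# 45-50 s, sadness=2 at 50-55 s yields three tags.
scores = [{} for _ in range(12)]
scores[2] = {"joy": 2}       # interval starting at 10 s
scores[9] = {"anger": 2}     # interval starting at 45 s
scores[10] = {"sadness": 2}  # interval starting at 50 s
print(extract_tags(scores))  # [('joy', 10), ('anger', 45), ('sadness', 50)]
```

Each tag pairs a factor with the start time of the interval in which it exceeded the threshold, matching the first, second, and third tag information of the example.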

Abstract

An information processing terminal that can assist the user in remembering his/her past conversation contents or message body contents without forcing the user to spend time and effort. The information processing terminal comprises a conversation device (10) for inputting feeling identification information composed of at least a voice, a feeling identification device (20) for identifying a feeling on the basis of the feeling identification information input to the conversation device (10), and a display section (50) for displaying information on the feeling identified by the feeling identification device (20).

Description

Specification

Information Processing Terminal

Technical Field

[0001] The present invention relates to an information processing terminal capable of identifying the emotion of the creator of an e-mail and the emotion of a speaker during a call from the character data described in the e-mail and from the voice during the call.
Background Art

[0002] Most recent mobile terminals, such as mobile phones and PDAs (Personal Digital Assistants), can display call histories and mail histories (for example, Patent Document 1). A call history comprises an incoming call history (items consisting of at least the time an incoming call arrived, the caller's telephone number and, when identifiable, the caller's name, arranged chronologically by arrival time) and an outgoing call history (items consisting of at least the time a call was made, the destination telephone number and, when identifiable, the destination's name, arranged chronologically by origination time). A mail history comprises a mail reception history (items consisting of at least the time a mail was received, the sender's mail address and, when identifiable, the sender's name, and the mail body, arranged chronologically by reception time) and a mail transmission history (items consisting of at least the time a mail was sent, the destination mail address and, when identifiable, the destination's name, and the mail body, arranged chronologically by transmission time).

Patent Document 1: Japanese Patent Laid-Open No. 11-275209
Disclosure of the Invention

Problems to be Solved by the Invention

[0003] However, it is difficult for a mobile terminal user to recall the contents of past calls or mail bodies simply by looking at the call history or mail history displayed by a conventional mobile terminal; indeed, it is difficult to remember even with what feelings those calls or mails were exchanged.

[0004] So that past call contents can be recalled, some conventional mobile terminals store the call voice. However, playing back the stored voice requires cumbersome operations, and the contents of a call often cannot be recalled unless the voice is played nearly to the end, so recalling past call contents takes time. To recall the contents of a mail body, since the mail body is attached to each item of the mail history, the user need only select the item in question and display its body; this, however, forces the user to perform a selection operation and to reread the mail text, imposing time and effort on the mobile terminal user.

[0005] The present invention has been made in view of the above circumstances, and an object thereof is to provide an information processing terminal that can assist the user in recalling the contents of past calls and mail bodies without forcing the user to spend time and effort.
Means for Solving the Problems

[0006] An information processing terminal according to the present invention comprises: input means for inputting emotion identification information composed of at least character data or voice; emotion identification means for identifying an emotion based on the emotion identification information input by the input means; and display means for displaying information on the emotion identified by the emotion identification means.

[0007] This configuration can assist the terminal user in recalling the contents of past communications carried out with the information processing terminal.
[0008] The information processing terminal of the present invention also includes one further comprising mail transmitting/receiving means for transmitting and receiving e-mail, wherein the input means inputs the character data described in an e-mail transmitted or received by the mail transmitting/receiving means, the emotion identification means identifies an emotion based on the character data input by the input means, and the display means displays information on the emotion, identified by the emotion identification means, corresponding to the e-mail transmitted or received by the mail transmitting/receiving means.

[0009] This configuration can assist the terminal user in recalling the contents of past mail bodies.
[0010] The information processing terminal of the present invention also includes one wherein the display means displays, for each e-mail in the order in which e-mails were transmitted or received by the mail transmitting/receiving means, at least the transmission or reception time, the destination or sender, and the information on the emotion.

[0011] The information processing terminal of the present invention also includes one wherein the display means displays, for each date, the information on the emotion corresponding to the e-mails transmitted or received on that date by the mail transmitting/receiving means.

[0012] The information processing terminal of the present invention also includes one wherein the display means displays, for each person registered in the telephone directory function, the information on the emotion corresponding to the e-mails transmitted to or received from that person by the mail transmitting/receiving means.
[0013] With this configuration, by notifying the mobile phone user of the user's own emotions and the emotions of the people the user communicated with via the mobile phone, in conjunction with the various PIM applications that provide functions such as the mail transmission/reception history, schedule management, and telephone directory functions, an environment can be provided in which the mobile phone user can more easily recall past communications.
[0014] The information processing terminal of the present invention also includes one wherein the display means displays any one piece of the information on the emotions identified for each of a plurality of e-mails transmitted or received by the mail transmitting/receiving means.

[0015] With this configuration, the terminal user can confirm at a glance the emotions identified for each of a plurality of mails.
[0016] The information processing terminal of the present invention also includes one further comprising call means for conducting calls, wherein the input means inputs the call voice from the call means, the emotion identification means identifies an emotion based on the call voice input by the input means, and the display means displays information on the emotion, identified by the emotion identification means, of the called party, the caller, or both parties of a call conducted by the call means.

[0017] This configuration can assist the terminal user in recalling the contents of past calls.
[0018] The information processing terminal of the present invention also includes one wherein the display means displays, for each call in the order in which calls were originated or received by the call means, at least the origination or arrival time, the called party or caller, and the information on the emotion.

[0019] The information processing terminal of the present invention also includes one wherein the display means displays, for each date, the information on the emotion corresponding to the calls originated or received on that date by the call means.

[0020] The information processing terminal of the present invention also includes one wherein the display means displays, for each person registered in the telephone directory function, the information on the emotion of that person during calls conducted by the call means.
[0021] With this configuration, by notifying the mobile phone user of the user's own emotions and the emotions of the people the user communicated with via the mobile phone, in conjunction with the various PIM applications that provide functions such as the mail transmission/reception history, schedule management, and telephone directory functions, an environment can be provided in which the mobile phone user can more easily recall past communications.
[0022] The information processing terminal of the present invention also includes one wherein the display means displays any one piece of the information on the emotions identified for each of a plurality of calls originated or received by the call means.

[0023] With this configuration, the terminal user can confirm at a glance the emotions identified for each of a plurality of calls.
[0024] The information processing terminal of the present invention also includes one further comprising call storage means for storing the call voice from the call means, and call playback means for playing back the call voice stored in the call storage means, wherein the call playback means plays back, among the call voices stored in the call storage means, the call voice from which the information on the emotion currently displayed by the display means was identified.

[0025] With this configuration, when the terminal user wants to recall the contents of a past call, in addition to displaying the previously identified emotion symbolizing those contents, the terminal records the call voice and plays back the recorded data. The mobile phone user can thereby reliably recall the contents.
[0026] The information processing terminal of the present invention also includes one wherein the call storage means stores the locations, within the call voice from the call means, at which the emotion identified by the emotion identification means is reflected, and the call playback means plays back at least those locations of the calls stored in the call storage means.

[0027] With this configuration, among the recorded data, the data locations that can remind the terminal user of past conversation contents can be played back effectively. The mobile phone user can thereby minimize the time and effort spent recalling the contents of past calls.
[0028] The information processing terminal of the present invention also includes one wherein the call playback means plays back the location using a point a preset time before the start of that location as the playback start point.

[0029] With this configuration, by starting playback of the recorded voice a predetermined time before the audio location in which the emotion is reflected, the user will not miss the audio passage that was the main cause of identifying the emotion of the recorded voice.
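The playback rule of [0028] and [0029] — start a predetermined time before the tagged audio location, without running past the beginning of the recording — might look like the following sketch, where the 3-second lead time is an assumed value, not one specified by the patent:

```python
# Hypothetical sketch: compute where playback of a recorded call should
# begin so that the passage which triggered the emotion tag is not missed.
LEAD_SEC = 3  # assumed "predetermined time" before the tagged location

def playback_start(tag_time_sec, lead_sec=LEAD_SEC):
    """Start playback lead_sec before the tagged audio location,
    but never before the beginning of the recording."""
    return max(0, tag_time_sec - lead_sec)

print(playback_start(45))  # 42: starts 3 s before a tag at 45 s
print(playback_start(1))   # 0: clamped to the start of the recording
```

Clamping to zero handles tags that occur within the first few seconds of a call, where a full lead time is not available.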
Effects of the Invention

[0030] According to the information processing terminal of the present invention, it is possible to assist the user in recalling the contents of past calls and mail bodies without forcing the user to spend time and effort.

Brief Description of the Drawings
[0031]
[FIG. 1] Configuration diagram of the mobile phone according to the first embodiment of the present invention
[FIG. 2] Factors of emotion information consisting of affection, joy, anger, sadness, and neutral, and the degree of emotion for each factor
[FIG. 3] List of information stored by the emotion identification device in the mobile phone according to the first embodiment of the present invention
[FIG. 4] Processing flow for displaying a call history on the mobile phone according to the first embodiment of the present invention
[FIG. 5] Display examples of a call history on the mobile phone according to the first embodiment of the present invention; FIG. 5(a) shows an outgoing call history, FIG. 5(b) shows an incoming call history, and FIG. 5(c) shows examples of marks indicating emotions displayed in the outgoing or incoming call history
[FIG. 6] Display example of the schedule book on the mobile phone according to the first embodiment of the present invention
[FIG. 7] Display example of the telephone directory on the mobile phone according to the first embodiment of the present invention
[FIG. 8] Example of sorting by emotion on the mobile phone according to the first embodiment of the present invention
[FIG. 9] Another configuration diagram of the mobile phone according to the first embodiment of the present invention
[FIG. 10] Another example of the list of information stored by the emotion identification device in the mobile phone according to the first embodiment of the present invention
[FIG. 11] Processing flow for displaying a mail reception history on the mobile phone according to the first embodiment of the present invention
[FIG. 12] Configuration diagram of the mobile phone according to the second embodiment of the present invention
[FIG. 13] List of information stored by the emotion identification device in the mobile phone according to the second embodiment of the present invention
[FIG. 14] Processing flow for displaying a call history on the mobile phone according to the second embodiment of the present invention
[FIG. 15] Display example of a call history on the mobile phone according to the second embodiment of the present invention
[FIG. 16] Display example of the schedule book on the mobile phone according to the second embodiment of the present invention
[FIG. 17] Display example of the telephone directory on the mobile phone according to the first embodiment of the present invention
[FIG. 18] Processing flow for recording call voice on the mobile phone according to the second embodiment of the present invention (Example 1)
[FIG. 19] Processing flow for recording call voice on the mobile phone according to the second embodiment of the present invention (Example 2)
Explanation of Reference Numerals

10 Call device
101 Communication unit
102 Audio signal output unit
103 Call speaker
104 Call microphone
105 Voice storage unit
20, 70 Emotion identification device
201, 701 Emotion estimation unit
202, 702 Emotion accumulation unit
203, 703 Emotion identification unit
204, 704 Emotion information storage unit
30 PIM application group
301 Call history application
302 Scheduler application
303 Phone book application
304 Mail history application
40, 80 Display control unit
50 Display unit
60 Communication device
601 Mail transmission/reception unit
602 Character information output unit
90 Playback control unit
100 Speaker
Best Mode for Carrying Out the Invention
[0033] Hereinafter, an embodiment of an information processing terminal implementing the present invention will be described in detail, taking a mobile phone as an example.
[0034] (First Embodiment)

FIG. 1 shows a configuration diagram of the mobile phone according to the first embodiment of the present invention. This mobile phone comprises a call device 10, an emotion identification device 20, a PIM (Personal Information Manager) application group 30, a display control unit 40, and a display unit 50. The terminal may additionally have the various functions of recent mobile communication terminals such as mobile phones and PDAs (Personal Digital Assistants).
[0035] The call device 10 comprises a communication unit 101, an audio signal output unit 102, a call speaker 103, and a call microphone 104. The communication unit 101 performs mobile wireless communication with mobile phone base stations and realizes voice calls by transmitting and receiving audio signals between the user of this mobile phone and users of other telephones. The audio signal output unit 102 outputs to the emotion identification device 20 the audio signal received from another telephone via the communication unit 101, the audio signal of the mobile phone user picked up by the call microphone 104, or both. The audio signal output unit 102 also outputs the voice received from the other telephone to the call speaker 103, and outputs the call voice of the mobile phone user picked up by the call microphone 104 to the communication unit 101.
[0036] The emotion identification device 20 comprises an emotion estimation unit 201, an emotion accumulation unit 202, an emotion identification unit 203, and an emotion information storage unit 204. From information contained in the received voice signal and the transmitted voice signal input from the call device 10, such as the voice volume, waveform, pitch, and phonemes, the emotion estimation unit 201 estimates the emotion of the other telephone user who produced the voice and the emotion of the mobile phone user (such an emotion estimation method is proposed in, for example, International Publication No. WO00/62279). For each factor of the emotion information consisting of affection, joy, anger, sadness, and neutral (normal) shown in FIG. 2, the emotion estimation unit 201 continuously estimates, at predetermined time intervals from the start to the end of the call and separately for the received voice signal and the transmitted voice signal, the degree of that emotion expressed as a value of 0, 1, or 2 (0: no emotion, 1: weak emotion, 2: strong emotion), and sequentially outputs each estimated value to the emotion accumulation unit 202. Note that the emotion estimation unit 201 does not necessarily need to estimate both the emotion of the other telephone user and the emotion of the mobile phone user; when it is set in advance to estimate the emotion of only one of them, it suffices to input either the received voice signal or the transmitted voice signal from the audio signal output unit 102 of the call device 10.
[0037] The emotion accumulation unit 202 accumulates the per-factor values input from the emotion estimation unit 201, separately for the received voice signal and the transmitted voice signal, in association with the time or order in which they were input. When the emotion accumulation unit 202 has received a complete series of per-factor values from the emotion estimation unit 201 (here, "series" means from the start to the end of the input of values from the emotion estimation unit 201 to the emotion accumulation unit 202, which in most cases corresponds to the period from the start to the end of one call by the call device 10), it stores that series of per-factor values as one block of data (hereinafter, such a block is referred to as one call's worth of data). Note that when the emotion accumulation unit 202 has accumulated the per-factor values separately for the received and transmitted voice signals, one call's worth of data is stored once for each of the two signals.
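The accumulation behaviour of [0037] can be modelled roughly as follows. This is a hypothetical Python sketch: the class and method names are invented for illustration, and the dict-per-interval layout is an assumption.

```python
# Hypothetical sketch of the emotion accumulation unit: per-factor degrees
# arriving from the estimator are appended, keyed by signal direction, and
# closed into "one call's worth of data" when the call ends.
class EmotionAccumulator:
    def __init__(self):
        self.current = {"received": [], "transmitted": []}

    def add(self, direction, scores):
        """scores: dict mapping factor name -> degree for one interval."""
        self.current[direction].append(scores)

    def close_call(self):
        """Call ended: return the accumulated one-call data and reset."""
        call_data = self.current
        self.current = {"received": [], "transmitted": []}
        return call_data

acc = EmotionAccumulator()
acc.add("received", {"joy": 2, "neutral": 0})
acc.add("received", {"joy": 1, "neutral": 1})
data = acc.close_call()
print(len(data["received"]))  # 2
```

Keeping the received and transmitted directions separate mirrors the point that one call's worth of data is stored once per voice signal.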
[0038] The emotion identification unit 203 reads one call's worth of data from the emotion accumulation unit 202, analyzes the values of each factor, specifies one characteristic factor from those values, and outputs the emotion represented by that factor to the emotion information storage unit 204. When specifying the single characteristic emotion, the emotion identification unit 203 can emphasize the content that left a strong impression during the call by taking as the characteristic emotion the factor whose value was largest in the one call's worth of data. As an alternative, it can emphasize the content of the call as a whole by taking as the characteristic emotion the factor whose values, summed from the start to the end of the call, are largest; or it can emphasize the lingering impression of the conversation by taking as the characteristic emotion the factor whose value was largest just before the end of the call. Note that when the emotion identification unit 203 specifies one characteristic factor from each of the one-call data sets for the received voice signal and the transmitted voice signal, it also outputs, together with each factor, whether it was identified from the received or the transmitted voice signal.
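The three selection strategies of [0038] — the peak value, the whole-call sum, and the value near the end of the call — can be sketched as below. This is hypothetical Python for illustration; tie-breaking behaviour is an implementation choice the patent does not specify.

```python
# Hypothetical sketches of the three strategies for picking the single
# characteristic factor of one call from per-interval factor degrees.
def by_peak(intervals):
    """Strongest single moment: factor with the highest degree anywhere."""
    best = max(((d, f) for iv in intervals for f, d in iv.items()))
    return best[1]

def by_total(intervals):
    """Whole-call emphasis: factor with the largest summed degree."""
    totals = {}
    for iv in intervals:
        for f, d in iv.items():
            totals[f] = totals.get(f, 0) + d
    return max(totals, key=totals.get)

def by_ending(intervals):
    """Lingering impression: dominant factor in the final interval."""
    last = intervals[-1]
    return max(last, key=last.get)

# Joy spikes once early; anger is weaker but persists to the end.
intervals = [{"joy": 2, "anger": 1}, {"joy": 0, "anger": 1}, {"joy": 0, "anger": 1}]
print(by_peak(intervals), by_total(intervals), by_ending(intervals))  # joy anger anger
```

The example shows how the three rules can disagree on the same call: the peak rule favors the brief burst of joy, while the sum and end-of-call rules favor the persistent anger.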
[0039] The emotion information storage unit 204 receives from the emotion identification unit 203 the emotion represented by the single identified factor, and also receives from the communication unit 101 information related to the call made by the communication unit 101 of the call device 10 (whether the call was outgoing or incoming, the call start and end times, and identification information of the other telephone, such as the called party's telephone number). The emotion information storage unit 204 stores the emotion input from the emotion identification unit 203 in association with the various information input from the communication unit 101 (when the emotion identification unit 203 has identified one characteristic factor from each of the one-call data sets for the received and transmitted voice signals, whether each factor came from the received or the transmitted voice signal is also stored). FIG. 3 shows a list of the information stored by the emotion identification device in the mobile phone according to the first embodiment of the present invention. In FIG. 3, for each set of one-call data, the emotion information storage unit 204 stores both the emotion identified from the received voice signal (that is, the emotion of the other party) and the emotion identified from the transmitted voice signal (that is, the emotion of the user of this mobile phone). When the terminal has been set in advance to estimate the emotion of only one of the other telephone user or the mobile phone user (one-call data c), the emotion of the party not set to be estimated is not stored (in FIG. 3, this is indicated by a dash).
[0040] The PIM application group 30 consists of a plurality of applications for managing personal information and letting the mobile phone user make use of that information. Examples of PIM applications include a call history application 301 for displaying the call history, a scheduler application 302 for supporting schedule management by the mobile phone user, and a phone book application 303 for registering various personal information.
[0041] 表示制御部40は、PIMアプリケーション群のうちのいずれかのアプリケーションを起動・実行し、実行したアプリケーションの処理に必要なデータを感情情報記憶部204から抽出して、表示部50に各種情報を表示させる。通話履歴アプリケーション、スケジューラアプリケーション、または電話帳アプリケーションを起動・実行した場合の表示部50の表示例は、後述する。 [0041] The display control unit 40 starts and executes one of the applications in the PIM application group, extracts the data needed by that application from the emotion information storage unit 204, and has the display unit 50 display the various items of information. Display examples of the display unit 50 when the call history application, the scheduler application, or the phone book application is started and executed are described later.
[0042] 次に、本発明の第 1実施形態の携帯電話による処理を、図 4に示す本発明の第 1 実施形態の携帯電話による通話履歴を表示する処理フローを参照して、説明する。  Next, processing by the mobile phone of the first embodiment of the present invention will be described with reference to a processing flow for displaying a call history by the mobile phone of the first embodiment of the present invention shown in FIG.
[0043] ます、感情特定装置 20は、本発明の携帯電話に備わる操作キー(図示せず)によ る入力操作を携帯電話利用者から受け付けることによって、感情を特定すべき対象 ( 1.通話相手、 2.携帯電話利用者自身、または 3.その両者)を決定する (ステップ S 401)。  [0043] In addition, the emotion identification device 20 accepts an input operation by an operation key (not shown) provided in the mobile phone of the present invention from a mobile phone user, and an emotion should be specified (1. The other party, 2. the mobile phone user himself, or both, are determined (step S401).
[0044] 通話装置 10は、通話を開始すると (ステップ S402、 YES)、感情特定装置 20に受 話音声信号および送話音声信号を出力する (ステップ S403)。感情特定装置 20は、 通話装置 10から入力した音声信号のうちの感情を特定すべき対象の音声信号につ いて、愛情、喜び、怒り、哀しみ、ニュートラル (通常)から構成される感情情報の因子 毎に、その感情の度合 、を継続的に所定の時間間隔で推定する (ステップ S404)。 感情特定装置 20は、通話装置 10による通話が終了すると (ステップ S405、 YES) , その 1通話の間に推定した各々の因子毎の数値を解析し、その数値力も特徴的な因 子を 1つ特定し、その 1つの因子により表される感情をこの通話における通話相手ま たは携帯電話利用者の感情として記憶する (ステップ S406。このとき、図 3に示すよう に、通話に関する情報を上記特定した感情と共に記憶する)。  When the call device 10 starts a call (step S402, YES), the call device 10 outputs a reception voice signal and a transmission voice signal to the emotion identification device 20 (step S403). The emotion identifying device 20 is a factor of emotion information composed of affection, joy, anger, sadness, and neutral (normal) for the target speech signal of the speech signal input from the communication device 10. Each time, the degree of emotion is continuously estimated at predetermined time intervals (step S404). When the call by the call device 10 ends (YES in step S405), the emotion identification device 20 analyzes the numerical value for each factor estimated during the call, and the numerical force has one characteristic factor. The emotion expressed by the one factor is stored as the emotion of the other party or mobile phone user in this call (step S406. At this time, as shown in FIG. I remember with the feelings I made).
[0045] その後、表示制御部40は、本発明の携帯電話に備わる操作キーによる入力操作を携帯電話利用者から受け付けることによって、通話履歴アプリケーション301を起動すると(ステップS407、YES)、通話履歴アプリケーション301のプログラムコードに従って、感情特定装置20に記憶されている各通話毎に特定された感情とその通話に関する情報とを読み出し、表示部50にこれらの情報を所定の表示形式で表示させる(ステップS408)。 [0045] After that, when the display control unit 40 starts the call history application 301 in response to an input operation from the mobile phone user via the operation keys of the mobile phone of the present invention (step S407, YES), it reads out, in accordance with the program code of the call history application 301, the emotion identified for each call and the information about that call stored in the emotion identification device 20, and has the display unit 50 display this information in a predetermined display format (step S408).
[0046] [通話履歴を利用した表示形式] [0046] [Display format using call history]
図5に、本発明の第1実施形態の携帯電話による通話履歴の表示例を示す。図5(a)は、図3に示す感情特定装置20が記憶する情報を元に生成した発信履歴であり、図5(b)は、図3に示す感情特定装置20が記憶する情報を元に生成した着信履歴である。なお、図5(c)は、発信履歴または着信履歴に表示する、感情を示すマークの一例である。 Fig. 5 shows display examples of the call history on the mobile phone of the first embodiment of the present invention. Fig. 5(a) is an outgoing call history generated from the information stored by the emotion identification device 20 shown in Fig. 3, and Fig. 5(b) is an incoming call history generated from the same information. Fig. 5(c) shows examples of the marks that indicate emotions in the outgoing and incoming call histories.
[0047] 表示制御部 30は、図 5 (a)に示す発信履歴を表示するために、図 3に示す感情特 定装置 20が記憶する情報一覧のうちの、項目「発着信」が「発信」のもの(1通話分の データ aが該当)で、さらに、ステップ S401で設定された感情を特定すべき対象であ る通話相手、すなわち項目「信号源」が「受話音声」である 1通話分のデータから、項 目「通話開始時刻」、「通話先電話番号」、「感情」のデータを抽出し、図 5 (a)の発信 履歴の項目「発信日時」、「名称」、「感情」の該当箇所に表示させる。なお、図 5 (a) の発信履歴の項目「名称」には、通話先電話番号「09000000000」ではなく名称「 松下太郎」が表示されて!、るが、これは既にこの電話番号が「松下太郎」の個人情報 として電話帳アプリケーション 303に登録されていたため、この電話番号の代わりに 名称が表示されたためである。これにより、発信履歴を確認した携帯電話利用者は、 通話相手を容易に特定することができる。  [0047] In order to display the outgoing call history shown in Fig. 5 (a), the display control unit 30 sets the item "outgoing / incoming" in the information list stored in the emotion identifying device 20 shown in Fig. 3 to "outgoing" ”(Corresponding to data a for one call), and the other party whose emotion set in step S401 should be specified, that is, the item“ signal source ”is“ received voice ”. The data of the items “call start time”, “call destination telephone number”, and “emotion” are extracted from the minute data, and the items of the call history in FIG. 5 (a), “date and time”, “name”, “emotion” "Is displayed at the corresponding location. Note that the name “Taro Matsushita” is displayed in the “name” field of the outgoing call history in FIG. 5 (a) instead of the telephone number “09000000000”! This is because the name was displayed instead of this telephone number because it was registered in the phone book application 303 as personal information of “Taro”. As a result, the mobile phone user who has confirmed the call history can easily identify the other party.
[0048] なお、図 5 (a)では、通話相手の感情を表示するようにしたが、携帯電話利用者自 身の感情を表示する構成や、通話相手の感情と携帯電話利用者自身の感情とを同 時に表示する構成であっても構わな!/ヽ。携帯電話利用者自身の感情を表示する場 合には、図 3に示す感情特定装置 20が記憶する情報一覧のうちの、項目「信号源」 力 S「送話音声」である感情を抽出し、表示させれば良い。また、感情を、その感情を表 現する文字 (哀しみや愛情など)とその感情を表すマークによって表示するようにした 力 この 2つを両方表示する必要は必ずしも無ぐどちらか一方でも構わないし、また 別の表示形態 (静止画像、動画像画像など)によって表示しても構わない。 [0049] 表示制御部 30はまた、図 5 (b)に示す着信履歴を表示するために、図 3に示す感 情特定装置 20が記憶する情報一覧のうちの、項目「発着信」が「着信」のもの(1通話 分のデータ b、 cが該当)で、さらに、ステップ S401で設定された感情を特定すべき対 象である通話相手、すなわち項目「信号源」が「受話音声」である 1通話分のデータか ら、項目「通話開始時刻」、「通話先電話番号」、「感情」のデータを抽出し、図 5 (b)の 着信履歴の項目「着信日時」、「名称」、「感情」の該当箇所に表示させる。なお、図 5 (b)の着信履歴の項目「名称」には、通話先電話番号「09000001111」ではなく名 称「松下次郎」が表示されて!、るが、これは上述した理由と同様である。 [0048] In FIG. 5 (a), the emotion of the other party is displayed. However, the mobile phone user's own emotion is displayed, and the other party's emotion and the cellular phone user's own emotion are displayed. It may be configured to display both at the same time! / ヽ. When displaying the emotion of the mobile phone user himself / herself, the emotion of the item “signal source” force S “sending speech” is extracted from the information list stored in the emotion identification device 20 shown in FIG. , Display. Also, it is not necessary to display both of these emotions with the characters that express those emotions (sorrow, affection, etc.) and the marks that represent those emotions. Further, it may be displayed in another display form (still image, moving image, etc.). [0049] In order to display the incoming call history shown in FIG. 5 (b), the display control unit 30 also sets the item "outgoing / incoming" in the information list stored in the emotion identification device 20 shown in FIG. 
"incoming" (calls b and c in Fig. 3 qualify) and whose item "signal source" is "received voice", i.e. records for the call partner selected in step S401 as the target of emotion identification. From each such call record it extracts the items "call start time", "callee telephone number", and "emotion", and displays them in the corresponding "date and time received", "name", and "emotion" fields of the incoming call history in Fig. 5(b). Note that the "name" field of the incoming call history in Fig. 5(b) shows the name "Jiro Matsushita" rather than the telephone number "09000001111", for the same reason as described above.
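The extraction behind Fig. 5(b), filtering the stored records to incoming calls whose signal source is the received voice and substituting a registered name for the raw number, can be sketched as follows. The record layout and field names are hypothetical, loosely mirroring the Fig. 3 information list.

```python
# Each dict mirrors one row of the Fig. 3 information list (names hypothetical).
records = [
    {"direction": "outgoing", "start": "2005-09-09 10:00", "peer": "09000000000",
     "source": "received", "emotion": "sadness"},
    {"direction": "incoming", "start": "2005-09-10 18:30", "peer": "09000001111",
     "source": "received", "emotion": "affection"},
]
phone_book = {"09000000000": "Taro Matsushita", "09000001111": "Jiro Matsushita"}

def incoming_history(records, phone_book):
    """Keep incoming calls whose signal source is the received voice (the
    partner's emotion) and replace the number with a registered name."""
    rows = []
    for r in records:
        if r["direction"] == "incoming" and r["source"] == "received":
            name = phone_book.get(r["peer"], r["peer"])  # fall back to raw number
            rows.append((r["start"], name, r["emotion"]))
    return rows
```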
[0050] 本発明の第1実施形態の携帯電話による通話履歴表示により、この通話履歴を一目見ただけで、過去の通話内容によって通話相手または携帯電話利用者がどのような感情であったかを判断することができるため、携帯電話利用者が手間をかけること無しに、過去の通話内容を思い出すことを支援することができる。 [0050] With the call history display of the mobile phone according to the first embodiment of the present invention, a single glance at the call history shows what emotion the call partner or the mobile phone user felt during each past call, which helps the mobile phone user recall the content of past calls without any extra effort.
[0051] [スケジュール帳を利用した表示形式]  [0051] [Display format using schedule book]
次に、本発明の第 1実施形態の携帯電話によるスケジュールを表示する処理につ いて説明する。スケジュールを表示する処理フローは、上述したステップ S407、 S40 8で通話履歴アプリケーション 301を起動、実行した代わりに、スケジューラアプリケ ーシヨン 302を起動、実行する以外、図 4と共通である。図 6に、本発明の第 1実施形 態の携帯電話によるスケジュール帳の表示例を示す。図 6 (a)は、スケジュール帳に より表示したカレンダーであり、図 6 (b)は、カレンダーの各日付毎の感情を表示した 表示例であり、図 6 (c)は、特定の日付における感情の変遷を表示した表示例である  Next, processing for displaying a schedule by the mobile phone according to the first embodiment of the present invention will be described. The processing flow for displaying the schedule is the same as that in FIG. 4 except that the scheduler application 302 is started and executed instead of starting and executing the call history application 301 in steps S407 and S408 described above. FIG. 6 shows a display example of the schedule book by the mobile phone according to the first embodiment of the present invention. Fig. 6 (a) is a calendar displayed from the schedule book, Fig. 6 (b) is a display example displaying emotions for each date in the calendar, and Fig. 6 (c) is a display on a specific date. It is a display example that displays the transition of emotion
[0052] 表示制御部 40は、本発明の携帯電話に備わる操作キーによる入力操作を携帯電 話利用者力も受け付けることによって、スケジューラアプリケーション 302を起動する と (ステップ S407、 YES)、まず図 6 (a)に示すカレンダーを表示する。さら〖こ、表示 制御部 40は、操作キーにより各日付における携帯電話利用者自身の感情を表示す るよう指示する入力操作を携帯電話利用者から受け付けると、図 3に示す感情特定 装置 20が記憶する情報一覧のうちの、感情を特定すべき対象である携帯電話利用 者自身の感情、すなわち項目「信号源」が「送話音声」である 1通話分のデータを抽 出する。図 3に示す感情特定装置 20が記憶する情報一覧では、 2005年 9月 9日に おける通話で特定された携帯電話利用者自身の感情は「喜び」であり、 2005年 9月 10日における通話で特定された携帯電話利用者自身の感情は「通常」であるため、 表示制御部 40は、該当する日付の欄のうち、 2005年 9月 9日には喜びの感情を表 すマークを、 2005年 9月 10日には通常の感情を表すマークを、それぞれ表示させる 。他の日付についても、これらの処理を同様に行ない、図 6 (b)に示すようにカレンダ 一の各日付毎の感情を表示させる。 [0052] When the display control unit 40 starts up the scheduler application 302 by accepting the input operation by the operation key provided in the mobile phone of the present invention also by the mobile phone user power (step S407, YES), first, the display control unit 40 in FIG. Display the calendar shown in a). Furthermore, when the display control unit 40 receives from the mobile phone user an input operation instructing to display the mobile phone user's own emotions on each date using the operation keys, the emotion identifying device 20 shown in FIG. From the list of stored information, extract the emotion of the mobile phone user who is the target of emotion identification, that is, the data for one call whose item “Signal Source” is “Transmission Voice”. Put out. In the list of information stored in the emotion identification device 20 shown in Fig. 3, the emotion of the mobile phone user identified by the call on September 9, 2005 is “joy”. Since the mobile phone user's own emotion identified in (1) is “normal”, the display control unit 40 displays a mark indicating a feeling of joy on September 9, 2005, in the corresponding date column. On September 10, 2005, a mark representing normal feeling is displayed. For other dates, these processes are performed in the same way, and the emotions for each date in the calendar are displayed as shown in Fig. 6 (b).
[0053] なお、図 3に示す感情特定装置 20が記憶する情報一覧では、携帯電話利用者自 身の感情が 2005年 9月 9日、 2005年 9月 10日それぞれにおいて 1度の通話で特定 された 1つの感情しかな力つたため、各日付にその 1つの感情を表すマークを表示す るようにすれば良力つた。しかし、 1つの日付に、複数の通話それぞれで特定された 複数の異なる感情が情報一覧中にある場合も考えられる。このような場合、表示制御 部 40は、 1つの日付における複数の通話の中で、通話日時が最も新しい通話によつ て特定された感情や、最も通話時間の長い通話によって特定された感情や、最も特 定された回数の多かった感情、をその日付における総合的な感情として表示部 50に 表示させる。なお、ここでは、ある一日における総合的な感情を表示させる構成につ いて説明したが、 1週間、 1ヶ月単位などの所定の期間における総合的な感情を表示 させても構わない。 [0053] In the list of information stored in emotion identification device 20 shown in Fig. 3, the emotions of mobile phone users are identified by one call on September 9, 2005 and September 10, 2005, respectively. Since there was only one emotion that was given power, it would be good to display a mark representing that one emotion on each date. However, there may be multiple different emotions identified in each call on the same date in the information list. In such a case, the display control unit 40, among a plurality of calls on a single date, the emotion specified by the call with the newest call date and time, the emotion specified by the call with the longest call time, The emotion that has been identified most frequently is displayed on the display unit 50 as a total emotion on that date. In addition, although the structure which displays the total feeling in a certain day was demonstrated here, you may display the total feeling in predetermined periods, such as a week and a 1-month unit.
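Choosing one overall emotion for a date that has several calls can follow any of the criteria just mentioned (the newest call, the longest call, or the most frequently identified emotion). The sketch below implements only the most-frequent criterion and is an assumption-laden illustration, not the patent's code:

```python
from collections import Counter

def overall_emotion_for_date(day_records):
    """Pick one overall emotion for a date from that date's per-call emotions.
    This sketch uses the most-frequent-emotion criterion; with no calls on
    the date, fall back to "neutral" (an assumed default)."""
    if not day_records:
        return "neutral"
    counts = Counter(r["emotion"] for r in day_records)
    return counts.most_common(1)[0][0]

# Hypothetical calls on one date: joy was identified twice, anger once.
sept9 = [{"emotion": "joy"}, {"emotion": "joy"}, {"emotion": "anger"}]
```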
[0054] または、 1つの日付に、複数の通話それぞれで特定された複数の異なる感情が情 報一覧中にある場合、各通話毎に特定された感情に対して、その感情の度合いを設 定するために重み付けを行な 、、その感情の度合 、に基づ 、て総合的な感情を特 定することも考えられる。表示制御部 40は、各通話毎に特定された感情に対して、通 話日時が新しいほどまたは通話時間が長いほどその感情の度合いを示す数値が大 きくなるよう重み付けを行い (例えば、通話日時力 総合的な感情を表示させる現在 日時を引いた差分の秒数分をその感情の度合いとして数値ィ匕したり、通話時間の秒 数分をその感情の度合!、として数値化する)、数値化された感情の度合!ヽ (各通話 毎に特定された感情を数値ィ匕し、各感情毎にそれらの数値を合計したものであって もよ 、;)のうち、ある閾値を越えかつ最も大き 、数値によってその度合 、が示されて!/ヽ る感情を総合的な感情として表示させ、一方、重み付けされた感情に所定の閾値を 越えるものが無ければ、「通常」を総合的な感情として表示させる。この構成によれば[0054] Alternatively, if there are multiple different emotions specified in each of multiple calls in the information list on one date, the level of the emotion is set for the emotion specified for each call. In order to do so, weighting may be performed, and based on the degree of the emotion, it may be possible to identify the total emotion. The display control unit 40 weights the emotion specified for each call so that the newer the call date or the longer the call time, the larger the numerical value indicating the degree of that emotion (for example, the call date and time). Force to display the total emotion The number of seconds of the difference minus the date and time is numerically expressed as the degree of that emotion, or the number of seconds of the call time is digitized as the degree of that emotion!), Numerical value The degree of emotion that has been converted into a certain level! ヽ (which may be a numerical value of the emotions specified for each call and the sum of those values for each emotion;) The maximum size is indicated by the numerical value! / ヽ If the weighted emotion does not exceed the predetermined threshold, “normal” is displayed as the total emotion. According to this configuration
、ある 1日に行なわれた全ての通話毎に特定される感情のうち、特に感情の度合いが 大きいものをその 1日の総合的な感情として表示するため、ある 1日の総合的な感情 を特定する精度を向上することができる。 In order to display the feelings that are particularly strong among the feelings identified for every call made on a certain day as the total feelings for that day, The accuracy of identification can be improved.
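The duration-based weighting described above can be sketched as follows: each call contributes its call time in seconds as the degree of its identified emotion, the degrees are totalled per emotion, and the largest total is reported only if it exceeds a threshold, with "normal" (neutral) as the fallback. All names are hypothetical and this is only one of the weightings the text suggests:

```python
def overall_emotion_weighted(calls, threshold):
    """Total each emotion's degree (here: call duration in seconds) across
    the calls, then return the emotion with the largest total if that total
    exceeds the threshold; otherwise return "neutral"."""
    totals = {}
    for c in calls:
        totals[c["emotion"]] = totals.get(c["emotion"], 0) + c["duration_sec"]
    best = max(totals, key=totals.get) if totals else None
    if best is not None and totals[best] > threshold:
        return best
    return "neutral"

# Hypothetical calls: sadness dominates by total talk time (600 s vs 210 s).
calls = [
    {"emotion": "joy", "duration_sec": 120},
    {"emotion": "sadness", "duration_sec": 600},
    {"emotion": "joy", "duration_sec": 90},
]
```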
[0055] なお、各感情の度合いを示す数値に基づいて上述の閾値を算出し、設定すること により、普段の生活の中で表現され難い感情 (例えば、愛情、喜び、怒り、哀しみ、二 ユートラル力 構成される感情情報の因子のうち、「愛情」という感情は普段の生活の 中で表現され難い感情である)がある通話から特定された場合、その表現され難い感 情が総合的な感情として表示され易くすることもできる。例えば、閾値を数 1に示す式 により算出する。  [0055] By calculating and setting the above threshold based on the numerical value indicating the degree of each emotion, emotions that are difficult to express in everyday life (for example, affection, joy, anger, sadness, two neutrals) If the emotion is identified from a phone call, the emotion of “affection” is an emotion that is difficult to express in everyday life). Can be easily displayed. For example, the threshold is calculated by the formula shown in Equation 1.
[数 1]  [Number 1]
閾値 = ( ( [喜びの度合い] × [喜びの重み] ) + ( [怒りの度合い] × [怒りの重み] ) + ( [哀しみの度合い] × [哀しみの重み] ) + ( [愛情の度合い] × [愛情の重み] ) ) ÷ ( [喜びの度合い] + [怒りの度合い] + [哀しみの度合い] + [愛情の度合い] ) × [閾値の重み]
Threshold = ( ( [degree of joy] × [weight of joy] ) + ( [degree of anger] × [weight of anger] ) + ( [degree of sadness] × [weight of sadness] ) + ( [degree of affection] × [weight of affection] ) ) ÷ ( [degree of joy] + [degree of anger] + [degree of sadness] + [degree of affection] ) × [threshold weight]
[0056] ここで、「喜びの重み」、「怒りの重み」、「悲しみの重み」および「愛情の重み」の数 値は、生活の中で表現され難い感情 (すなわち「愛情」)に対しては大きくなり、逆に 生活の中で表現され易い感情に対しては小さくなり、また、「閾値の重み」の数値は、 閾値の初期値である。例えば、「愛情の重み」を大きくすることにより、複数の通話で 特定された感情に「愛情」が含まれている場合に算出される閾値が大きくなり、その結 果、数値ィヒされた「愛情」の度合いはこの閾値を越え易ぐ逆に「愛情」以外の感情の 度合いはこの閾値を越え難くなるため、「愛情」が総合的な感情として特定され易くな る。このように閾値を設定することにより、各通話毎において喜びば力りが特定されて しまう所有者個人であっても、数回の通話において愛情が特定されることで、総合的 な感情を愛情とすることができ、感情の偏りを軽減することができる。 [0056] Here, the numbers of "joy weight", "anger weight", "sadness weight", and "love weight" are used for emotions that are difficult to express in life (ie, "love"). On the contrary, it becomes smaller for emotions that are easily expressed in life, and the value of “threshold weight” is the initial value of the threshold. For example, by increasing the “weight of love”, the threshold value calculated when “affection” is included in the emotions identified in multiple calls is increased, and as a result, the numerical value “ The degree of “affection” easily exceeds this threshold, but the degree of emotions other than “affection” does not easily exceed this threshold, so it is easy to identify “affection” as a comprehensive emotion. By setting the threshold in this way, even if the owner is an individual whose joy and power is specified for each call, the affection is specified in several calls, so that the overall feeling can be expressed in love. And can reduce emotional bias.
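A direct reading of Equation 1 is a weighted mean of the four emotion degrees, scaled by the initial threshold value (the "threshold weight"). The sketch below implements that formula; the particular weight values are hypothetical, chosen only to show how a heavy "affection" weight raises the threshold:

```python
def equation1_threshold(degrees, factor_weights, threshold_weight):
    """Equation 1: the weighted mean of the four emotion degrees,
    multiplied by the threshold's initial value."""
    factors = ["joy", "anger", "sadness", "affection"]
    numerator = sum(degrees[f] * factor_weights[f] for f in factors)
    denominator = sum(degrees[f] for f in factors)
    return numerator / denominator * threshold_weight

# Hypothetical degrees and weights; "affection" is rarely expressed in
# everyday life, so it carries the heaviest weight here.
degrees = {"joy": 10.0, "anger": 5.0, "sadness": 5.0, "affection": 20.0}
weights = {"joy": 0.5, "anger": 0.5, "sadness": 0.5, "affection": 2.0}
threshold = equation1_threshold(degrees, weights, threshold_weight=1.0)
```

With all factor weights equal to 1.0 the weighted mean collapses to 1.0, so the threshold reduces to the initial threshold value, which matches the text's description of "threshold weight" as the threshold's initial value.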
[0057] 表示制御部 40はまた、操作キーにより特定の日付を選択し(図 6 (b)では、 9月 2を 選択して!/、る)、その日付における携帯電話利用者の感情の変遷を表示するよう指 示する入力操作を携帯電話利用者から受け付けると、図 3に示す感情特定装置 20 が記憶する情報一覧のうちの、項目「通話開始時刻」がその特定の日付であるもので 、かつ感情を特定すべき対象である携帯電話利用者自身の感情、すなわち項目「信 号源」が「送話音声」である 1通話分のデータを抽出する。表示制御部 40は、抽出し た 1通話分のデータにおける項目「通話開始時刻」、「感情」を参照して、例えば図 6 ( c)に示すように、 9月 2日における携帯電話利用者自身の感情を時系列に沿って表 示する。 [0057] The display control unit 40 also selects a specific date using the operation keys (in FIG. 6 (b), select September 2! /), And the emotion of the mobile phone user on that date is displayed. When an input operation instructing to display the transition is received from the mobile phone user, the item “call start time” in the information list stored in the emotion identification device 20 shown in FIG. 3 is that specific date. so In addition, the mobile phone user's own emotions for which emotions are to be identified, that is, data for one call whose item “signal source” is “sending speech” are extracted. The display control unit 40 refers to the items “call start time” and “emotion” in the extracted data for one call, and for example, as shown in FIG. Display your feelings in chronological order.
[0058] なお、表示制御部 40は、図 6 (b)、 (c)に示すように、携帯電話利用者の感情を表 すマークを表示するようにした力 他者の感情を表すマークを表示するようにしても良 い。このとき、表示制御部 40は、操作キーにより各日付におけるある他者の感情を表 示するよう指示する入力操作を携帯電話利用者から受け付けると (このとき、他者は、 この入力操作時に併せて入力される電話番号によって特定され、あるいは特定した い他者が既に電話帳に登録されている場合は、電話帳に登録された電話番号によ つて特定される)、図 3に示す感情特定装置 20が記憶する情報一覧のうちの他者の 感情、すなわち項目「信号源」が「受話音声」のもので、かつ項目「通話先電話番号」 が入力された電話番号と合致するものを抽出する。  [0058] Note that the display control unit 40 displays a mark representing the emotion of the mobile phone user as shown in FIGS. 6 (b) and 6 (c). It may be displayed. At this time, if the display control unit 40 receives an input operation from the mobile phone user instructing to display an emotion of the other person on each date by the operation key (at this time, the other person also includes the time of the input operation). Specified by the phone number entered or specified by the phone number registered in the phone book if the other person to be identified is already registered in the phone book) From the list of information stored in device 20, extract the emotions of others, that is, the item “Signal Source” is “Received Voice” and the item “Destination Phone Number” matches the entered phone number. To do.
[0059] 本発明の第1実施形態の携帯電話によるスケジュール表示により、このスケジュールを一目見ただけで、過去に行なった特定の他者との通話においてその他者または携帯電話利用者がどのような感情であったかを判断することができるため、携帯電話利用者が手間をかけること無しに、過去の通話相手とのことを思い出すことを支援することができる。 [0059] With the schedule display of the mobile phone according to the first embodiment of the present invention, a single glance at the schedule shows what emotion the other person or the mobile phone user felt during past calls with that specific person, which helps the mobile phone user recall past call partners without any extra effort.
[0060] [電話帳を利用した表示形式]  [0060] [Display format using phone book]
次に、本発明の第 1実施形態の携帯電話による電話帳を表示する処理について説 明する。電話帳を表示する処理フローは、上述したステップ S407、 S408で通話履 歴アプリケーション 301を起動、実行した代わりに、電話帳アプリケーション 303を起 動、実行する以外、図 4と共通である。図 7に、本発明の第 1実施形態の携帯電話に よる電話帳の表示例を示す。図 7 (a)は、電話帳に登録された各個人の名称と電話 番号の表示例であり、図 7 (b)は、電話帳に登録された各個人毎の感情の表示例で あり、図 7 (c)は、ある個人の感情通話状況の表示例である。  Next, processing for displaying a telephone directory by the mobile phone according to the first embodiment of the present invention will be described. The processing flow for displaying the phone book is the same as that in FIG. 4 except that the phone book application 303 is started and executed instead of starting and executing the call history application 301 in steps S407 and S408 described above. FIG. 7 shows a display example of the telephone directory by the mobile phone according to the first embodiment of the present invention. Fig. 7 (a) is a display example of the name and phone number of each person registered in the phone book, and Fig. 7 (b) is a display example of emotions for each individual registered in the phone book. Fig. 7 (c) shows a display example of an individual's emotional call status.
[0061] 表示制御部 40は、本発明の携帯電話に備わる操作キーによる入力操作を携帯電 話利用者力も受け付けることによって、電話帳アプリケーション 303を起動すると (ス テツプ S407、 YES)、例えば図 7 (a)に示すように、名称の先頭文字がマ行である個 人の名称とその個人の電話番号とを、各個人毎に表示する。さらに、表示制御部 40 は、操作キーにより表示中の個人の感情を表示するよう指示する入力操作を携帯電 話利用者力 受け付けると、図 3に示す感情特定装置 20が記憶する情報一覧のうち の、感情を特定すべき対象である表示中の個人、すなわち項目「信号源」が「受話音 声」で、かつ項目「通話先電話番号」がその個人に登録されている電話番号と合致す る 1通話分のデータを抽出する。図 3に示す感情特定装置 20が記憶する情報一覧で は、通話先電話番号「09000000000」が表示中の名称「松下太郎」の電話番号と 合致し、通話先電話番号「09000001111」が表示中の名称「松下次郎」の電話番 号と合致するため、表示制御部 40は、該当する「松下太郎」、「松下次郎」の欄に、抽 出した感情「哀しみ」、「愛情」を表すマークを表示させる。「松下花子」などの他の名 称についても、これらの処理を同様に行ない、図 7 (b)に示すように電話帳に登録さ れた各個人毎に、その個人の過去の通話から得られた感情を表示する。 [0061] The display control unit 40 performs an input operation using an operation key provided in the mobile phone of the present invention. When the phone book application 303 is started by accepting the talk user power (step S407, YES), for example, as shown in Fig. 7 (a), the name of the person whose name begins with a line and its individual Are displayed for each individual. Further, when the display control unit 40 accepts an input operation for instructing to display the personal emotion being displayed by the operation key, the mobile phone user power is received, among the information lists stored in the emotion identification device 20 shown in FIG. Of the person whose emotion is to be identified, that is, the item “Signal source” is “Received voice” and the item “Called phone number” matches the phone number registered for that individual. Extract data for one call. In the list of information stored in the emotion identification device 20 shown in FIG. 3, the telephone number “09000000000” matches the telephone number of the displayed name “Taro Matsushita” and the telephone number “09000001111” is displayed. In order to match the telephone number of the name “Jiro Matsushita”, the display control unit 40 puts marks representing the extracted emotions “sorrow” and “love” in the corresponding “Taro Matsushita” and “Jiro Matsushita” fields. Display. 
The same processing is performed for the other names such as "Hanako Matsushita", and, as shown in Fig. 7(b), the emotion obtained from past calls with each individual registered in the phone book is displayed for that individual.
[0062] なお、図 3に示す感情特定装置 20が記憶する情報一覧では、表示中の個人の感 情が、 1度の通話で特定された 1つの感情しかな力つたため、その 1つの感情を表す マークを表示するようにすれば良力つた。しかし、ある個人の感情において、複数の 通話それぞれによって特定された複数の異なる感情が情報一覧中にある場合も考え られる。このような場合、表示制御部 40は、ある個人の感情の中で、通話日時が最も 新 、通話によって特定された感情や、最も通話時間の長 、通話によって特定され た感情や、最も特定された回数の多カゝつた感情、をその個人の総合的な感情として 表示部 50に表示させるようにしてもよい。また、表示制御部 40は、図 7 (c)のある個 人の感情通話状況の表示例に示すように、ある個人との複数の通話によって特定さ れた感情の出現頻度を表示するようにしても良い。図 7 (c)では、操作キーにより電話 帳に表示中の特定の個人を選択し (7 (b)では、「松下太郎」を選択している)、選択 した個人との通話 25回のうち、それぞれの感情が特定された回数 (および、全通話 回数に占める割合)を示して!/、る。  [0062] In the information list stored in emotion identification device 20 shown in Fig. 3, since the emotion of the individual being displayed has only one emotion identified in one call, that emotion It would be good to display the mark that represents. However, there may be cases where a person's emotions have multiple different emotions in the information list identified by multiple calls. In such a case, the display control unit 40 has the latest call date and time, the emotion specified by the call, the longest call time, the emotion specified by the call, and the most specific among the emotions of a certain individual. It is also possible to cause the display unit 50 to display a number of emotions that have been repeated many times as the overall emotion of the individual. Further, the display control unit 40 displays the appearance frequency of emotions specified by a plurality of calls with an individual as shown in the display example of the emotional call status of an individual in FIG. 7 (c). May be. In Fig. 7 (c), a specific individual displayed in the phone book is selected by the operation key ("Taro Matsushita" is selected in 7 (b)) and out of 25 calls with the selected individual. Show how many times each emotion was identified (and the percentage of all calls)! /
[0063] または、ある個人の感情において、複数の通話それぞれによって特定された複数 の異なる感情が情報一覧中にある場合、その個人との各通話毎に特定された感情に 対して、その感情の度合いを設定するために重み付けを行ない、感情の度合いから 総合的な感情を特定することも考えられる。表示制御部 40は、その個人との各通話 毎に特定された感情に対して、通話日時が新しいほどまたは通話時間が長いほどそ の感情の度合いを示す数値が大きくなるよう重み付けを行い(例えば、通話日時から 総合的な感情を表示させる現在日時を引いた差分の秒数分をその感情の度合いと して数値ィ匕したり、通話時間の秒数分をその感情の度合いとして数値ィ匕する)、数値 化された感情の度合!/ヽ (各通話毎に特定された感情を数値化し、各感情毎にそれら の数値を合計したものであってもよい)のうち、ある閾値を越えかつ最も大きい数値に よってその度合 、が示されて 、る感情をその個人の総合的な感情として表示させ、 一方、重み付けされた感情に所定の閾値を越えるものが無ければ、「通常」をその個 人の総合的な感情として表示させる。この構成によれば、ある個人との間で行なわれ た全ての通話毎に特定される感情のうち、特に感情の度合いが大きいものをその個 人の総合的な感情として表示するため、ある個人の総合的な感情を特定する精度を 向上することができる。なお、各感情の度合いを示す数値に基づいて上述の閾値を 算出し、設定することにより、普段の生活の中で表現され難い感情 (例えば、愛情、 喜び、怒り、哀しみ、ニュートラルカも構成される感情情報の因子のうち、「愛情」とい う感情は普段の生活の中で表現され難!、感情である)がある通話から特定された場 合、その表現され難い感情が総合的な感情として表示され易くすることもできる (例え ば、閾値を数 1に示す式により算出する。 [0063] Alternatively, in a certain individual emotion, a plurality of specified by each of a plurality of calls If there are different emotions in the information list, the emotion specified for each call with the individual is weighted to set the level of that emotion, and the overall emotion is identified from the level of emotion. It is also possible to do. The display control unit 40 weights the emotion specified for each call with the individual so that the newer the call date and time or the longer the call time, the larger the numerical value indicating the degree of the emotion (for example, The number of seconds, which is the difference between subtracting the current date and time from the date and time of the call and subtracting the current date and time, is used as the degree of emotion, or the number of seconds of the call time is used as the degree of emotion. ), The degree of emotion expressed in numbers! Of 数 値 / ヽ (which may be a numerical value of emotions specified for each call and the sum of those values for each emotion), the degree is indicated by the largest value exceeding a certain threshold. 
If no weighted emotion exceeds the predetermined threshold, "normal" is displayed as the individual's overall emotion. With this configuration, among the emotions identified for all calls made with a certain individual, one whose degree is particularly large is displayed as that individual's overall emotion, which improves the accuracy of identifying the overall emotion. Also, by calculating and setting the above threshold from the numerical values indicating the degree of each emotion, an emotion that is rarely expressed in everyday life (among the emotion factors of affection, joy, anger, sadness, and neutral, "affection" is such an emotion) can be made easier to display as the overall emotion when it is identified from some call (for example, the threshold is calculated by the formula shown in Equation 1).
[0064] 本発明の第1実施形態の携帯電話による電話帳表示により、この電話帳を一目見ただけで、過去に特定の他者と行なった通話においてその他者がどのような感情であったかを判断することができるため、携帯電話利用者が手間をかけること無しに、過去の通話相手とのことを思い出すことを支援することができる。 [0064] With the phone book display of the mobile phone according to the first embodiment of the present invention, a single glance at the phone book shows what emotion a specific other person felt during past calls, which helps the mobile phone user recall past call partners without any extra effort.
[0065] [その他の表示形式]  [0065] [Other display formats]
上述の本発明の第 1実施形態の携帯電話では、 PIMアプリケーション実行時の表 示形式に感情を表す情報を追加表示する例にっ 、て詳細に説明したが、これらの P IMアプリケーション実行時の表示形式を利用しなくても、感情特定装置 20が記憶す る情報一覧にあるデータのうち、所定の条件を満たすデータを抽出し、その抽出した データやそのデータ力 算出される計算結果などを表示して、過去に他者と行なった 通話においてその他者のがどのような感情であつたかを、概略的に携帯電話利用者 に知らせることもできる。図 8には、本発明の第 1実施形態の携帯電話による感情毎 のソート例を示す。 In the mobile phone according to the first embodiment of the present invention described above, the example in which information representing emotion is additionally displayed in the display format when executing the PIM application has been described in detail. Even without using the display format, the emotion identification device 20 remembers Extract data that satisfies the specified conditions from the data in the list of information, display the extracted data and the calculation results of the data power, etc. It is also possible to inform mobile phone users about how their feelings are. FIG. 8 shows an example of sorting for each emotion by the mobile phone according to the first embodiment of the present invention.
[0066] 表示制御部 40は、本発明の携帯電話に備わる操作キーにより、過去の通話におい て感情が「哀しみ」であった通話相手の一覧を表示するよう指示する入力操作を携帯 電話利用者から受け付けると、図 3に示す感情特定装置 20が記憶する情報一覧の「 通話先電話番号」のうち、項目「信号源」が「受話音声」で、かつ項目「感情」が「哀し み」であるものを抽出する。表示制御部 40はさらに、既に抽出済みの通話先電話番 号を再度抽出するときには、その通話先電話番号を抽出した抽出回数をカウントする 。表示制御部 40は、情報一覧中の該当する全ての通話先電話番号を抽出すると、 図 8に示すように、抽出回数が多い順に通話先電話番号に該当する個人名称 (個人 名称は、通話先電話番号が既に電話帳アプリケーション 303に登録されていれば、 その電話番号から特定される)とその抽出回数を表示させる。なお、図 8の表示例で は、個人名称と抽出回数に加えて、抽出した通話先電話番号との全通話回数と、そ の全通話回数に占める抽出回数の割合も合わせて表示している。このような情報を 表示することにより、携帯電話利用者は、過去に特定の他者と行なった通話におい てその他者のがどのような感情であつたかを、概略的に知ることができる。  [0066] The display control unit 40 uses the operation keys provided on the mobile phone of the present invention to perform an input operation to instruct the mobile phone user to display a list of call partners whose emotions were "sad" in past calls. Of the information list stored in the emotion identification device 20 shown in FIG. 3, the item “Signal source” is “Received voice” and the item “Emotion” is “Sorrow”. Extract something. The display control unit 40 further counts the number of times the callee telephone number has been extracted when the callee telephone number that has already been extracted is extracted again. When the display control unit 40 extracts all the corresponding telephone numbers in the information list, as shown in FIG. 8, the personal names corresponding to the telephone numbers in the descending order of the number of extractions (individual names are called If the phone number is already registered in the phone book application 303, the phone number is identified) and the number of times of extraction is displayed. In addition, in the display example of FIG. 8, in addition to the individual name and the number of extractions, the total number of calls with the extracted call destination telephone number and the ratio of the number of extractions to the total number of calls are also displayed. . 
By displaying such information, the mobile phone user can get an overview of what emotions a specific other person felt in past calls with that person.
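The extraction-and-sort behavior of [0066] can be sketched as follows. This is a minimal illustration: the record layout, the field names, and the `rank_by_emotion` helper are assumptions introduced here, not structures taken from the patent.

```python
from collections import Counter

# Hypothetical call records modeled on the information list of FIG. 3:
# each record carries the signal source, the identified emotion, and the
# callee telephone number.
records = [
    {"number": "090-1111-1111", "source": "received", "emotion": "sad"},
    {"number": "090-1111-1111", "source": "received", "emotion": "joy"},
    {"number": "090-2222-2222", "source": "received", "emotion": "sad"},
    {"number": "090-1111-1111", "source": "received", "emotion": "sad"},
]

def rank_by_emotion(records, emotion):
    """Return (number, hits, total calls, ratio) sorted by hits, descending."""
    hits = Counter(r["number"] for r in records
                   if r["source"] == "received" and r["emotion"] == emotion)
    totals = Counter(r["number"] for r in records if r["source"] == "received")
    return sorted(
        ((n, c, totals[n], c / totals[n]) for n, c in hits.items()),
        key=lambda row: row[1], reverse=True)

for number, hits, total, ratio in rank_by_emotion(records, "sad"):
    print(number, hits, total, f"{ratio:.0%}")
```

Mapping each number to a personal name via the phone book, as the patent describes, would be a final lookup on the sorted rows.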
[0067] So far, the mobile phone according to the first embodiment of the present invention has been described for the case in which the emotion of the other telephone user who produced the voice and the emotion of the mobile phone user are each estimated from the received voice signal and the transmitted voice signal input during a voice call. The case in which the emotion of the person who wrote a character string is estimated from that character string is described below.
[0068] FIG. 9 shows another configuration diagram of the mobile phone according to the first embodiment of the present invention. This configuration comprises a PIM (Personal Information Manager) application group 30, a display unit 50, a communication device 60, an emotion identification device 70, and a display control unit 80. Components with the same reference numerals as in FIG. 1 have already been described above, so their description is omitted.
[0069] The communication device 60 comprises a mail transmission/reception unit 601 and a character information output unit 602. The mail transmission/reception unit 601 performs mobile radio communication with a mobile radio base station, receiving e-mail addressed to the mail address assigned to the mobile phone of the present invention and sending e-mail from the mobile phone of the present invention to an arbitrary mail address. The character information output unit 602 outputs to the emotion identification device 70 the character data described in an e-mail received via the mail transmission/reception unit 601 or in an e-mail to be sent (at least part of the character data of the character strings written in the title and body of the e-mail). The character information output unit 602 also outputs received e-mail data to the display control unit 80 to have the e-mail displayed, and receives from the display control unit 80 the data of an outgoing e-mail composed on the display unit 50 by operation of the operation keys (not shown) provided on the mobile phone of the present invention.
[0070] The emotion identification device 70 comprises an emotion estimation unit 701, an emotion accumulation unit 702, an emotion identification unit 703, and an emotion information storage unit 704. The emotion estimation unit 701 estimates, from the character data input from the communication device 60, the emotion of the mail author who wrote the character string. For each factor of emotion information consisting of, for example, love, joy, anger, sadness, and neutral (normal), the emotion estimation unit 701 estimates the degree of that emotion expressed as a value of 0, 1, or 2 (0: no emotion, 1: weak emotion, 2: strong emotion), sentence by sentence or phrase by phrase from the beginning of the input character data (which consists of character strings of at least one character, marks expressing an image, and the like; such marks are also called pictograms), and sequentially outputs each estimated value to the emotion accumulation unit 702.
[0071] The emotion accumulation unit 702 accumulates the values for each factor input from the emotion estimation unit 701 in association with the order in which they were input. When the emotion accumulation unit 702 has received a complete series of per-factor values from the emotion estimation unit 701 (here, "series" refers to the values from the start to the end of value input from the emotion estimation unit 701 to the emotion accumulation unit 702), it stores that series of per-factor values as one unit of data (hereinafter, this unit of data is referred to as the data for one mail).
[0072] The emotion identification unit 703 reads the data for one mail from the emotion accumulation unit 702, analyzes the values for each factor, identifies one characteristic factor from the values read, and outputs the emotion represented by that one factor to the emotion information storage unit 704. When identifying one characteristic emotion, the emotion identification unit 703 can take the factor with the largest value in the data for one mail as the characteristic emotion, thereby identifying the emotion with emphasis on the content that left the strongest impression in the mail. As another identification method, taking as the characteristic emotion the factor whose values, summed from the beginning to the end of the mail, are largest identifies the emotion with emphasis on the content of the mail as a whole; further, taking as the characteristic emotion the factor with the largest value at the end of the mail text identifies the emotion with emphasis on the lingering impression of the mail's content.
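The three identification strategies of [0072] can be sketched over the per-sentence scores of [0070]. The factor names and the score layout below are illustrative assumptions; the patent does not fix a data structure.

```python
# Per-sentence scores for one mail, one list per emotion factor, each
# entry being the 0/1/2 degree estimated for that sentence (see [0070]).
mail_scores = {
    "love":    [0, 1, 0, 0],
    "joy":     [1, 2, 0, 1],
    "anger":   [0, 0, 0, 0],
    "sadness": [0, 0, 1, 2],
    "neutral": [1, 0, 1, 0],
}

def by_peak(scores):
    """Strongest single impression: factor with the highest single value."""
    return max(scores, key=lambda f: max(scores[f]))

def by_total(scores):
    """Whole-mail emphasis: factor with the largest sum over the mail."""
    return max(scores, key=lambda f: sum(scores[f]))

def by_ending(scores):
    """Lingering impression: factor scoring highest in the final sentence."""
    return max(scores, key=lambda f: scores[f][-1])
```

For the sample above, `by_total` picks "joy" (sum 4) while `by_ending` picks "sadness", showing how the three policies can disagree on the same mail.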
[0073] The emotion information storage unit 704 receives from the emotion identification unit 703 the emotion represented by one factor, and further receives from the mail transmission/reception unit 601 information about the mail transmission or reception performed by the mail transmission/reception unit 601 of the communication device 60 (whether the mail was sent or received, the mail transmission time or mail reception time, and the destination or source mail address). The emotion information storage unit 704 stores the emotion input from the emotion identification unit 703 in association with the various information input from the mail transmission/reception unit 601. FIG. 10 shows another example of the information list stored by the emotion identification device in the mobile phone according to the first embodiment of the present invention. In FIG. 10, the emotion information storage unit 704 is configured to store, for each unit of data for one mail, the emotion identified from that mail (that is, for an outgoing mail, the emotion of the mobile phone user of the present invention who authored the mail; for a received mail, the emotion of the mail correspondent who authored it).
[0074] The display control unit 80 activates and executes one of the applications of the PIM application group, extracts the data required for the processing of the executed application from the emotion information storage unit 704, and causes the display unit 50 to display various information. Display examples on the display unit 50 when the call history application, the scheduler application, or the phone book application is activated and executed are described later.
[0075] Next, another process performed by the mobile phone according to the first embodiment of the present invention is described with reference to the processing flow for displaying the mail reception history, shown in FIG. 11.

[0076] When the communication device 60 starts mail transmission or mail reception (step S1101, YES), it outputs the character data described in the received e-mail or the e-mail to be sent to the emotion identification device 70 (step S1102). For the character data input from the communication device 60, the emotion identification device 70 performs estimation for each factor of emotion information consisting of love, joy, anger, sadness, and neutral (normal), sentence by sentence or phrase by phrase from the beginning of the input character data (step S1103). When the input of character data from the communication device 60 ends (step S1104, YES), the emotion identification device 70 analyzes the values estimated for each factor from the beginning to the end of that series of character data, identifies one characteristic factor, and stores the emotion represented by that one factor as the emotion of the mail author (step S1105; at this time, as shown in FIG. 10, the information about the mail is stored together with the identified emotion).
[0077] Thereafter, when the display control unit 80 activates the mail history application 304 upon accepting an input operation from the mobile phone user via the operation keys provided on the mobile phone of the present invention (step S1106, YES), it reads, in accordance with the program code of the mail history application 304, the emotion identified for each mail stored in the emotion identification device 70 and the information about that mail, and causes the display unit 50 to display this information in a predetermined display format (step S1107).
[0078] The specific processing by which the display control unit 80 displays the mail history, the schedule book, and the phone book is the same as in [Display format using the call history], [Display format using the schedule book], and [Display format using the phone book] described above, except that some of the names of the data items to be extracted differ (instead of determining the mail author whose emotion is to be identified by "received voice" or "transmitted voice" under the item "signal source" in FIG. 3, whether the author is the mobile phone user or the mail correspondent is determined by "sent" or "received" under the item "sent/received" in FIG. 10; the item "transmission/reception time" in FIG. 10 is used instead of the item "call start time" in FIG. 3 to identify the time a mail was sent or received; and the item "correspondent mail address" in FIG. 10 is used instead of the item "callee telephone number" in FIG. 3 to identify the mail correspondent), so its description is omitted.
Also, when, in [Display format using the schedule book], a single date has multiple different emotions in the information list, each identified from a different mail, or when, in [Display format using the phone book], a single person has multiple different emotions in the information list, each identified from a different mail, the display control unit 80 causes the display unit 50 to display, as the overall emotion for that date or person, the emotion identified from the mail with the most recent reception date and time, the emotion identified from the mail with the largest number of characters in its body, or the emotion identified the greatest number of times among the mails received on that date or from that person.
[0079] Alternatively, when a single date has multiple different emotions in the information list, each identified from a different mail, or a single person has multiple different emotions in the information list, each identified from a different mail, it is also conceivable to weight the emotion identified for each mail so as to set a degree for that emotion, and to identify the overall emotion from those degrees. The display control unit 80 weights the emotion identified for each mail so that the more recent the mail reception date and time, or the larger the number of characters in the mail body, the larger the numerical value indicating the degree of that emotion (for example, the degree of the emotion may be expressed numerically from the number of seconds between the mail reception date and time and the current date and time at which the overall emotion is displayed, or from the data volume of the mail body). Among the emotions thus quantified, the emotion whose degree exceeds a certain threshold and is indicated by the largest value is displayed as the overall emotion; if, on the other hand, no weighted emotion exceeds the predetermined threshold, "normal" is displayed as the overall emotion.
According to this configuration, among the emotions identified for all mails on a given day, the one with a particularly large degree of emotion is displayed as the overall emotion of that day, so the accuracy of identifying the overall emotion of a given day can be improved.
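One plausible reading of the weighting scheme of [0079] is sketched below. The exact weight formula is an assumption (here, body length divided by the mail's age in seconds, so newer and longer mails weigh more); the patent leaves the formula open, and the record layout and `overall_emotion` helper are likewise illustrative.

```python
from datetime import datetime

# Hypothetical per-mail records for one day.
mails = [
    {"emotion": "joy",     "received": datetime(2006, 9, 2, 10, 0), "chars": 120},
    {"emotion": "sadness", "received": datetime(2006, 9, 2, 18, 0), "chars": 400},
]

def overall_emotion(mails, now, threshold):
    """Pick the emotion whose accumulated weight clears `threshold`;
    otherwise fall back to "normal" as described in [0079]."""
    weights = {}
    for m in mails:
        age = (now - m["received"]).total_seconds()
        w = m["chars"] / max(age, 1.0)  # newer / longer mails weigh more
        weights[m["emotion"]] = weights.get(m["emotion"], 0.0) + w
    best = max(weights, key=weights.get, default=None)
    if best is not None and weights[best] > threshold:
        return best
    return "normal"
```

With the sample data and `now = datetime(2006, 9, 2, 20, 0)`, the later, longer "sadness" mail dominates at a low threshold, while a high threshold yields "normal".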
[0080] As described above, the mobile phone according to the first embodiment of the present invention can help the mobile phone user recall the contents of past calls and the contents of mail bodies without forcing the user to spend effort or time. In addition, by notifying the mobile phone user, in conjunction with various existing PIM applications, of the emotions of the mobile phone user and of the correspondents with whom the user communicated using the mobile phone, it can provide an environment in which the user can more easily recall past communications.
[0081] (Second Embodiment)
In the mobile phone according to the second embodiment of the present invention, when the mobile phone user wants to recall the contents of a past call, in addition to displaying the previously identified emotion symbolizing those contents as in the first embodiment, the phone also displays whether recorded data of that call exists if the call voice was recorded, and further plays back the portion of that recorded data that can effectively remind the mobile phone user of the contents of the past call. This allows the mobile phone user to reliably recall the contents of a past call while minimizing the effort and time spent on recalling them. FIG. 12 shows a configuration diagram of the mobile phone according to the second embodiment of the present invention.
[0082] The mobile phone according to the second embodiment of the present invention comprises a call device 10, an emotion identification device 20, a PIM (Personal Information Manager) application group 30, a display control unit 40, a display unit 50, a playback control unit 90, and a speaker 100. The configuration of the mobile phone of the second embodiment adds, to the configuration of the mobile phone of the first embodiment shown in FIG. 1, a voice storage unit 105 of the call device 10, the playback control unit 90, and the speaker 100. With these additions, functions to be processed are added to the communication unit 101 and the voice signal output unit 102 of the call device 10, the emotion information storage unit 204 of the emotion identification device 20, and the display control unit 40; the other components with the same reference numerals as in FIG. 1 are as described in the first embodiment, so their description is omitted.
[0083] The voice signal output unit 102 outputs to the emotion identification device 20 the voice signal received from the other telephone via the communication unit 101, the voice signal of the mobile phone user picked up by the call microphone 104, or both. The voice signal output unit 102 also outputs the voice received from the other telephone to the call speaker 103, and outputs the call voice of the mobile phone user picked up by the call microphone 104 to the communication unit 101. Furthermore, the voice signal output unit 102 outputs to the voice storage unit 105 the voice signal received from the other telephone via the communication unit 101 and the voice signal of the mobile phone user picked up by the call microphone 104.
[0084] When the voice storage unit 105 of the call device 10 is notified by the communication unit 101 that a call with another telephone has started and receives a voice signal from the voice signal output unit 102, it starts recording that voice signal. When the input of the voice signal from the voice signal output unit 102 ends, the voice storage unit 105 ends the recording of the voice signal and notifies the communication unit 101 that the recording is complete. When notifying the voice storage unit 105 that a call with another telephone has started, the communication unit 101 also provides identification information for identifying that call (the time at which the call started or ended may be used as the call's identification information, or a predetermined number may be assigned to each call when it starts; here, the call start time is treated as the identification information identifying the call), and the voice storage unit 105 records this identification information in association with the voice signal being recorded (the voice storage unit 105 thus identifies voice signals by this identification information).
[0085] The emotion information storage unit 204 receives from the emotion identification unit 203 the emotion represented by one factor, and further receives from the communication unit 101 information about the call made by the communication unit 101 of the call device 10 (whether the call was outgoing or incoming, the call start time and call end time, and the identification information of the other telephone (for example, the callee's telephone number)) together with an indication that voice data recording this call by the call device 10 exists. The emotion information storage unit 204 stores the emotion input from the emotion identification unit 203, the various information input from the communication unit 101, and the presence or absence of recorded voice data in association with one another (when the emotion identification unit 203 has identified one characteristic factor from each of the data for one call for the received voice signal and the transmitted voice signal, it also stores which signal source, the received voice signal or the transmitted voice signal, each factor belongs to). FIG. 13 shows the information list stored by the emotion identification device in the mobile phone according to the second embodiment of the present invention. This information list is the information list of FIG. 3 with the item "presence/absence of recorded data" added.
[0086] The playback control unit 90 reads, from among the voice signals stored in the voice storage unit 105, the voice signal specified by identification information (the identification information is input by, for example, an input operation using the operation keys provided on the mobile phone), and outputs it to the speaker 100 for audio output.
[0087] Next, processing by the mobile phone according to the second embodiment of the present invention is described with reference to the processing flow for displaying the call history, shown in FIG. 14. Processing steps assigned the same reference numerals as in the processing flow for displaying the call history by the mobile phone of the first embodiment shown in FIG. 4 are as described in the first embodiment, so their description is omitted.
[0088] When the call device 10 starts a call (step S402, YES), it outputs the received voice signal and the transmitted voice signal to the emotion identification device 20, and further starts recording those voice signals (step S1401). For the voice signal, among those input from the call device 10, whose emotion is to be identified, the emotion identification device 20 continuously estimates, at predetermined time intervals, the degree of emotion for each factor of emotion information consisting of love, joy, anger, sadness, and neutral (normal) (step S404). When the call by the call device 10 ends (step S405, YES), the emotion identification device 20 analyzes the values for each factor estimated during that one call, identifies one characteristic factor from those values, and stores the emotion represented by that one factor as the emotion of the call partner or of the mobile phone user in this call (step S1402; at this time, as shown in FIG. 13, the information about the call and the presence or absence of recorded voice data are stored in association with the identified emotion). When the input of the voice signal ends, the call device 10 ends the recording of the voice signal and records the recorded data in association with identification information for identifying that recorded data (step S1403).
[0089] Thereafter, when the display control unit 40 activates the call history application 301 upon accepting an input operation from the mobile phone user via the operation keys provided on the mobile phone of the present invention (step S407, YES), it reads, in accordance with the program code of the call history application 301, the emotion identified for each call stored in the emotion identification device 20, the information about that call, and the presence or absence of recorded voice data, and causes the display unit 50 to display this information in a predetermined display format (step S1404).
[0090] [Display format using the call history]
FIG. 15 shows a display example of the call history by the mobile phone according to the second embodiment of the present invention. FIG. 15(a) is an outgoing call history generated from the information stored by the emotion identification device 20 shown in FIG. 13, and FIG. 15(b) is an incoming call history generated from the same information.
[0091] To display the outgoing call history shown in FIG. 15(a), the display control unit 40 extracts, from the information list stored by the emotion identification device 20 shown in FIG. 13, the data for one call whose item "outgoing/incoming" is "outgoing" (the data a for one call applies) and, furthermore, whose item "signal source" is "received voice", corresponding to the call partner set in step S401 as the target whose emotion is to be identified. From that data it extracts the data of the items "call start time", "callee telephone number", "emotion", and "presence/absence of recorded data", and displays them in the corresponding places of the items "outgoing date and time", "name", "emotion", and "recorded voice" of the outgoing call history in FIG. 15(a).
[0092] Likewise, to display the incoming call history shown in FIG. 15(b), the display control unit 40 extracts, from the information list stored by the emotion identification device 20 shown in FIG. 13, the data for one call whose item "outgoing/incoming" is "incoming" (the data b and c for one call apply) and, furthermore, whose item "signal source" is "received voice", corresponding to the call partner set in step S401 as the target whose emotion is to be identified. From that data it extracts the data of the items "call start time", "callee telephone number", "emotion", and "presence/absence of recorded data", and displays them in the corresponding places of the items "incoming date and time", "name", "emotion", and "recorded voice" of the incoming call history in FIG. 15(b).
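The filtering described in [0091] and [0092] can be sketched as a single helper over rows modeled on FIG. 13. The row layout and the `call_log` function are assumptions for illustration only.

```python
# Hypothetical rows modeled on the information list of FIG. 13: call
# direction, signal source, call start time, callee number, identified
# emotion, and whether recorded data exists.
history = [
    {"dir": "outgoing", "source": "received", "start": "09/02 10:00",
     "number": "090-1111-1111", "emotion": "joy", "recorded": True},
    {"dir": "incoming", "source": "received", "start": "09/02 18:30",
     "number": "090-2222-2222", "emotion": "sadness", "recorded": False},
]

def call_log(history, direction):
    """Rows for the outgoing or incoming history view of FIG. 15:
    keep calls in the requested direction whose identified emotion is
    the call partner's (signal source "received")."""
    return [(r["start"], r["number"], r["emotion"],
             "recorded" if r["recorded"] else "-")
            for r in history
            if r["dir"] == direction and r["source"] == "received"]
```

Resolving each number to a name via the phone book application, as in FIG. 15, would be a final per-row lookup.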
[0093] [Display format using the schedule book]
Next, processing by which the mobile phone according to the second embodiment of the present invention displays a schedule is described. The processing flow for displaying the schedule is the same as in FIG. 14, except that in steps S407 and S1404 described above the scheduler application 302 is activated and executed instead of the call history application 301. FIG. 16 shows a display example of the schedule book by the mobile phone according to the second embodiment of the present invention. FIG. 16(a) is a calendar displayed by the schedule book, FIG. 16(b) is a display example showing the emotion for each date of the calendar, and FIG. 16(c) is a display example showing the transition of emotions on a specific date. FIGS. 16(a) and 16(b) are the same as FIGS. 6(a) and 6(b) of the first embodiment, and the processing of the display control unit 40 for displaying them is also the same as in the first embodiment, so their description is omitted.
[0094] When the display control unit 40 receives from the mobile-phone user an input operation that selects a specific date with the operation keys (September 2 is selected in Fig. 16(b)) and instructs it to display the transition of the user's emotions on that date, it extracts, from the information list stored by the emotion identification device 20 shown in Fig. 13, the per-call data whose item "call start time" falls on that specific date and whose item "signal source" is "transmitted voice", that is, the data for the mobile-phone user's own emotion, which is the identification target. Referring to the items "call start time", "emotion", and "presence/absence of recorded data" in the extracted per-call data, the display control unit 40 displays, in chronological order, the mobile-phone user's own emotions on September 2 together with the indication "recorded voice: present", as shown for example in Fig. 6(c).
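The schedule-book selection in paragraph [0094], filtering by date and by signal source and then ordering chronologically, can be sketched as follows. The field names and date format are illustrative assumptions, not the actual layout of the information list:

```python
from datetime import datetime

# Hypothetical per-call records; field names are assumed for illustration.
calls = [
    {"call_start": "2005-09-02 18:05", "signal_source": "transmitted_voice",
     "emotion": "anger", "has_recording": True},
    {"call_start": "2005-09-02 09:40", "signal_source": "transmitted_voice",
     "emotion": "joy", "has_recording": True},
    {"call_start": "2005-09-02 12:00", "signal_source": "received_voice",
     "emotion": "sorrow", "has_recording": False},
    {"call_start": "2005-09-03 08:00", "signal_source": "transmitted_voice",
     "emotion": "neutral", "has_recording": False},
]

def emotions_on(records, day):
    """The user's own emotions (signal source = transmitted voice) on the
    selected date, in chronological order for a Fig. 16(c)-style view."""
    selected = [r for r in records
                if r["signal_source"] == "transmitted_voice"
                and r["call_start"].startswith(day)]
    return sorted(selected,
                  key=lambda r: datetime.strptime(r["call_start"],
                                                  "%Y-%m-%d %H:%M"))

timeline = emotions_on(calls, "2005-09-02")
print([r["emotion"] for r in timeline])  # ['joy', 'anger']
```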
[0095] [Display format using the phone book]
Next, the process of displaying the phone book with the mobile phone of the second embodiment of the present invention will be described. The processing flow for displaying the phone book is the same as Fig. 14, except that the phone-book application 303 is started and executed instead of the call-history application 301 in steps S407 and S1404 described above. Fig. 17 shows a display example of the phone book on the mobile phone of the first embodiment of the present invention. Fig. 17(a) is a display example of the name and phone number of each individual registered in the phone book, and Fig. 17(b) is a display example of the emotion for each individual registered in the phone book.
[0096] When the display control unit 40 starts the phone-book application 303 upon receiving, from the mobile-phone user, an input operation via the operation keys provided on the mobile phone of the present invention (step S407, YES), it displays, for example as shown in Fig. 17(a), the name and phone number of each individual whose name begins with the "ma" row of the kana syllabary. Further, when the display control unit 40 receives from the user an input operation instructing it, via the operation keys, to display the emotions of the individuals being displayed, it extracts, from the information list stored by the emotion identification device 20 shown in Fig. 13, the per-call data for which the displayed individual is the target whose emotion is to be identified, that is, the data whose item "signal source" is "received voice" and whose item "called-party phone number" matches the phone number registered for that individual. In the information list stored by the emotion identification device 20 shown in Fig. 13, the called-party phone number "09000000000" matches the phone number of the displayed name "Taro Matsushita", and the called-party phone number "09000001111" matches the phone number of the displayed name "Jiro Matsushita"; the display control unit 40 therefore displays marks representing the extracted emotions "sorrow" and "affection" in the corresponding "Taro Matsushita" and "Jiro Matsushita" fields. Furthermore, if the item "presence/absence of recorded data" in the extracted per-call data is "present", the display control unit 40 displays "recorded voice: present" in the corresponding field, as shown in Fig. 17(b).
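The phone-book matching in paragraph [0096] can be sketched as a join between phone-book entries and per-call records on the phone number. The names and numbers follow the example in the text; the field names and record layout are assumptions for illustration:

```python
# Hypothetical phone-book entries and per-call records.
phone_book = {"Taro Matsushita": "09000000000",
              "Jiro Matsushita": "09000001111"}
calls = [
    {"signal_source": "received_voice", "phone_number": "09000000000",
     "emotion": "sorrow", "has_recording": True},
    {"signal_source": "received_voice", "phone_number": "09000001111",
     "emotion": "affection", "has_recording": False},
]

def phone_book_emotions(book, records):
    """Attach to each phone-book entry the emotion identified for the
    call partner (signal source = received voice) whose called-party
    phone number matches the registered number."""
    view = {}
    for name, number in book.items():
        for r in records:
            if (r["signal_source"] == "received_voice"
                    and r["phone_number"] == number):
                view[name] = {"emotion": r["emotion"],
                              "recorded_voice": r["has_recording"]}
    return view

view = phone_book_emotions(phone_book, calls)
print(view["Taro Matsushita"]["emotion"])  # sorrow
print(view["Jiro Matsushita"]["emotion"])  # affection
```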
[0097] [Playback of recorded voice]
The process by which the mobile phone of the second embodiment of the present invention plays back a recorded voice, after displaying the presence or absence of recorded voice as described in [Display format using the call history], [Display format using the schedule book], and [Display format using the phone book], will now be described.
[0098] When the user selects, with the operation keys provided on the mobile phone, a display location showing "recorded voice: present", and the mobile phone receives from the user an input operation instructing it to play back that recorded voice, the playback control unit 90 identifies, from the information list stored in the emotion information storage unit 204, which per-call data the selected display location was displayed from, and, using the item "call start time" of the identified per-call data, reads out the audio signal to be played back from among the audio signals stored in the audio storage unit 105 (the audio storage unit 105 stores each recorded audio signal in association with the call start time, which serves as its identification information). By outputting the read audio signal to the speaker 100, the playback control unit 90 can output the recorded voice from beginning to end.
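The lookup in paragraph [0098], in which the call start time serves as the shared identification key between the emotion information list and the audio storage, can be sketched as follows. The data structures are illustrative assumptions, not the actual storage units 204 and 105:

```python
# Hypothetical stores: the audio store keeps each recording keyed by the
# call start time, which doubles as the identification information.
emotion_list = [
    {"call_start": "2005-09-02 14:30", "emotion": "sorrow",
     "has_recording": True},
]
audio_store = {"2005-09-02 14:30": b"...recorded audio bytes..."}

def recording_for(selected_entry, audio):
    """Resolve a selected 'recorded voice: present' entry to its audio
    signal via the shared call-start-time key."""
    if not selected_entry["has_recording"]:
        return None
    return audio.get(selected_entry["call_start"])

signal = recording_for(emotion_list[0], audio_store)
print(signal is not None)  # True; this signal would go to the speaker
```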
[0099] According to the mobile phone of the second embodiment of the present invention, when the mobile-phone user wants to recall the contents of a past call, the phone not only displays the emotion that was identified in advance and symbolizes those contents, as in the first embodiment, but also displays, if the call voice was recorded, whether recorded data exists; by playing back that recorded data, the mobile-phone user can reliably recall the contents of the call.
[0100] Note that the mobile phone of the second embodiment of the present invention need not record every single call; it may be configured so that the audio signal of a call is recorded when the mobile-phone user performs a predetermined operation. When the emotion identified for each call, the information about that call, and the presence or absence of recorded voice data are displayed in a given display format, if there is no recorded voice data, "recorded voice: absent" is displayed in the corresponding location, as shown for example in Fig. 15(b) and Fig. 17(b).
[0101] [Details of methods for recording and playing back call voice]
The recording and playback of call voice described above simply assumed that the audio signal from the start of a call to its end is recorded and that the recorded signal is played back from beginning to end. Here, embodiments are described in which recording and playback of the call voice start from a point that satisfies a predetermined condition, so that the amount of audio data to be recorded can be reduced and the contents of a past call can be recalled from the call voice without spending time. Fig. 18 shows a processing flow (Example 1) for recording call voice with the mobile phone of the second embodiment of the present invention.
[0102] When the call device 10 starts a call (step S402, YES), it outputs the received voice signal and the transmitted voice signal to the emotion identification device 20 (step S403). For the target audio signal whose emotion is to be identified among the audio signals input from the call device 10, the emotion identification device 20 continuously estimates, at predetermined time intervals, the degree of emotion for each factor of the emotion information consisting of affection, joy, anger, sadness, and neutral (normal) (step S404). In the processing of step S404, if even one of the estimated emotion-information factors has a value exceeding a threshold (S1801, YES), the call device 10 starts recording the received voice signal and the transmitted voice signal (S1802; in this processing, the emotion estimation unit 201, which determines whether any estimated emotion-information factor has a value exceeding the threshold, outputs to the audio storage unit 105 a control signal that starts recording of the received and transmitted voice signals). The emotion identification device 20 repeats these processes until the call on the call device 10 ends (step S405, YES). For example, suppose the emotion identification device 20 continuously estimates, at predetermined time intervals from the start to the end of the call, each factor of the emotion information consisting of affection, joy, anger, sadness, and neutral (normal) shown in Fig. 2 for the received voice signal, and recording starts whenever some emotion-information factor is 2 or greater. Then, because the joy factor was estimated to be 2 in the interval from 10 to 15 seconds after the start of the call, the call device 10 starts recording the call voice from that interval until the end of the call (first recorded voice); because the anger factor was estimated to be 2 in the interval from 45 to 50 seconds after the start of the call, it starts recording from that interval until the end of the call (second recorded voice); and, further, because the sadness factor was estimated to be 2 in the interval from 50 to 55 seconds after the start of the call, it starts recording from that interval until the end of the call (third recorded voice).
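A minimal sketch of the threshold test of Example 1, using the interval estimates from the example above. The threshold value 2 and the interval offsets follow the text; the data layout and function name are assumptions:

```python
# Each entry: (interval start offset in seconds, {factor: estimated degree}).
# Offsets and degrees mirror the example: joy at 10 s, anger at 45 s,
# sadness at 50 s, each estimated at degree 2.
THRESHOLD = 2
estimates = [
    (10, {"joy": 2}),
    (45, {"anger": 2}),
    (50, {"sadness": 2}),
]

def recordings_started(per_interval, threshold=THRESHOLD):
    """Return (factor, start offset) for each recording opened mid-call:
    a new recording starts whenever a factor reaches the threshold."""
    started = []
    for t, factors in per_interval:
        for factor, degree in factors.items():
            if degree >= threshold:
                started.append((factor, t))
    return started

print(recordings_started(estimates))
# [('joy', 10), ('anger', 45), ('sadness', 50)], i.e. the first, second
# and third recorded voices, each running from its offset to call end
```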
[0103] After the call on the call device 10 ends, the emotion identification device 20 analyzes the values estimated for each factor during that call, identifies one characteristic factor from those values, and stores the emotion represented by that one factor as the emotion of the call partner or the mobile-phone user for that call (step S1402; at this time, as shown in Fig. 13, the information about the call and the presence or absence of recorded voice data are stored in association with the identified emotion). In addition, when the input of the audio signal ends, the call device 10 ends the recording of each call voice, and records, in association with identification information for identifying its recorded data, the recorded voice whose recording started after the factor identified in the processing of step S1402 was estimated to be 2 (for example, if the factor identified in step S1402 is joy, the first recorded voice) (S1803; in this processing, the emotion identification unit 203, which identified the one characteristic factor, notifies the audio storage unit 105 of that factor). Note that the call device 10 may delete the second and third recorded voices after step S1803. Alternatively, the call device 10 may keep the second and third recorded voices as the second and third candidates representing the emotion of this call and, in the processing of step S1803, record them in association with identification information for identifying their recorded data (in this case, when the presence or absence of recorded voice is displayed as described in [Display format using the call history], [Display format using the schedule book], and [Display format using the phone book], as many entries as there are recorded voices are displayed in order of priority, or only the number of recorded voices is displayed).
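The post-call selection of steps S1402 and S1803 can be sketched as follows. The patent only states that one characteristic factor is identified from the per-factor values; the aggregation rule used here (the factor with the highest estimated degree) and all names are assumptions for illustration:

```python
# Recordings opened mid-call (Example 1), keyed by the factor that
# triggered them, and the peak degree estimated per factor.
recordings = {"joy": "rec_1", "anger": "rec_2", "sadness": "rec_3"}
peaks = {"joy": 2, "anger": 2, "sadness": 2,
         "affection": 1, "neutral": 1}

# One characteristic factor for the whole call (assumed rule: highest
# peak; ties resolve to the first factor encountered).
characteristic = max(peaks, key=peaks.get)
kept = recordings.get(characteristic)        # this recording is stored
discarded = [r for f, r in recordings.items() if f != characteristic]

print(characteristic, kept)
```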
[0104] After this, when the presence or absence of recorded voice is displayed as described in [Display format using the call history], [Display format using the schedule book], and [Display format using the phone book], and the recorded voice is then played back, playback can start from the audio passage that was the main cause of identifying the emotion of the recorded voice. This provides an environment in which communication in past calls can easily be recalled without spending time after playback of the recorded voice begins. Moreover, since it is not always necessary to record the entire audio signal from the start of the call to its end, the amount of sound data to be recorded can be reduced.
[0105] Next, another embodiment for recording call voice will be described. Fig. 19 shows a processing flow (Example 2) for recording call voice with the mobile phone of the second embodiment of the present invention.
[0106] When the call device 10 starts a call (step S402, YES), it outputs the received voice signal and the transmitted voice signal to the emotion identification device 20 and also starts recording those audio signals (step S403). For the target audio signal whose emotion is to be identified among the audio signals input from the call device 10, the emotion identification device 20 continuously estimates, at predetermined time intervals, the degree of emotion for each factor of the emotion information consisting of affection, joy, anger, sadness, and neutral (normal) (step S404). In the processing of step S404, if even one of the estimated emotion-information factors has a value exceeding a threshold (step S1901, YES), the emotion identification device 20 stores that factor and information about the time at which the factor exceeded the threshold (for example, the elapsed time from the start of the call; it suffices to know at which point during the call the factor exceeded the threshold) (step S1902; in this processing, when the emotion estimation unit 201 estimates that some emotion-information factor has a value exceeding the threshold, the emotion accumulation unit 202 stores that factor and the time at which it exceeded the threshold). The emotion identification device 20 repeats these processes until the call on the call device 10 ends (step S405, YES). For example, suppose the emotion identification device 20 continuously estimates, at predetermined time intervals from the start to the end of the call, each factor of the emotion information consisting of affection, joy, anger, sadness, and neutral (normal) shown in Fig. 2 for the received voice signal, and the factor and the time at which it exceeded the threshold are stored whenever some emotion-information factor is 2 or greater. Then, because the joy factor was estimated to be 2 in the interval from 10 to 15 seconds after the start of the call, the device stores the joy factor and the time 10 seconds after the start of the call (first tag information); because the anger factor was estimated to be 2 in the interval from 45 to 50 seconds after the start of the call, it stores the anger factor and the time 45 seconds after the start of the call (second tag information); and, further, because the sadness factor was estimated to be 2 in the interval from 50 to 55 seconds after the start of the call, it stores the sadness factor and the time 50 seconds after the start of the call (third tag information).
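The tag collection of Example 2 can be sketched as follows. Unlike Example 1, the full call is recorded once and only lightweight (factor, elapsed-time) pairs are stored when a factor crosses the threshold. The values follow the example in the text; the data layout is assumed:

```python
# Each entry: (interval start offset in seconds, factor, estimated degree).
THRESHOLD = 2
estimates = [
    (10, "joy", 2),
    (30, "neutral", 1),
    (45, "anger", 2),
    (50, "sadness", 2),
]

def collect_tags(per_interval, threshold=THRESHOLD):
    """Tag information: which factor crossed the threshold, and the
    elapsed time from the start of the call at which it did so."""
    return [(factor, t) for t, factor, degree in per_interval
            if degree >= threshold]

tags = collect_tags(estimates)
print(tags)  # [('joy', 10), ('anger', 45), ('sadness', 50)]
```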
[0107] After the call on the call device 10 ends, the emotion identification device 20 analyzes the values estimated for each factor during that call, identifies one characteristic factor from those values, stores the emotion represented by that one factor as the emotion of the call partner or the mobile-phone user for that call, and further stores the information (tag information) about the times at which the identified factor exceeded the threshold (step S1903; at this time, in addition to the correspondence between the information about the call shown in Fig. 13, the presence or absence of recorded voice data, and the identified emotion, the tag information is also stored in association with them). In addition, when the input of the audio signal ends, the call device 10 ends the recording of the audio signal and records the recorded data in association with identification information for identifying that recorded data (step S1403).
[0108] After this, when the presence or absence of recorded voice is displayed as described in [Display format using the call history], [Display format using the schedule book], and [Display format using the phone book], and the recorded voice is then played back, playback can start from the audio passage identified by the tag information, providing an environment in which communication in past calls can easily be recalled without spending time after playback of the recorded voice begins. Moreover, since it is not always necessary to record the entire audio signal from the start of the call to its end, the amount of sound data to be recorded can be reduced. If multiple pieces of tag information exist for the identified factor, a screen prompting the user to select one of them is displayed after the presence or absence of recorded voice is displayed, and playback starts from the audio passage identified by the selected tag information. Furthermore, by starting playback of the recorded voice from a point a predetermined time before the audio passage identified by the tag information, the user will not miss the audio passage that was the main cause of identifying the emotion of the recorded voice.
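The pre-roll playback described above, starting a predetermined time before the tagged position, can be sketched as a clamped subtraction. The 5-second pre-roll is an assumed value, since the patent does not specify the predetermined time:

```python
# Assumed pre-roll: playback starts this many seconds before the tag.
PRE_ROLL_S = 5

def playback_start(tag_offset_s, pre_roll=PRE_ROLL_S):
    """Playback start point in seconds, clamped so that it never
    precedes the start of the recording."""
    return max(0, tag_offset_s - pre_roll)

print(playback_start(45))  # 40, five seconds before the 'anger' tag
print(playback_start(3))   # 0, clamped to the beginning of the recording
```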
[0109] Although the present invention has been described in detail and with reference to specific embodiments, it will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention.
[0110] This application is based on a Japanese patent application filed on December 16, 2005 (Japanese Patent Application No. 2005-363837), the contents of which are incorporated herein by reference.
Industrial applicability
[0111] The information processing terminal of the present invention has the effect of helping the user recall the contents of past calls and the bodies of e-mails without forcing the user to spend time and effort, and is therefore useful in the field of information processing terminals that can identify the emotion of the creator of an e-mail or the emotion of a speaker during a call from the character strings written in the e-mail or the voice during the call.

Claims

[1] An information processing terminal comprising: input means for inputting emotion-identification information composed of at least character data or voice; emotion identification means for identifying an emotion based on the emotion-identification information input by the input means; and display means for displaying information about the emotion identified by the emotion identification means.
[2] The information processing terminal according to claim 1, further comprising mail transmitting/receiving means for transmitting or receiving electronic mail, wherein the input means inputs character data, composed of at least a character string or a mark, contained in an electronic mail transmitted or received by the mail transmitting/receiving means; the emotion identification means identifies an emotion based on the character data input by the input means; and the display means displays the information, identified by the emotion identification means, about the emotion corresponding to the electronic mail transmitted or received by the mail transmitting/receiving means.
[3] The information processing terminal according to claim 2, wherein the display means displays, for each electronic mail in the order in which the electronic mails were transmitted or received by the mail transmitting/receiving means, at least the transmission time or reception time, the destination or sender, and the information about the emotion.
[4] The information processing terminal according to claim 2, wherein the display means displays, for each date, the information about the emotion corresponding to the electronic mails transmitted or received on that date by the mail transmitting/receiving means.
[5] The information processing terminal according to claim 2, wherein the display means displays, for each person registered in a phone-book function, the information about the emotion corresponding to the electronic mails transmitted to or received from that person by the mail transmitting/receiving means.
[6] The information processing terminal according to any one of claims 2, 4 and 5, wherein the display means displays any one of the pieces of information about the emotion identified for each of a plurality of electronic mails transmitted or received by the mail transmitting/receiving means.
[7] The information processing terminal according to claim 1, further comprising call means for performing a call, wherein the input means inputs call voice from the call means; the emotion identification means identifies an emotion based on the call voice input by the input means; and the display means displays the information, identified by the emotion identification means, about the emotion of the callee, the caller, or both the callee and the caller of a call performed by the call means.
[8] The information processing terminal according to claim 7, wherein the display means displays, for each call in the order in which calls were originated or received by the call means, at least the origination time or reception time, the callee or caller, and the information about the emotion.
[9] The information processing terminal according to claim 7, wherein the display means displays, for each date, the information about the emotion corresponding to the calls originated or received on that date by the call means.
[10] The information processing terminal according to claim 7, wherein the display means displays, for each person registered in a phone-book function, the information about the emotion of that person in calls performed by the call means.
[11] The information processing terminal according to any one of claims 7, 9 and 10, wherein the display means displays any one of the pieces of information about the emotion identified for each of a plurality of calls originated or received by the call means.
[12] The information processing terminal according to any one of claims 7 to 11, further comprising: call storage means for storing call voice from the call means; and call playback means for playing back the call voice stored in the call storage means, wherein the call playback means plays back, from among the call voices stored in the call storage means, the call voice from which the information about the emotion being displayed by the display means was identified.
[13] The information processing terminal according to claim 12, wherein the call storage means stores a portion of the call voice from the call means in which the emotion identified by the emotion identification means is reflected, and the call playback means plays back at least that portion of the call stored in the call storage means.
[14] The information processing terminal according to claim 13, wherein the call playback means plays back the portion, taking a point a preset time before the start of the portion as the playback start point.
[15] The information processing terminal according to any one of claims 1 to 14, wherein the display means displays the information about the emotion identified by the emotion identification means at least as a character representing the emotion or a mark representing the emotion.
PCT/JP2006/314521 2005-12-16 2006-07-21 Information processing terminal WO2007069361A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005363837 2005-12-16
JP2005-363837 2005-12-16

Publications (1)

Publication Number Publication Date
WO2007069361A1 true WO2007069361A1 (en) 2007-06-21

Family

ID=38162679

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/314521 WO2007069361A1 (en) 2005-12-16 2006-07-21 Information processing terminal

Country Status (1)

Country Link
WO (1) WO2007069361A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005135169A (en) * 2003-10-30 2005-05-26 Nec Corp Portable terminal and data processing method
JP2005152054A (en) * 2003-11-20 2005-06-16 Sony Corp Emotion calculating apparatus, emotion calculating method and portable communication apparatus
JP2005192024A (en) * 2003-12-26 2005-07-14 Fujitsu I-Network Systems Ltd Communication voice data management system in call center and operator terminal using the same
JP2005345496A (en) * 2004-05-31 2005-12-15 Nippon Telegr & Teleph Corp <Ntt> Speech processor, speech processing method and its program

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8299897B2 (en) 2008-01-24 2012-10-30 Sony Corporation Information processing apparatus, method, and program
JP2015096867A (en) * 2008-10-22 2015-05-21 グーグル・インコーポレーテッド Geocoding of personal information
US10055862B2 (en) 2008-10-22 2018-08-21 Google Llc Geocoding personal information
US10867419B2 (en) 2008-10-22 2020-12-15 Google Llc Geocoding personal information
US11704847B2 (en) 2008-10-22 2023-07-18 Google Llc Geocoding personal information
JP2011141651A (en) * 2010-01-06 2011-07-21 Nec System Technologies Ltd Electronic mail system, electronic mail receiving device, and display method
JP2013206389A (en) * 2012-03-29 2013-10-07 Fujitsu Ltd Intimacy calculation method, intimacy calculation program and intimacy calculation device
JP2022020659A (en) * 2017-08-08 2022-02-01 Line株式会社 Method and system for recognizing feeling during conversation, and utilizing recognized feeling

Similar Documents

Publication Publication Date Title
JP4226055B2 (en) Communication terminal device and program
KR100689396B1 (en) Apparatus and method of managing call history using speech recognition
JP3738383B2 (en) Communication device
US8515405B2 (en) Communication device
US20060183513A1 (en) Audio caller ID for mobile telephone headsets
WO2006028223A1 (en) Information processing terminal
WO2004070567A2 (en) Method for populating a caller’s information to a host-based address book
EP1170932B1 (en) Audible identification of caller and callee for mobile communication device
CN1926848A (en) Method and application for arranging a conference call in a cellular network and a mobile terminal operating in a cellular network
KR20070054070A (en) Portable terminal apparatus, computer-readable storage medium, and image display method
WO2007069361A1 (en) Information processing terminal
US20080096587A1 (en) Telephone for Sending Voice and Text Messages
JP5233287B2 (en) Mobile communication terminal
JP2004129174A (en) Information communication instrument, information communication program, and recording medium
WO2018061824A1 (en) Information processing device, information processing method, and program recording medium
JP2007019600A (en) Telephone set and call termination notice method
JP3877724B2 (en) Telephone
JP2003069662A (en) Telephone equipment and cordless telephone equipment
JP2003319087A (en) Communication apparatus
JP5218376B2 (en) Telephone device with call recording function that is easy to search
KR100645765B1 (en) Method for automatic updating a phone-call details
JP4490943B2 (en) Mobile phone
CN100490472C (en) Mobile communication terminal possessing when input voicememo display specify image function and method thereof
JP2007257238A (en) Telephone set
KR100738417B1 (en) Mobile communication terminal for providing buddy management function by using address book

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06781443

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP