JP2005151271A - Portable information device - Google Patents

Portable information device

Info

Publication number
JP2005151271A
JP2005151271A (application JP2003387422A)
Authority
JP
Japan
Prior art keywords
user
image information
information
image
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2003387422A
Other languages
Japanese (ja)
Other versions
JP4218830B2 (en)
Inventor
Hisashi Akiguchi
Takeshi Kokubo
Kazuhiro Kondo
Kana Matsuura
Hideaki Shoji
Naohiro Takahashi
Takamoto Tsuda
Hirotaka Yamazaki
Original Assignee
Sony Ericsson Mobilecommunications Japan Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Ericsson Mobilecommunications Japan Inc
Priority to JP2003387422A
Publication of JP2005151271A
Application granted
Publication of JP4218830B2
Application status: Expired - Fee Related
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/0279Improving the user comfort or ergonomics
    • H04M1/0283Improving the user comfort or ergonomics for providing a decorative aspect, e.g. customization of casings, exchangeable faceplate
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/02Constructional features of telephone sets
    • H04M1/22Illuminating; Arrangements for improving visibility of characters on dials
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/72Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M1/725Cordless telephones
    • H04M1/72519Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M1/72563Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status with means for adapting by the user the functionality or the communication capability of the terminal under specific circumstances
    • H04M1/72569Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status with means for adapting by the user the functionality or the communication capability of the terminal under specific circumstances according to context or environment related information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera

Abstract

PROBLEM TO BE SOLVED: To allow a portable terminal to be customized in a wide variety of ways without requiring the user to perform any deliberate customization operation.
An image is displayed on the monitor 2 of a mobile phone terminal 1 in accordance with the current state of the user, the user's surroundings, and the inside of the terminal. For example, if the color of the user's clothing photographed by the camera 7 is white, the monitor 2 accordingly displays white. Alternatively, if the user's pulse detected by the body condition parameter detection unit 8 is fast, the display blinks at a fast pace.
[Selection] Figure 1

Description

  The present invention relates to a portable terminal such as a cellular phone terminal, a PDA (Personal Digital Assistant), and a notebook personal computer.

  In recent years, mobile phone terminals have become widespread worldwide. However, the number of mobile phone terminal suppliers is limited, so there are not many variations of mobile phone terminals available. On the other hand, many users of mobile phone terminals dislike having the same terminal as others and want to "customize" their own terminal.

  Here, a mobile phone terminal can be customized, for example, by changing the ringtone, by changing the image displayed on the standby screen (hereinafter referred to as the standby image), or by changing the design of the housing.

  Customization by changing the ringtone or the standby image replaces the standard ringtone or standby image prepared in advance in the mobile phone terminal with another desired one. The ringtone and standby image are changed by downloading data from sites that provide various ringtone and standby image data, or by the user creating them through operation of the terminal.

  Customization by changing the design of the housing of a mobile phone terminal means, for example, pasting a sticker on the existing housing, painting the housing or drawing a picture on it, replacing some exterior parts of the housing with parts of a desired color or pattern, or replacing the housing itself with another housing of a different color or pattern. Replacement exterior parts and replacement housings are in many cases manufactured and sold not only by the manufacturer of the mobile phone terminal but also by third parties. In the case of customization by replacement of exterior parts, the exterior parts are detachable from the mobile phone terminal.

  In addition, Japanese Patent Laid-Open No. 2002-300240 (Patent Document 1) discloses a technique for enabling a screen display to the user's liking by switching the display color of the screen: a user operation changes part or all of the color information of the display color image data.

JP 2002-300240 (FIG. 3)

  However, customization by changing the ringtone or standby image as described above, customization by changing the design of the housing, and switching of the screen display color by user operation as described in the above publication all require the user to perform an operation for the customization.

  Since these customization operations are time-consuming and troublesome, users tend to neglect them even though they own a customizable mobile phone terminal, and as a result end up always carrying an unchanging, uncustomized terminal.

  The present invention has been made in view of the above problems, and an object thereof is to provide a portable terminal that can be customized in various ways every day without the user specially performing any operation for customization.

  The portable terminal of the present invention includes a display unit capable of displaying an image, a user-related information detecting unit that detects user-related information relating to the user of the terminal, an image information output unit that outputs image information corresponding to the user-related information, and an image control unit that changes the display image of the display unit based on the image information output from the image information output unit.

  Here, the user-related information detection means includes at least one (or two or more) of: a basic color detection unit that detects a base color from data obtained by photographing the user's clothes or the scene around the user with a camera unit; a body condition parameter detection unit that detects parameters that vary with the user's physical and psychological state; a time detection unit that measures time; a schedule detection unit that reads schedule information from a schedule information holding unit that holds the user's schedule; and a position detection unit that detects the position of the terminal.
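As a rough sketch only (the patent specifies no implementation, and every name below is hypothetical), the detector outputs listed above could be gathered into a single user-related information record, with any detector the user has disabled simply left empty:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class UserRelatedInfo:
    """One snapshot of the detector outputs described above.

    Any field may be None when that detector is disabled or unavailable.
    """
    base_color: Optional[str] = None          # dominant color of clothing/scenery (camera unit)
    pulse_bpm: Optional[int] = None           # body condition parameter (pulse sensor)
    body_temp_c: Optional[float] = None       # body condition parameter (temperature sensor)
    time_of_day: Optional[str] = None         # e.g. "morning" / "daytime" / "night" (time detection)
    schedule: Optional[str] = None            # current entry from the schedule information holder
    position: Optional[Tuple[float, float]] = None  # (latitude, longitude) from position detection

# Example snapshot: only three detectors produced readings.
info = UserRelatedInfo(base_color="white", pulse_bpm=95, time_of_day="morning")
active = {k: v for k, v in info.__dict__.items() if v is not None}
```

Collecting only the non-empty readings mirrors the claim's "any one ... or at least two or more" wording: the image control downstream can act on whatever subset is present.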

  That is, according to the present invention, the display image of the display unit is changed in accordance with user-related information relating to the user of the terminal, so the user does not need to perform any particular instruction operation to change the display image.

  In the present invention, since the display image is changed according to user-related information, a portable terminal that is customized in various ways every day can be realized without any special customization operation.

  Hereinafter, a portable terminal according to an embodiment of the present invention will be described with reference to the drawings. In the present embodiment, a mobile phone terminal will be described as an example of a mobile terminal.

[Appearance of mobile phone terminal]
FIG. 1 shows an example of an external configuration of a mobile phone terminal according to an embodiment to which the present invention is applied.

  The mobile phone terminal 1 is provided with a monitor 2, an operation unit 3, a speaker 4, a microphone (microphone) 5, an antenna 6, a camera 7, and a body condition parameter detection unit 8 described later.

  The monitor 2 is a liquid crystal panel, for example, and can display characters, images, and the like on the screen of the liquid crystal panel. In the case of the embodiment of the present invention, the monitor 2 performs image display according to user-related information described later and image display according to the internal state of the mobile phone terminal 1.

  The operation unit 3 includes a numeric keypad, a jog dial, and the like, and is operated by the user when inputting a call destination number or mail character.

  For example, the speaker 4 outputs a received voice during a call, and the microphone 5 collects a transmitted voice during the call.

  The antenna 6 is for performing radio wave communication with the base station.

  The camera 7 includes a lens system and an imaging device such as a CCD (Charge Coupled Device), and can capture, through the lens system, images of the user's surroundings, such as the user's clothes and the place where the user is.

  As an example, the body condition parameter detection unit 8 is a body temperature and pulse sensor provided at a position touched by the user's fingers when the user holds the mobile phone terminal 1, and measures, via the fingers, physical condition parameters such as body temperature and pulse that reflect the user's physical and psychological condition. The mobile phone terminal 1 of the present embodiment is an ordinary terminal as shown in FIG. 1, but may also be, for example, a wristwatch-type terminal; in that case the body condition parameter detection unit 8 is provided at the portion of the wristwatch-type terminal in contact with the user's arm. The body condition parameter detection unit 8 may also detect the user's brain waves, for example by detecting weak electromagnetic waves emitted from the human body.

  In addition to these functions, the cellular phone terminal 1 of the present embodiment has built-in communication antennas for GPS (Global Positioning System), Bluetooth (trademark), and the like, and can collect position information by GPS and communicate with other electronic devices via Bluetooth or the like. The mobile phone terminal 1 also includes an acceleration sensor.

[Internal configuration of mobile phone terminal]
FIG. 2 shows an internal configuration of the mobile phone terminal 1 of the present embodiment.

  The control unit 11 is, for example, a CPU that operates based on a program controlling the entire mobile phone terminal 1, and controls each function via the bus 10 according to the program. From the program currently being executed, the control unit 11 can also recognize which application the user is running on the mobile phone terminal 1. For example, when the mail application is active, the user is performing a mail operation; when the camera application is active, the user is taking photographs; when the schedule application is running, the user is editing the schedule; when the call application or data transmission/reception application is running, the user is making a call or transmitting/receiving data; and when the charging application is operating, the terminal is charging.
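The application-to-activity inference described above can be sketched as a simple lookup. The application identifiers and activity labels below are invented for illustration; the patent does not define any names:

```python
# Hypothetical mapping from the running application to the user activity
# the control unit infers from it (all identifiers are illustrative).
APP_TO_ACTIVITY = {
    "mail": "performing a mail operation",
    "camera": "taking photographs",
    "schedule": "editing the schedule",
    "call": "making a call",
    "data_transfer": "transmitting/receiving data",
    "charging": "charging",
}

def infer_activity(running_app: str) -> str:
    """Return the inferred activity, or 'idle' for an unrecognized app."""
    return APP_TO_ACTIVITY.get(running_app, "idle")
```

This is the simplest possible realization; the actual control unit presumably inspects its own program state rather than string identifiers.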

  The communication circuit 12 communicates with the base station through the antenna 6. In addition, since the cellular phone terminal 1 of the present embodiment can connect to the Internet or the like through the antenna 6 and the communication circuit 12, it can obtain from a desired site on the Internet information such as what kinds of stores exist around the user's current location, as well as weather forecast information.

  The display control unit 13 performs display control of the monitor 2.

  The operation signal generation unit 14 generates an operation signal corresponding to the operation of the operation unit 3 and sends the operation signal to the control unit 11.

  The memory 15 is, for example, a rewritable memory that retains its contents, and holds phone book data, schedule management software, the program that operates the control unit 11, and the like.

  The data holding unit 16 is a content-retaining rewritable memory similar to the memory 15. It records table data indicating the correspondence between display images of the monitor 2 and the user's own current state, the state around the user, or the internal state of the terminal, each detected as user-related information described later, as well as the user's schedule and various types of image data, such as weather forecast information acquired via the Internet. The image data is not only pre-installed but may also be downloaded from a server or the like as necessary. The data holding unit 16 also stores map data for determining the current location from the GPS position data. The data holding unit 16 may be included in the memory 15.

  The speaker drive circuit 17 drives the speaker 4 and outputs a call sound, a ring tone, and the like.

  The microphone input signal generator 18 amplifies the sound collected by the microphone 5 and performs analog / digital conversion to generate digital sound data.

  The short-range communication circuit 19 performs short-range wireless communication such as Bluetooth, infrared communication (IrDA), or UWB (Ultra Wide Band), communicating with other electronic devices as necessary via the antenna 19a (an infrared transmitter/receiver in the case of infrared communication). Using the short-range communication circuit 19, the mobile phone terminal of the present embodiment can transmit and receive information indicating the user's own current state, the state around the user, or the internal state of the terminal, detected as user-related information, to and from electronic devices such as other mobile phone terminals.

  The sensor detection signal generation unit 20 performs analog/digital conversion on the detection signal from the sensor 25 and sends the digital signal to the control unit 11 via the bus 10. As the sensor 25, there are the body temperature/pulse sensor constituting the body condition parameter detection unit 8 described above, an air temperature sensor, an acceleration sensor, and the like; a plurality of these can be provided as long as installation space permits. Note that the mobile phone terminal 1 of the present embodiment can also detect whether the user is standing still, walking, running, and so on, based on the detection signal from the acceleration sensor.

  The camera drive circuit 21 drives the camera 7, receives an imaging signal from the camera 7, and sends it to the control unit 11 via the bus 10. The control unit 11 can then analyze the imaging signal of the camera 7 to recognize, for example, what color of clothes the user is wearing and where the user is, and can also recognize the current weather and time of day from the brightness and hue of the photographed image.

  The GPS reception circuit 22 receives signals from GPS satellites via the antenna 22a and sends the GPS reception signal to the control unit 11. The control unit 11 recognizes the latitude/longitude of the current position from the GPS reception signal, and from the map data held in the data holding unit 16 and that latitude/longitude can determine the name of the region, country, city, town, or the like where the user (that is, the mobile phone terminal 1) is located. From the GPS reception signal, the moving speed, and the map data, the mobile phone terminal 1 can also determine in which direction the user is moving and whether the user is walking, running, riding a bicycle, or riding in a car, train, airplane, ship, or the like.

  The timer 23 outputs a time count value and sends it to the control unit 11 via the bus 10. The control unit 11 can thereby detect the current time and time zone (for example, morning, noon, or night). From the time count value of the timer 23, the control unit 11 can also determine the elapsed time from when some processing or operation was performed in the past to the present.

  In addition, although not shown, the mobile phone terminal according to the present embodiment also has a function of playing back music data downloaded via, for example, the Internet or music data held in a memory card or the like.

[Image Display Control of the Embodiment of the Present Invention]
Next, specific image display control by the cellular phone terminal 1 according to the embodiment of the present invention will be described.

  FIG. 3 shows a flowchart of processing on the control unit 11 side for executing image display control of the mobile phone terminal 1. The process shown in this flowchart is looped at regular time intervals.

  In FIG. 3, first, in step S1, the control unit 11 detects the user's own state based on the detection signals from the body condition parameter detection unit 8 and the acceleration sensor included in the sensor 25, the time count value from the timer 23, the imaging signal from the camera 7, the GPS reception signal from the GPS reception circuit 22, and the like. Here, the user's own state means, for example, the user's body temperature, whether the user's pulse is fast or slow, how fast the user is moving, and what color or pattern of clothes the user is wearing. That is, the control unit 11 recognizes the user's body temperature and pulse from the detection signals of the body condition parameter detection unit 8, extracts the base color of the clothing from the captured data when the user's clothing is photographed by the camera 7 or recognizes the clothing's pattern by pattern recognition, and detects the user's moving speed from the output of the acceleration sensor and the GPS reception signal from the GPS reception circuit 22. Also in step S1, the control unit 11 detects, as part of the user's own state, the user's current schedule from the schedule data held in the data holding unit 16. Note that the user can set which of these states are detected, or how many of them; the setting data is also held in the data holding unit 16. After step S1, the control unit 11 advances the process to step S2.

  When the process proceeds to step S2, the control unit 11 detects the state around the user based on the temperature sensor included in the sensor 25, the weather forecast data acquired in advance and held in the data holding unit 16, the time count value from the timer 23, the imaging signal from the camera 7, the GPS reception signal from the GPS reception circuit 22, the input audio signal from the microphone 5, and the like. Here, the state around the user means the current time (or time zone), the user's current position (not only latitude and longitude but also the country, city, town, region, and so on), the townscape and scenery around the user, the sound environment around the user, and the like. That is, the control unit 11 detects the outside air temperature with the temperature sensor of the sensor 25, recognizes the current time from the time count value of the timer 23, and recognizes the day's weather from the weather forecast data read out of the data holding unit 16 based on that time count value, or from an imaging signal obtained when, for example, the sky is photographed by the camera 7. Further, the control unit 11 recognizes the user's current position from the GPS reception signal of the GPS reception circuit 22, recognizes surrounding scenes, streets, and landscapes photographed by the camera 7 along with their base colors, and recognizes the ambient environmental sounds from the input audio signal of the microphone 5. Note that the user can set which of these surrounding states are detected, or how many of them; the setting data is also held in the data holding unit 16. After step S2, the control unit 11 advances the process to step S3.

  When the process proceeds to step S3, the control unit 11 detects the internal state of the terminal according to which application is activated and being executed. Here, the internal state of the terminal means, for example, whether a call is in progress, a mail operation is being performed, or the camera is taking a picture. That is, since the control unit 11 can recognize the application being executed, it recognizes, for example, that a mail operation is being performed while the mail application is running, and that a call is in progress while the call application is running. Note that the user can set which of the internal states of the terminal are detected, or how many of them; the setting data is also held in the data holding unit 16. After step S3, the control unit 11 advances the process to step S4.

  In step S4, the control unit 11 determines whether to change the display content. That is, in step S4, the control unit 11 uses the elapsed time since the previous display control, the user's own state, the state around the user, and the internal state of the terminal detected in the preceding steps S1 to S3, together with the difference between each state used in the previous display control and the current state, and determines whether to change the display content depending on, for example, whether the elapsed time has exceeded a predetermined time or whether the change in each state is large. In step S4, the control unit 11 decides to change the display content of the monitor 2 when, for example, the elapsed time is equal to or longer than the predetermined time or the change in each state is large. If it is determined that the display content is to be changed, the control unit 11 advances the process to step S5; otherwise, it ends the process of this flowchart.
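A minimal sketch of the step S4 decision, assuming two invented thresholds (the patent says only "a predetermined time" and "the change in each state is large", so the concrete values below are placeholders):

```python
def should_update(elapsed_s, prev_state, cur_state,
                  min_interval_s=600, change_threshold=1):
    """Step S4 (sketch): redraw when enough time has passed since the
    previous display control, or when enough detected states differ from
    the ones used last time. Thresholds are illustrative assumptions."""
    if elapsed_s >= min_interval_s:
        return True
    changed = sum(1 for k in cur_state
                  if prev_state.get(k) != cur_state.get(k))
    return changed >= change_threshold
```

With these defaults, a redraw happens at least every ten minutes, or sooner whenever any tracked state changes; a real terminal might weight states differently.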

  When the process proceeds to step S5, the control unit 11 refers to the correspondence table held in the data holding unit 16, generates a display image based on that table and the user's own state, the state around the user, and the internal state of the terminal acquired in steps S1 to S3, and displays the image on the monitor 2.
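Step S5's table lookup might be sketched as follows; the nested table layout and the fallback to a default standby image are assumptions, not specified by the patent:

```python
def render_display(correspondence_table, observed_states, default="standby"):
    """Step S5 (sketch): look each observed state up in the correspondence
    table held by the data holding unit and collect the display attributes
    to apply to the monitor 2. Table contents are illustrative only."""
    display = {}
    for state_kind, value in observed_states.items():
        mapping = correspondence_table.get(state_kind, {})
        if value in mapping:
            display[state_kind] = mapping[value]
    return display or {"image": default}

# One illustrative entry, matching the FIG. 6 example in the text.
table = {"clothing_color": {"white": {"background": "white"}}}
attrs = render_display(table, {"clothing_color": "white"})
```

Unmatched states simply contribute nothing, so the same loop handles whichever subset of detectors the user enabled in steps S1 to S3.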

[Specific example of monitor display control according to the status]
Here, in the mobile phone terminal of the embodiment of the present invention, the control unit 11 controls the screen display of the monitor 2 based on the user's own state, the state around the user, the internal state of the terminal, and correspondence tables such as those exemplified below. These are all merely examples.

  FIG. 4 shows an example in which the display screen of the monitor 2 is divided into an image display area 2a and a background display area 2b, and FIG. 5 shows an example in which it is divided into image display areas 2a and 2c and a background display area 2b. The image display area 2a in FIGS. 4 and 5 is the portion where an image corresponding to the application being executed is displayed: for example, a mail operation image is displayed while the mail application is running, a schedule book image while the schedule application is running, and the photographed image while the camera application is running. Note that the size of each display area in FIGS. 4 and 5 can be changed as appropriate.

  First, a specific example in which display control of the image display areas 2a and 2c, the background display area 2b, and the like in FIGS. 4 and 5 is performed according to the current user's own state will be described.

  FIG. 6 shows an example of a correspondence table between the color system obtained by the control unit 11 analyzing an image of the clothes the user is wearing, taken by the camera 7, and the color displayed on the screen of the monitor 2.

  Here, when the correspondence table of FIG. 6 is a correspondence table between the color system of the user's clothes and the background display area 2b of the monitor 2 shown in FIGS. 4 and 5, the control unit 11 performs screen display control such that, for example, if the clothing color system is "white", the background display area 2b on the screen of the monitor 2 is set to "white"; similarly, if the clothing color system is "black", the background display area 2b is set to "black", and if it is "blue", the background display area 2b is set to "blue".

  When the correspondence table in FIG. 6 is instead, for example, a correspondence table between the color system of the user's clothes and the taste (color and pattern) of the image displayed in the image display area 2a of the monitor 2 shown in FIGS. 4 and 5, the control unit 11 changes the taste of the image displayed in the image display area 2a according to the color system of the clothes the user is wearing.
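Restricted to the three entries the text actually names, the FIG. 6 mapping could look like this (the fallback color for an unlisted color system is an assumption):

```python
# FIG. 6 as a lookup table; only the entries named in the text are shown,
# and the real table presumably covers more color systems.
CLOTHING_TO_BACKGROUND = {
    "white": "white",
    "black": "black",
    "blue": "blue",
}

def background_for_clothing(color_system: str, default: str = "white") -> str:
    """Return the background color for area 2b given the clothing color
    system; the default for unknown systems is an illustrative choice."""
    return CLOTHING_TO_BACKGROUND.get(color_system, default)
```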

  FIG. 7 shows an example of a correspondence table between a user's current pulse detected by the body condition parameter detection unit 8 and an image displayed on the screen of the monitor 2.

  Here, when the correspondence table in FIG. 7 is a correspondence table between the user's pulse and the background display area 2b of the monitor 2 shown in FIGS. 4 and 5, the control unit 11 blinks the background display area 2b at "high speed" when the user's pulse is "fast", for example more than 90 beats/minute; at "medium speed" when the pulse is "medium", for example 65-90 beats/minute; and at "low speed" when the pulse is "slow", for example 65 beats/minute or less. Note that the display control of the background display area 2b may, besides blinking speed, control the speed of transition to a different color.

  In addition, when the correspondence table in FIG. 7 is a correspondence table between the user's pulse and the image display area 2a of the monitor 2 shown in FIGS. 4 and 5, the control unit 11 changes the taste of the image display area 2a at "high speed" when the user's pulse is "fast", at "medium speed" when the pulse is "medium", and at "low speed" when the pulse is "slow".

For example, when a standby screen is displayed in the image display area 2a, the standby image is changed to another image at a speed corresponding to the user's pulse; when a moving image in which a character moves is displayed in the image display area 2a, the speed of the moving image can be controlled according to the user's pulse.

  Furthermore, when the correspondence table in FIG. 7 is a correspondence table between the user's pulse and, for example, the expansion and contraction of the bar graph displayed in the image display area 2c of FIG. 5, the control unit 11 expands and contracts the bar graph of the image display area 2c at "high speed" when the user's pulse is "fast", at "medium speed" when the pulse is "medium", and at "low speed" when the pulse is "slow".

  In the example of FIG. 7, the user's pulse is taken as an example, but display control similar to that described above can also be performed in a correspondence table between the moving speed of the user obtained from an acceleration sensor or GPS and an image.
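Using the pulse thresholds stated for FIG. 7 (above 90 beats/minute fast, 65-90 medium, 65 or below slow), the blink-speed selection can be sketched as:

```python
def blink_speed(pulse_bpm: int) -> str:
    """FIG. 7 thresholds as given in the text: more than 90/min blinks the
    display at high speed, 65-90 at medium speed, 65 or less at low speed."""
    if pulse_bpm > 90:
        return "high"
    if pulse_bpm > 65:
        return "medium"
    return "low"
```

The same three-way split would drive the taste-change speed of area 2a, the bar-graph speed of area 2c, or, as the text notes, a moving-speed table fed by the acceleration sensor or GPS instead of the pulse.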

  FIG. 8 shows an example of a correspondence table between a user schedule held in the data holding unit 16 and an image displayed on the screen of the monitor 2.

  When the correspondence table in FIG. 8 is a correspondence table between the user's schedule and the image display area 2a of the monitor 2 shown in FIGS. 4 and 5, the control unit 11 displays, for example, an "image at work" in the image display area 2a when the user's schedule is "work"; similarly, an "image of a pub" when the schedule is "drinking party", an "image of a meeting" when the schedule is "meeting", and a "travel image" when the schedule is "travel". The image displayed in the image display area 2a may be an image taken by the camera 7 or an image prepared in advance.

  Next, a specific example of controlling the display of the image display areas 2a and 2c, the background display area 2b, and the like in FIGS. 4 and 5 according to the current state around the user will be described.

  FIG. 9 shows an example of a correspondence table between the time zone determined by the control unit 11 based on the time count value from the timer 23 and the color displayed on the screen of the monitor 2.

  When the correspondence table in FIG. 9 maps the time zone to the background display area 2b of the monitor 2 shown in FIGS. 4 and 5, the control unit 11 sets the background display area 2b to "white" during the "morning" time zone of, for example, 3:00 to 10:00. Similarly, the background display area 2b is set to "yellow" during the "daytime" zone of, for example, 10:00 to 16:00, and to "dark blue" during the "night" zone of, for example, 16:00 to 3:00 the next day.
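The time-zone classification above can be sketched as a simple range check; note that the "night" zone wraps past midnight. The boundaries and colors follow the text, while the function name is an assumption:

```python
from datetime import time

# Sketch of the Fig. 9 time-zone-to-background-color mapping:
# morning 3:00-10:00 -> white, daytime 10:00-16:00 -> yellow,
# night 16:00-3:00 (next day) -> dark blue.
def background_color(now: time) -> str:
    if time(3, 0) <= now < time(10, 0):
        return "white"      # morning
    if time(10, 0) <= now < time(16, 0):
        return "yellow"     # daytime
    return "dark blue"      # night, wrapping past midnight
```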

  Further, when the correspondence table in FIG. 9 maps the time zone to the tint (color shade) of the image displayed in the image display area 2a of the monitor 2 shown in FIG. 4 or FIG. 5, the control unit 11 changes the tint of the image displayed in the image display area 2a according to the time zone.
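One way such a tint change could be realized is to scale each RGB channel of the image by a per-zone factor. This is a hypothetical sketch; the patent does not specify how the tint is applied, and all factor values below are illustrative assumptions:

```python
# Hypothetical per-time-zone tint factors applied to (r, g, b) pixels.
TINT_FACTORS = {
    "morning": (1.0, 1.0, 1.0),   # neutral
    "daytime": (1.0, 1.0, 0.8),   # warm, slightly yellow
    "night":   (0.6, 0.6, 1.0),   # cool, bluish
}

def tint_pixel(rgb: tuple, zone: str) -> tuple:
    """Scale one pixel by the zone's tint factors, clamped to 255."""
    fr, fg, fb = TINT_FACTORS.get(zone, (1.0, 1.0, 1.0))
    r, g, b = rgb
    return (min(255, int(r * fr)),
            min(255, int(g * fg)),
            min(255, int(b * fb)))
```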

  FIG. 10 shows an example of a correspondence table between the "country" determined by the control unit 11 based on GPS latitude and longitude information and an image displayed on the screen of the monitor 2.

  When the correspondence table in FIG. 10 maps "country" to the background display area 2b of the monitor 2 shown in FIGS. 4 and 5, the control unit 11 displays a "star pattern" image in the background display area 2b when the "country" is, for example, the "United States". Similarly, the background display area 2b is set to "red" when the "country" is the "People's Republic of China", and to a "red and white pattern" when the "country" is "Japan".

  Further, when the correspondence table in FIG. 10 maps "country" to the tint of the image displayed in the image display area 2a of the monitor 2 shown in FIG. 4 or FIG. 5, the control unit 11 changes the tint of the image displayed in the image display area 2a according to the "country".

  FIG. 11 shows an example of a correspondence table between the weather on the current date and time, determined by the control unit 11 based on weather forecast information held in the data holding unit 16 and the time count value from the timer 23, and an image displayed on the screen of the monitor 2.

  When the correspondence table in FIG. 11 maps "weather" to the background display area 2b of the monitor 2 shown in FIG. 4 or FIG. 5, the control unit 11 sets the background display area 2b to "yellow" when the "weather" is, for example, "sunny". Similarly, the background display area 2b is set to "gray" when the "weather" is "cloudy", and to "white" when the "weather" is "snow".

  Further, when the correspondence table in FIG. 11 maps "weather" to the tint of the image displayed in the image display area 2a of the monitor 2 shown in FIG. 4 or FIG. 5, the control unit 11 changes the tint of the image displayed in the image display area 2a according to the "weather".

  Next, a specific example in which display control of the image display areas 2a and 2c, the background display area 2b, and the like in FIGS. 4 and 5 is performed according to the current state of the terminal will be described.

  FIG. 12 shows an example of a correspondence table between applications being executed by the control unit 11 and colors displayed on the screen of the monitor 2.

  When the correspondence table in FIG. 12 maps the running application to the tint of the image display area 2a of the monitor 2 shown in FIG. 4 or FIG. 5, the control unit 11 sets the tint of the image display area 2a to, for example, "green" for one application. Similarly, the tint of the image display area 2a is set to, for example, "red" when the application is the "call" application, and to "blue" when the application is the "camera" application.

[Summary]
As described above, the mobile phone terminal of the embodiment of the present invention switches the image display on the monitor 2 according to states such as the color of the user's clothing, the user's schedule, and the application running on the terminal, so the display can be customized in many variations according to the situation without any deliberate operation by the user.

  In the above example, an image corresponding to each state is displayed according to a correspondence table stored in advance in the data holding unit 16, but the user may change the setting by selecting from a plurality of tables according to his or her preference, or may use an image taken by the camera 7. Furthermore, an image downloaded from another terminal device via the communication means may be displayed.

  In the description of FIGS. 6 to 12 above, each correspondence table corresponds one-to-one to the user's own state, the state around the user, or the state inside the terminal, but the screen display of the monitor 2 can of course also be controlled by combining states and correspondence tables. For example, combining FIG. 8 and FIG. 10, if the schedule in FIG. 8 is "travel" and the travel destination "country" in FIG. 10 is the "United States", a "travel image" is displayed in the image display area 2a and a "star pattern" is displayed in the background display area 2b. Since listing every combination of states and correspondence tables would be tedious, their description is omitted.
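The travel-plus-country combination above can be sketched as resolving two tables independently and assigning each result to its own display area. The table contents mirror the text of FIGS. 8 and 10; the function name and the fallback values are illustrative assumptions:

```python
# Sketch of combining the schedule table (Fig. 8) and the country
# table (Fig. 10): the schedule drives image area 2a, the country
# drives background area 2b.
SCHEDULE_TO_IMAGE = {
    "work": "image at work",
    "drinking party": "image of a pub",
    "meeting": "image during meeting",
    "travel": "travel image",
}

COUNTRY_TO_BACKGROUND = {
    "United States": "star pattern",
    "People's Republic of China": "red",
    "Japan": "red and white pattern",
}

def compose_display(schedule: str, country: str) -> dict:
    """Resolve both tables independently and combine the results."""
    return {
        "2a": SCHEDULE_TO_IMAGE.get(schedule, "standby image"),
        "2b": COUNTRY_TO_BACKGROUND.get(country, "plain background"),
    }
```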

  Furthermore, although the correspondence tables in FIGS. 6 to 12 each prepare one image per state, a plurality of images may be displayed selectively. For example, by preparing a plurality of images for the case where the pulse is fast and switching among them in turn each time, a different image can be displayed on each occasion. Alternatively, only an image chosen by the user's selection setting may be displayed.

  The correspondence table used for each state, and the correspondence between states and images within each table, can also be changed as appropriate, for example by feedback from the user, or prioritized by user settings.
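The user-settable prioritization mentioned above could be realized by resolving candidate images in the order of a user-configurable priority list, so the highest-ranked state that produced a candidate wins. This is a sketch under assumptions; the patent does not prescribe a resolution algorithm, and all names below are illustrative:

```python
# Hypothetical prioritized resolution when several correspondence
# tables apply at once: candidates maps a state type (e.g. "schedule",
# "weather") to the image that state's table selected, if any.
def resolve_by_priority(candidates: dict, priority: list,
                        default: str = "standby image") -> str:
    """Return the image from the highest-priority state that has one."""
    for state in priority:
        image = candidates.get(state)
        if image is not None:
            return image
    return default
```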

  Although the above embodiment changes the image display of the monitor 2 according to each state, a so-called electronic paper capable of color display may, for example, be provided on the casing of the mobile phone terminal and its display changed in the same way. Various types of electronic paper are known, such as the microcapsule electrophoresis type, twist ball type, dichroic dye/liquid crystal type, liquid crystal/organic photoreceptor composite film type, and toner display type; the electronic paper used in the mobile phone terminal of this embodiment may be any of these.

  Although the above embodiment changes the displayed image according to each state, it is also possible, for example, to change the ring tone and other notification sounds according to each state, to change the tempo, rhythm, tone color, pitch, volume, and so on during music playback, and to switch the music being played according to each state.

  Finally, the above embodiment is merely one example of the present invention. The present invention is therefore not limited to this embodiment, and it goes without saying that various modifications according to design and the like are possible without departing from the technical idea of the present invention.

  This embodiment is applicable not only to mobile phone terminals but also to various portable electronic devices such as PDAs, digital television receivers, remote controllers for car navigation systems, and computer mice.

FIG. 1 is a diagram showing an example of the external configuration of a mobile phone terminal according to an embodiment of the present invention.
FIG. 2 is a block diagram showing the internal schematic configuration of the mobile phone terminal of the embodiment.
FIG. 3 is a flowchart showing the processing by which the control unit of the mobile phone terminal of the embodiment performs image control.
FIG. 4 is a diagram showing an example of the monitor screen, used to describe the display areas whose image display is controlled by the control unit.
FIG. 5 is a diagram showing another example of the monitor screen, used to describe the display areas whose image display is controlled by the control unit.
FIG. 6 is a diagram showing an example of a correspondence table between the color family of the user's clothes and the color displayed on the monitor screen, as an example of the user's own current state.
FIG. 7 is a diagram showing an example of a correspondence table between the user's pulse and the image displayed on the monitor screen, as an example of the user's own current state.
FIG. 8 is a diagram showing an example of a correspondence table between the user's schedule and the image displayed on the monitor screen, as an example of the user's own current state.
FIG. 9 is a diagram showing an example of a correspondence table between the time zone and the image displayed on the monitor screen, as an example of the current state around the user.
FIG. 10 is a diagram showing an example of a correspondence table between the country where the user is located and the image displayed on the monitor screen, as an example of the current state around the user.
FIG. 11 is a diagram showing an example of a correspondence table between the weather and the image displayed on the monitor screen, as an example of the current state around the user.
FIG. 12 is a diagram showing an example of a correspondence table between the running application and the image displayed on the monitor screen, as an example of the current state inside the terminal.

Explanation of symbols

DESCRIPTION OF SYMBOLS 1 Mobile phone terminal, 2 Monitor, 2a, 2c Image display area, 2b Background display area, 3 Operation part, 4 Speaker, 5 Microphone, 6 Antenna, 7 Camera, 8 Body condition parameter detection part, 11 Control part, 12 Communication circuit, 13 Display control unit, 14 Operation signal generation unit, 15 Memory, 16 Data holding unit, 17 Speaker drive circuit, 18 Microphone input signal generation unit, 19 Short range communication circuit, 20 Sensor detection signal generation unit, 21 Camera drive circuit, 22 GPS receiver circuit, 23 Timer, 25 Sensor

Claims (7)

  1. A portable terminal comprising:
    display means capable of displaying an image;
    user-related information detecting means for detecting user-related information related to the user of the terminal;
    image information output means for outputting image information corresponding to the user-related information detected by the user-related information detecting means; and
    image control means for changing the display image on the display means according to the image information output by the image information output means.
  2. The portable terminal according to claim 1, wherein
    the user-related information detecting means includes a camera unit that captures an image and a base color detection unit that detects a base color from captured data obtained by photographing the user's clothes or the scene around the user with the camera unit, and detects information on the base color detected by the base color detection unit as the user-related information, and
    the image information output means includes an image information holding unit that holds a plurality of pieces of image information, and selectively extracts and outputs the image information corresponding to the base color detected by the base color detection unit from the plurality of pieces of image information held in the image information holding unit.
  3. The portable terminal according to claim 1, wherein
    the user-related information detecting means includes a body condition parameter detection unit that detects parameters that vary with the user's physical and psychological state, and detects the parameters detected by the body condition parameter detection unit as the user-related information, and
    the image information output means includes an image information holding unit that holds a plurality of pieces of image information, and selectively extracts and outputs the image information corresponding to the parameters detected by the body condition parameter detection unit from the plurality of pieces of image information held in the image information holding unit.
  4. The portable terminal according to claim 1, wherein
    the user-related information detecting means includes a time detection unit that measures time, and detects the time information measured by the time detection unit as the user-related information, and
    the image information output means includes an image information holding unit that holds a plurality of pieces of image information, and selectively extracts and outputs the image information corresponding to the time information measured by the time detection unit from the plurality of pieces of image information held in the image information holding unit.
  5. The portable terminal according to claim 1, wherein
    the user-related information detecting means includes a schedule information holding unit that holds the user's schedule information, a time detection unit that measures time, and a schedule detection unit that reads schedule information from the schedule information holding unit according to the time measured by the time detection unit, and detects the schedule information read by the schedule detection unit as the user-related information, and
    the image information output means includes an image information holding unit that holds a plurality of pieces of image information, and selectively extracts and outputs the image information corresponding to the schedule information detected by the schedule detection unit from the plurality of pieces of image information held in the image information holding unit.
  6. The portable terminal according to claim 1, wherein
    the user-related information detecting means includes a position detection unit that detects the position of the terminal, and detects information on the position detected by the position detection unit as the user-related information, and
    the image information output means includes an image information holding unit that holds a plurality of pieces of image information, and selectively extracts and outputs the image information corresponding to the position information detected by the position detection unit from the plurality of pieces of image information held in the image information holding unit.
  7. The portable terminal according to claim 1, wherein
    the user-related information detecting means has at least two of the following detection units: a base color detection unit that detects a base color from captured data obtained by photographing the user's clothes or the scene around the user with a camera unit that captures images; a body condition parameter detection unit that detects parameters that vary with the user's physical and psychological state; a time detection unit that measures time; a schedule detection unit that reads schedule information from a schedule information holding unit holding the user's schedule information; and a position detection unit that detects the position of the terminal, and
    the image information output means includes an image information holding unit that holds a plurality of pieces of image information, and selectively extracts and outputs the image information corresponding to the information detected by the at least two detection units from the plurality of pieces of image information held in the image information holding unit.
JP2003387422A 2003-11-18 2003-11-18 Portable information device Expired - Fee Related JP4218830B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003387422A JP4218830B2 (en) 2003-11-18 2003-11-18 Portable information device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2003387422A JP4218830B2 (en) 2003-11-18 2003-11-18 Portable information device

Publications (2)

Publication Number Publication Date
JP2005151271A true JP2005151271A (en) 2005-06-09
JP4218830B2 JP4218830B2 (en) 2009-02-04

Family

ID=34694778

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003387422A Expired - Fee Related JP4218830B2 (en) 2003-11-18 2003-11-18 Portable information device

Country Status (1)

Country Link
JP (1) JP4218830B2 (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006304112A (en) * 2005-04-22 2006-11-02 Xing Inc Portable electrical apparatus
JP2007047594A (en) * 2005-08-11 2007-02-22 Sharp Corp Electronic equipment
WO2007043500A1 (en) * 2005-10-07 2007-04-19 Toyota Jidosha Kabushiki Kaisha Hybrid automobile and method of controlling the same
WO2007072352A2 (en) * 2005-12-22 2007-06-28 Koninklijke Philips Electronics N. V. Chameleon glasses
JP2007265274A (en) * 2006-03-29 2007-10-11 Sendai Foundation For Applied Information Sciences Physiology adaptive display device
JP2009169357A (en) * 2008-01-21 2009-07-30 Institute Of National Colleges Of Technology Japan Image display controller and image display control method
JP2009230363A (en) * 2008-03-21 2009-10-08 Sony Corp Display unit and display method therefor
JP2010033175A (en) * 2008-07-25 2010-02-12 Toshiba Corp Portable electronic device
EP2180675A1 (en) * 2008-10-27 2010-04-28 Lg Electronics Inc. Mobile terminal
JP2011118917A (en) * 2011-01-19 2011-06-16 Sharp Corp Electronic device, operation method of the same and program
JP2011523486A (en) * 2008-05-27 2011-08-11 クゥアルコム・インコーポレイテッドQualcomm Incorporated Method and system for automatically updating avatar status to indicate user status
JP2011158794A (en) * 2010-02-03 2011-08-18 Casio Computer Co Ltd Image display device and program
JP2011164243A (en) * 2010-02-08 2011-08-25 Casio Computer Co Ltd Display processing device and program
JP2012529709A (en) * 2009-06-09 2012-11-22 スキフ・エルエルシー Event tracking for electronic paper display
JP2013105319A (en) * 2011-11-14 2013-05-30 Sony Corp Identification apparatus, control apparatus, identification method, program, and identification system
JP2013140574A (en) * 2011-12-07 2013-07-18 Nikon Corp Electronic apparatus, information processing method, and program
US8521230B2 (en) 2008-06-27 2013-08-27 Kyocera Corporation Mobile telephone
JP2015176184A (en) * 2014-03-13 2015-10-05 株式会社三菱東京Ufj銀行 Handheld terminal and information provision system
JP2016105272A (en) * 2015-11-11 2016-06-09 ソニー株式会社 Identifying device, control device, identifying method, program and identifying system
JP2017097380A (en) * 2017-02-16 2017-06-01 カシオ計算機株式会社 Output terminal device, communication terminal device, and program
JP2018534709A (en) * 2015-10-28 2018-11-22 ハンコム インコーポレイテッド Smart watch whose display color changes according to the user's condition

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4520897B2 (en) * 2005-04-22 2010-08-11 株式会社エクシング Portable electrical equipment
JP2006304112A (en) * 2005-04-22 2006-11-02 Xing Inc Portable electrical apparatus
JP4668726B2 (en) * 2005-08-11 2011-04-13 シャープ株式会社 Electronics
JP2007047594A (en) * 2005-08-11 2007-02-22 Sharp Corp Electronic equipment
WO2007043500A1 (en) * 2005-10-07 2007-04-19 Toyota Jidosha Kabushiki Kaisha Hybrid automobile and method of controlling the same
WO2007072352A2 (en) * 2005-12-22 2007-06-28 Koninklijke Philips Electronics N. V. Chameleon glasses
WO2007072352A3 (en) * 2005-12-22 2007-10-18 Koninkl Philips Electronics Nv Chameleon glasses
JP2007265274A (en) * 2006-03-29 2007-10-11 Sendai Foundation For Applied Information Sciences Physiology adaptive display device
JP2009169357A (en) * 2008-01-21 2009-07-30 Institute Of National Colleges Of Technology Japan Image display controller and image display control method
JP2009230363A (en) * 2008-03-21 2009-10-08 Sony Corp Display unit and display method therefor
JP2014059894A (en) * 2008-05-27 2014-04-03 Qualcomm Incorporated Method and system for automatically updating avatar status to indicate user's status
JP2011523486A (en) * 2008-05-27 2011-08-11 クゥアルコム・インコーポレイテッドQualcomm Incorporated Method and system for automatically updating avatar status to indicate user status
US8521230B2 (en) 2008-06-27 2013-08-27 Kyocera Corporation Mobile telephone
JP2010033175A (en) * 2008-07-25 2010-02-12 Toshiba Corp Portable electronic device
US8280363B2 (en) 2008-10-27 2012-10-02 Lg Electronics Inc. Mobile terminal and method of controlling visual appearance thereof
EP2180675A1 (en) * 2008-10-27 2010-04-28 Lg Electronics Inc. Mobile terminal
JP2012529709A (en) * 2009-06-09 2012-11-22 スキフ・エルエルシー Event tracking for electronic paper display
JP2011158794A (en) * 2010-02-03 2011-08-18 Casio Computer Co Ltd Image display device and program
JP2011164243A (en) * 2010-02-08 2011-08-25 Casio Computer Co Ltd Display processing device and program
US8847974B2 (en) 2010-02-08 2014-09-30 Casio Computer Co., Ltd. Display processing apparatus
JP2011118917A (en) * 2011-01-19 2011-06-16 Sharp Corp Electronic device, operation method of the same and program
JP2013105319A (en) * 2011-11-14 2013-05-30 Sony Corp Identification apparatus, control apparatus, identification method, program, and identification system
US9953128B2 (en) 2011-11-14 2018-04-24 Sony Corporation Identification apparatus, control apparatus, identification method, program, and identification system
JP2013140574A (en) * 2011-12-07 2013-07-18 Nikon Corp Electronic apparatus, information processing method, and program
JP2015176184A (en) * 2014-03-13 2015-10-05 株式会社三菱東京Ufj銀行 Handheld terminal and information provision system
JP2018534709A (en) * 2015-10-28 2018-11-22 ハンコム インコーポレイテッド Smart watch whose display color changes according to the user's condition
JP2016105272A (en) * 2015-11-11 2016-06-09 ソニー株式会社 Identifying device, control device, identifying method, program and identifying system
JP2017097380A (en) * 2017-02-16 2017-06-01 カシオ計算機株式会社 Output terminal device, communication terminal device, and program

Also Published As

Publication number Publication date
JP4218830B2 (en) 2009-02-04

Similar Documents

Publication Publication Date Title
US7555141B2 (en) Video phone
US8761840B2 (en) Methods, devices and computer program products for operating mobile devices responsive to user input through movement thereof
CN1905711B (en) Cellular phone
US8433244B2 (en) Orientation based control of mobile device
US8581954B2 (en) Mobile communication terminal that delivers vibration information, and method thereof
KR100478108B1 (en) Fold type communication terminal device and the display control method
JP2009010712A (en) Mobile electronic equipment and its display method
CN102783136B (en) For taking the imaging device of self-portrait images
US20050157174A1 (en) Folding communication terminal apparatus
US20070286596A1 (en) Method and system for adjusting camera settings in a camera equipped mobile radio terminal
CN1319356C (en) Device and method for using a rotating key and controlling a display in a mobile terminal
US20040239792A1 (en) Image display apparatus and image display method
US7190968B2 (en) Portable device displaying an image provided by photographing
US7634299B2 (en) Communication terminal apparatus, method of changing function and/or setting of communication terminal apparatus, and program
KR20110019861A (en) Method for display configuration and apparatus using the same
JP2007123962A (en) Portable terminal device, mouse application program, and method of using portable terminal device as wireless mouse device
US8564710B2 (en) Photographing apparatus and photographing method for displaying information related to a subject
JP2009134409A (en) Reminder device, reminder method, reminder program, and portable terminal device
US20090227283A1 (en) Electronic device
US8514313B2 (en) Imaging device and method for switching mode of imaging device
US20060171360A1 (en) Apparatus and method for displaying data using afterimage effect in mobile communication terminal
JP2003161792A (en) Electronic apparatus and control method thereof
US8401593B2 (en) Enabling speaker phone mode of a portable voice communications device having a built-in camera
JP5032798B2 (en) Information providing apparatus, information providing system, and information providing method
RU2414087C2 (en) Method and apparatus for menu navigation in mobile communication device

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20061023

RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20071009

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20080225

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20080227

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20080424

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20081105

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20081106

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20111121

Year of fee payment: 3

LAPS Cancellation because of no payment of annual fees