JP4479962B2 - Input processing program, portable terminal device, and input processing method - Google Patents


Info

Publication number
JP4479962B2
Authority
JP
Japan
Prior art keywords
means
icon
display
user
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2005051876A
Other languages
Japanese (ja)
Other versions
JP2006236143A5 (en)
JP2006236143A (en)
Inventor
Motonobu Sano
Tetsuya Okuda
Kazuhiro Kuroda
Original Assignee
Sony Ericsson Mobile Communications AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Ericsson Mobile Communications AB
Priority to JP2005051876A priority Critical patent/JP4479962B2/en
Publication of JP2006236143A publication Critical patent/JP2006236143A/en
Publication of JP2006236143A5 publication Critical patent/JP2006236143A5/ja
Application granted
Publication of JP4479962B2 publication Critical patent/JP4479962B2/en
Application status: Active
Anticipated expiration


Description

  The present invention relates to an input processing program, a portable terminal device, and an input processing method suitable for use in, for example, a mobile phone, a PHS (Personal Handyphone System) phone, a PDA (Personal Digital Assistant) device, or a notebook personal computer. In particular, it relates to an input processing program, a portable terminal device, and an input processing method in which icons near an input aid such as a finger or a stylus pen are enlarged and/or divided into a plurality of icons, enabling intuitive input operations and reducing the number of operations required for input.

  In recent years, many functions have been implemented in mobile phones. In order to use these functions, the user selects and uses a desired function from the menu of the mobile phone. This menu is often in the form of an icon, and the user operates a cross key or the like to select a desired icon.

  As background art concerning icon selection, Japanese Patent Laid-Open No. 2000-242383 (Patent Document 1) discloses a screen display enlargement control device whose purpose is to enlarge an icon when the cursor is placed on it on the screen of the display unit, without giving the user a feeling of strangeness.

  In the technique disclosed in Patent Document 1, the icons to be enlarged when the cursor moves onto them are selected and registered in advance, and the enlargement ratio at which each such icon is to be displayed is also set and registered in advance.

  When the enlarged display control unit detects that the cursor is positioned on an icon, it checks the enlarged display ON/OFF information registered in the enlarged display setting unit to determine whether enlarged display is enabled for that icon. When enlarged display is registered as "ON" for the icon under the cursor, the icon is enlarged and displayed at the enlargement ratio specified by the user and registered as enlargement ratio information in the enlarged display setting unit.

  As a result, only the icon desired by the user can be enlarged and displayed at an enlargement rate desired by the user, so that the icon can be enlarged and displayed without giving the user a feeling of strangeness.

JP 2000-242383 A (page 4 to page 5: FIG. 3)

  However, while more and more functions are mounted on mobile phones year by year, the number of cross keys and numeric keys provided on the phones does not change. This means that as the number of functions mounted on a mobile phone increases, so does the number of functions assigned to each key. For this reason, with the method of selecting a desired function using the cross key or numeric keypad, it has become difficult to perform intuitive operations now that the number of functions assigned to one key keeps increasing.

  Similarly, when inputting characters on a mobile phone, it is common to select and input a desired character from among a plurality of characters assigned to a numeric key. Specifically, for example, the characters "あ", "い", "う", "え", "お", "ぁ", "ぃ", "ぅ", "ぇ", and "ぉ" are assigned to the "1" key. When the "1" key is pressed once, the character "あ" is input; when it is pressed four times, the character "え" is input. Further, when the "1" key is pressed seven times, the character "ぃ" is input, and when it is pressed ten times, the character "ぉ" is input.

  That is, when inputting a desired character, the key must be pressed the number of times corresponding to that character. For example, to input the character "ぉ" in the above example, the "1" key must be pressed ten times.
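The multi-tap scheme described above can be sketched in a few lines. The code below is only an illustration of the press-count-to-character mapping in the example; the function name and key layout are assumptions for illustration, not taken from any actual handset firmware:

```python
# Minimal sketch of the multi-tap input scheme described above.
# The character sequence for the "1" key follows the example in the text:
# あ い う え お, then the small forms ぁ ぃ ぅ ぇ ぉ.
KEY_1_CHARS = ["あ", "い", "う", "え", "お", "ぁ", "ぃ", "ぅ", "ぇ", "ぉ"]

def multi_tap(chars, presses):
    """Return the character selected after `presses` presses of a key.

    Pressing cycles through the assigned characters, wrapping around
    when the end of the list is reached.
    """
    if presses < 1:
        raise ValueError("at least one press is required")
    return chars[(presses - 1) % len(chars)]
```

With this mapping, one press yields "あ", four presses yield "え", and ten presses yield "ぉ", matching the example above.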

  When the number of characters (and functions) assigned to one key increases in this way, there is a problem that a correspondingly large number of input operations is required to select a desired character (or function).

  The present invention has been made in view of the above-described problems, and its object is to provide an input processing program, a portable terminal device, and an input processing method that allow a desired function or character to be selected intuitively and with a small number of operations even when a plurality of functions or characters are assigned to one key.

In order to solve the above-described problem, an input processing program according to the present invention causes a computer to function as:
Icon display means for arranging and displaying one or a plurality of icons on the display means based on icon display information;
Proximity determination means for determining, based on the detection output from distance detection means for detecting the distance between the display surface of the display means and selection means that selects an icon by touching the icon displayed on the display means, whether or not the distance between the selection means and the display surface of the display means is equal to or less than a predetermined distance;
Proximity icon detection means which, when the proximity determination means determines that the distance between the selection means and the display surface of the display means is equal to or less than the predetermined distance, detects the icon displayed at the position on the display surface that the selection means is approaching, based on the detection output from position detection means for detecting the position on the display surface that the selection means is approaching and the display information of the icons displayed on the display means by the icon display means; and
Icon display change means which, when a plurality of functions or a plurality of characters or symbols are assigned to the icon approached by the selection means as detected by the proximity icon detection means, divides the icon into icons corresponding to the respective functions assigned to it or the respective assigned characters and symbols, and enlarges and displays the divided icons on the display means.

Moreover, in order to solve the above-described problem, a portable terminal device according to the present invention includes:
Icon display means for arranging and displaying one or a plurality of icons on the display means based on icon display information;
Distance detection means for detecting the distance between the display surface of the display means and selection means that selects an icon by touching the icon displayed on the display means;
Proximity determination means for determining whether or not the distance between the selection means and the display surface of the display means is equal to or less than a predetermined distance based on the detection output from the distance detection means;
Position detection means for detecting, when the proximity determination means determines that the distance between the selection means and the display surface of the display means is equal to or less than the predetermined distance, the position on the display surface of the display means that the selection means is approaching;
Proximity icon detection means for detecting the icon displayed at the position on the display surface that the selection means is approaching, based on the detection output from the position detection means and the display information of the icons displayed on the display means by the icon display means; and
Icon display change means which, when a plurality of functions or a plurality of characters or symbols are assigned to the icon approached by the selection means as detected by the proximity icon detection means, divides the icon approached by the selection means into icons corresponding to the respective functions assigned to the icon or the respective assigned characters and symbols, and enlarges and displays the divided icons on the display means.

Furthermore, an input processing method according to the present invention includes:
a step in which icon display means arranges and displays one or a plurality of icons on the display means based on the display information of the icons;
a step in which proximity determination means determines, based on the detection output from distance detection means for detecting the distance between the display surface of the display means and selection means that selects an icon by touching the icon displayed on the display means, whether or not the distance between the selection means and the display surface of the display means is equal to or less than a predetermined distance;
a step in which, when the proximity determination means determines that the distance between the selection means and the display surface of the display means is equal to or less than the predetermined distance, proximity icon detection means detects the icon displayed at the position on the display surface that the selection means is approaching, based on the detection output from position detection means for detecting the position on the display surface that the selection means is approaching and the display information of the icons displayed on the display means by the icon display means; and
a step in which, when a plurality of functions or a plurality of characters and symbols are assigned to the icon approached by the selection means as detected in the above step, icon display change means divides the icon approached by the selection means into icons corresponding to the respective functions assigned to the icon or the respective assigned characters and symbols, and enlarges and displays the divided icons on the display means.

In the present invention, icons each assigned a plurality of functions or a plurality of characters and symbols are displayed on the display means. When the selection means approaches, icons corresponding to the respective functions, characters, or symbols assigned to an icon are formed, and these icons are enlarged and displayed in a divided manner. As a result, even if the display surface of the display means is small, this small display surface is used effectively so that the user can easily select a desired function, character, or symbol. In addition, erroneous input is prevented because the icon corresponding to each function, character, or symbol is enlarged.
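The divided, enlarged display can be illustrated with a small sketch. Everything below (the `Icon` structure, its field names, and the fixed scale factor) is a hypothetical illustration of the idea, not the patented implementation:

```python
from dataclasses import dataclass

# Hypothetical sketch of the divided, enlarged display described above.
# An icon carries the functions or characters assigned to it; when the
# selection means (finger or stylus) approaches, the icon is replaced by
# one enlarged sub-icon per assignment.
@dataclass
class Icon:
    label: str
    assignments: list  # functions, characters, or symbols assigned to this icon
    width: int
    height: int

def split_and_enlarge(icon, scale=2):
    """Divide an icon into one enlarged sub-icon per assignment.

    Each sub-icon carries exactly one assignment and is drawn `scale`
    times larger than the original icon in each dimension.
    """
    return [Icon(label=a, assignments=[a],
                 width=icon.width * scale, height=icon.height * scale)
            for a in icon.assignments]
```

For example, a 20x20 "1" key icon with five assigned characters would be replaced by five 40x40 sub-icons, one per character, each large enough to touch reliably.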

  In the present invention, when the distance between the selection means and the display surface of the display means becomes equal to or less than a predetermined distance, the icon displayed at the position on the display means that the selection means is approaching can be enlarged and displayed.

  In the case of a device whose display means has a small screen, the user needs to select a desired icon from among a plurality of icons displayed on that small screen. If, at the time of this selection, an icon displayed on the display means is touched and selected with the selection means, a problem arises in that, depending on the size of the selection means itself, an icon other than the desired icon may accidentally be touched and selected.

  However, in the case of the present invention, since the icon approaching the selection means is displayed in an enlarged manner, the user can almost certainly touch the icon close to the selection means. For this reason, even when a large number of icons are displayed on the display unit having a small size, it is possible to reliably select the icon desired by the user.

  Further, since the display position of each icon is the same every time, the user comes to remember the display position of each icon. Since the user selects a desired icon by touching its display position, the desired icon can be selected intuitively.

  Further, in the present invention, icons each assigned a plurality of functions or a plurality of characters and symbols are displayed on the display means; when the selection means approaches, icons corresponding to the respective functions, characters, or symbols assigned to an icon can be formed, and each of these icons can be enlarged and displayed in a divided manner.

  Thereby, even when the size of the display surface of the display means is small, it is possible to make it easier for the user to select a desired function, character or symbol by effectively using this small display surface. In addition, since an icon corresponding to each function, character, or symbol is enlarged and displayed, erroneous input can be prevented.

  Also, before the display surface of the display means is touched by the selection means, the icon that the selection means is expected to touch is divided, and the icons for the individual functions, characters, or symbols assigned to it are displayed as a list. From among the icons displayed in this list, the function actually designated for execution, or the character or symbol actually designated for input, is accepted when the selection means touches it. For this reason, the input of a desired character or symbol can be completed with only one touch operation, and the operations needed to select a desired function or to input a desired character or symbol can be greatly reduced.

  The present invention can be applied to a mobile phone.

[Appearance structure of mobile phone]
FIG. 1(a) is a perspective view of the mobile phone according to the first embodiment of the present invention as seen from the front side, and FIG. 1(b) is a perspective view of the same mobile phone as seen from the back side.

  As shown in FIGS. 1(a) and 1(b), this mobile phone is a so-called foldable mobile phone in which an upper housing 1 and a lower housing 2 are rotatably connected via a hinge portion 3.

  The lower housing 2 is provided with a key input unit 4 on its inner surface (when the phone is folded) for performing predetermined operations such as telephone number input and call operations. The keys provided in the key input unit 4 include numeric keys "0" to "9", symbol keys such as the * key and # key, a key for performing outgoing call operations, keys for setting various functions, and a shooting key 6 for performing shooting operations with the camera unit 5 provided on the back side of the phone.

  In addition, a rotation operation unit 7, which can be rotated and pressed and is used for editing mobile mail and for various other input operations, is provided on one side surface of the lower housing 2.

  Further, a microphone unit 8 that converts the user's call voice or the like into an electric signal is provided near the lower end of the lower housing 2 (near the end opposite to the end provided with the hinge portion 3).

  On the other hand, the upper housing 1 is provided, near the hinge portion 3 on the inner surface when the mobile phone is folded, with a clear key 9 used for clearing (deleting) an input telephone number or characters when editing mobile mail, and a character type change key 10 for changing the type of characters input when editing mail or the address book. By operating the character type change key 10, the type of characters to be input is switched, for example, kana characters → numbers → English characters.

  Further, the upper housing 1 is provided with a whip antenna 11 at the end opposite to the end provided with the hinge portion 3, and with a speaker unit 12 for producing sound output such as ringtones.

  The upper housing 1 is provided with a main display unit 13 for displaying various information, images taken by the camera unit 5, images sent from the other party, and the like. A so-called touch panel 20 is provided on the surface of the main display unit 13; when the user's finger or the like touches it, the control unit of the mobile phone (the control unit 32 shown in FIG. 2) can detect which position on the main display unit 13 the finger touched.

  The upper housing 1 is provided with two proximity detection camera units 14 and 15 for detecting that a finger or the like has approached the main display unit 13. As will be described later, in the mobile phone according to this embodiment, the position on the main display unit 13 that the user's finger is approaching is detected based on the images captured by the proximity detection camera units 14 and 15. At the timing when the approach of the user's finger is detected, the icons and characters corresponding to the position on the main display unit 13 that the finger is approaching are displayed enlarged (and divided).

  In addition, a camera unit 5 used by the user of the mobile phone to photograph a desired subject, such as a surrounding landscape or a person, is provided on the back surface of the upper housing 1. Also on the back surface of the upper housing 1, at approximately the center of the outer surface when the mobile phone is folded, a sub display unit 16 smaller than the main display unit 13 is provided for displaying various information, images taken by the camera unit 5, images sent from the other party, and the like.

Further, an earphone jack 17 is provided on the same side surface of the upper housing 1 as the side on which the rotation operation unit 7 is provided. By attaching an earphone device to the earphone jack 17, sound output can be obtained through the earphone device instead of from the speaker unit 12.

[Electric configuration of mobile phone]
FIG. 2 is a block diagram of the mobile phone according to this embodiment. As shown in FIG. 2, this mobile phone includes a ROM 31 (Read Only Memory) that stores a communication processing program (communication program) for performing communication control such as telephone calls and mobile mail, an input processing program for performing the enlarged display or divided display of icons and characters corresponding to the position of the user's finger when it is detected that the user's finger is close to the touch panel 20, and the device ID of the mobile phone; and a control unit 32 that controls the communication processing, display processing, input processing, and the like of the mobile phone based on the programs stored in the ROM 31.

  In addition, the mobile phone includes a communication circuit 33 that wirelessly transmits and receives image data, audio data, mobile mail data, and the like via the antenna 11, and a display unit 34 corresponding to the main display unit 13 and the sub display unit 16. Of the display unit 34, the main display unit 13 is provided with the touch panel 20 on its surface side, as described above.

  Also, this mobile phone has an operation unit 35 provided with keys such as the numeric keys "0" to "9", symbol keys such as the * key and # key, a key for performing outgoing call operations, keys for setting various functions, the shooting key 6, the rotation operation unit 7, the clear key 9, and the character type change key 10; and a readable and writable memory 36 for storing various data such as telephone book data, images, mail data, and response messages.

  In addition, the mobile phone includes the speaker unit 12 for producing sound output such as call voice and ringtones, the microphone unit 8 for converting the transmission voice of the user of the mobile phone into an electric signal, and a camera control unit 37 that performs imaging control of the camera unit 5 provided on the back side of the mobile phone.

  In addition, the mobile phone includes a position detection unit 38 that detects, based on the images captured by the two proximity detection camera units 14 and 15, for example how close the user's finger is to the main display unit 13, and a touch panel control unit 39 that controls the touch panel 20 provided on the surface of the main display unit 13.

Each of these blocks is connected to the others by a control line 40 for transmitting control data from the control unit 32 and a data line 41 for transmitting data such as image data, voice data, and mobile mail data, and all the blocks operate on power supplied from a power supply circuit (not shown).

[Icon enlargement processing]
The mobile phone according to this embodiment detects, based on the images of the main display unit 13 captured by the two proximity detection camera units 14 and 15, how close the user's finger is to the main display unit 13 and the position on the main display unit 13 that the finger is approaching, and thereby detects which of the icons displayed on the main display unit 13 the user is about to select.

  Until it is detected that the user's finger is close to the main display unit 13, icons of, for example, the same size are displayed on the main display unit 13 at equal intervals. At the timing when it is detected that the user's finger is close to the main display unit 13, the icon corresponding to the position the finger is approaching is enlarged and displayed, making the user aware of which icon is about to be selected.

  When the enlarged icon is an icon that the user desires to select, the user touches the display area of the icon. When detecting that an enlarged icon is touched by the user via the touch panel 20, the control unit 32 executes and controls an application function corresponding to the icon.

  The flowchart of FIG. 3 shows the flow of this icon enlargement processing. The processing shown in this flowchart is executed by the control unit 32 based on the input processing program stored in the ROM 31 shown in FIG. 2. When the control unit 32 displays icons corresponding to the application functions of the mobile phone on the main display unit 13 as shown in FIG. 4(a), based on the display information for each icon stored in the ROM 31 (indicating the display position, display form, display size, and the like of each icon), the processing of the flowchart of FIG. 3 based on this input processing program is started.

  FIG. 4(a) shows an example in which a "camera" icon for designating activation of the camera function, a "mail" icon for designating creation and reading of mobile mail, a "Web" icon for designating a network connection to a Web site provided on a predetermined network such as the mobile carrier's communication network or the Internet, and the like are displayed on the main display unit 13.

  When the control unit 32 has displayed the icons on the main display unit 13 in this way, it controls the activation of the proximity detection camera units 14 and 15, and starts capturing the detection outputs of the position detection unit 38, namely the "distance between the user's finger and the main display unit 13" and the "position on the main display unit 13 that the user's finger is approaching", which are detected based on the images captured by the proximity detection camera units 14 and 15.

  Step S1 in the flowchart of FIG. 3 is a process in which the control unit 32 monitors, based on the detection output from the position detection unit 38, whether or not the user's finger has approached the main display unit 13. The control unit 32 repeatedly executes step S1 to continuously monitor whether the user's finger is approaching, and proceeds to step S2 at the timing when it captures a detection output from the position detection unit 38 indicating that the user's finger has approached the main display unit 13.

  In step S2, the control unit 32 starts monitoring the relative positional relationship between the user's finger and the main display unit 13 based on the detection output of the "distance between the user's finger and the main display unit 13" detected by the position detection unit 38, and the process proceeds to step S3.

  In step S3, the control unit 32 compares the detection output of the "distance between the user's finger and the main display unit 13" detected by the position detection unit 38 with a threshold value stored in advance, and determines whether or not the distance between the user's finger and the main display unit 13 is equal to or less than a predetermined distance α (= the threshold value). Until it is detected that this distance is equal to or less than the predetermined distance α, the process returns to step S2 and the determination is performed continuously; at the timing when it is detected that the distance is equal to or less than the predetermined distance α, the process proceeds to step S4.

  In step S4, since the distance between the user's finger and the main display unit 13 is equal to or less than the predetermined distance α, the control unit 32 detects, based on the detection output of the "position on the main display unit 13 that the user's finger is approaching" detected by the position detection unit 38, which of the icons currently displayed on the main display unit 13 the user's finger is approaching. The icon close to the user's finger is then enlarged, as shown in FIG. 4(b). FIG. 4(b) shows an example in which the "mail" icon for designating creation, reading, and the like of mobile mail is enlarged.
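The flow of steps S1 through S4 can be sketched as a simple polling loop. The detector interface, the icon `contains` method, and the value of α below are assumptions made for illustration; the actual device derives these readings from the proximity detection cameras:

```python
# Illustrative sketch of steps S1-S4 (not the patented implementation).
# `detector` stands in for the position detection unit 38: each call is
# assumed to return (distance, (x, y)) for the finger, or None when no
# finger has been detected yet.
ALPHA = 30.0  # predetermined distance α, in millimetres (assumed value)

def icon_enlargement_loop(detector, icons, enlarge):
    """Poll the detector until the finger is within α of the display,
    then enlarge the icon at the finger's position (step S4)."""
    while True:
        reading = detector()          # S1: monitor for an approaching finger
        if reading is None:
            continue
        distance, (x, y) = reading    # S2: track the relative position
        if distance > ALPHA:          # S3: compare against the threshold α
            continue
        for icon in icons:            # S4: find and enlarge the nearby icon
            if icon.contains(x, y):
                enlarge(icon)
                return icon
        return None                   # finger near the display but not over any icon
```

The loop mirrors the flowchart: steps S1 through S3 repeat until the finger crosses the α threshold, and only then is the icon under the finger enlarged.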

  The processing in steps S1 to S4 will now be described in more detail. The proximity detection camera units 14 and 15 are provided close to the upper right corner and the upper left corner of the main display unit 13, in an arrangement modeled on the human left and right eyes, and capture images from those two corners. From camera position information indicating the positions at which the camera units 14 and 15 are provided, and from the position of the user's finger appearing in each image captured by the camera units 14 and 15, the position detection unit 38 calculates the position of the finger as the main display unit 13 is viewed from above, and the distance between the user's finger and the main display unit 13.

  From this calculation result, the position detection unit 38 determines which icon the user's finger is approaching (which icon is about to be selected) and how closely the finger is approaching that icon.
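As one way to picture this calculation: if image processing has already reduced each camera's frame to a bearing angle toward the finger, the finger position follows from simple two-dimensional triangulation. The setup below (cameras on the top edge of the display, angles measured from the baseline joining them) is a simplified assumption for illustration, not the patent's actual image-based computation:

```python
import math

# Simplified 2-D triangulation sketch of the position detection described
# above. Assumed (not from the patent text): the two cameras sit at known
# positions on the display's top edge, and each captured frame has already
# been reduced to a bearing angle toward the finger, measured from the
# baseline joining the two cameras.
def triangulate(cam_left, cam_right, angle_left, angle_right):
    """Locate the finger from two corner cameras.

    cam_left / cam_right: (x, y) camera positions on the top edge.
    angle_left / angle_right: bearing angles in radians from the baseline.
    Returns the (x, y) position of the finger in display coordinates.
    """
    baseline = cam_right[0] - cam_left[0]
    # Intersect the two bearing rays: y = x' * tan(aL) and
    # y = (baseline - x') * tan(aR), solved for the common point.
    y = baseline * math.tan(angle_left) * math.tan(angle_right) / (
        math.tan(angle_left) + math.tan(angle_right))
    x = cam_left[0] + y / math.tan(angle_left)
    return (x, y)
```

Combining the two viewpoints in this way yields both the lateral position of the finger and (with a second vertical angle) its height above the display surface.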

  The control unit 32 compares the "distance between the user's finger and the main display unit 13" supplied from the position detection unit 38 with the threshold value set for the distance α, and thereby determines whether the distance between the user's finger and the main display unit 13 has decreased to the distance α or less. That is, when the distance between the user's finger and the main display unit 13 is greater than the distance α, it can be determined that the user has little intention of selecting an icon; when the distance is equal to or less than the distance α, it can be determined that the user intends to select an icon.

  For this reason, when the control unit 32 detects that the distance between the user's finger and the main display unit 13 has decreased to the distance α or less, it detects, based on the detection output of the "position on the main display unit 13 that the user's finger is approaching" supplied from the position detection unit 38, which of the icons currently displayed on the main display unit 13 the user's finger is approaching. The icon close to the user's finger is then enlarged, as shown in FIG. 4(b). The user can thereby recognize, through the enlarged icon, the application function that he or she is about to select.

  Next, when the icon displayed in an enlarged manner is an icon corresponding to an application function that the user desires to execute, the user should touch the icon displayed in an enlarged manner.

  The touch panel control unit 39 shown in FIG. 2 detects whether or not the user has touched the touch panel 20 and supplies this detection output to the control unit 32. In step S5 of the flowchart of FIG. 3, the control unit 32 determines whether or not a detection output indicating that the user has touched the touch panel 20 has been supplied from the touch panel control unit 39, and proceeds to step S6 at the timing when such a detection output is supplied.

  Note that the icon enlarged in step S4 is not necessarily the icon that the user desires to select. If the enlarged icon is not the desired one, the user can, without touching the display position of the enlarged icon on the main display unit 13, move the finger away from the screen to a distance greater than α, or change the position of the finger significantly.

  For this reason, when a detection output indicating that the user has touched the touch panel 20 is not supplied from the touch panel control unit 39 in step S5, the control unit 32 returns the process to step S2 and continues to monitor the relative position of the user's finger and the main display unit 13.

Next, when a detection output indicating that the user has touched the touch panel 20 is supplied from the touch panel control unit 39, the control unit 32 in step S6 changes the display color of the icon touched by the user, and/or lights (flashes) the touched icon at high brightness, and/or produces a predetermined sound via the speaker unit 12, thereby performing display control and/or audio output control indicating that the icon has been selected by the user. After this display control and/or audio output control has been performed for a certain time, for example one second, the application function corresponding to the icon touched by the user is executed, and execution of the routine shown in the flowchart of FIG. 3 ends.
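Steps S5 and S6 can be sketched as follows; the function name, the feedback callbacks, and the one-second hold are illustrative assumptions rather than the patent's implementation:

```python
import time

# Illustrative sketch of steps S5-S6. `touched_icon` stands for the icon
# reported by the touch panel control unit 39, or None when no touch has
# been detected yet.
FEEDBACK_SECONDS = 1.0  # "a certain time, for example one second"

def handle_touch(touched_icon, show_feedback, run_function, sleep=time.sleep):
    """S5: if no touch was detected, signal the caller to resume monitoring.
    S6: otherwise give visual/audio feedback, hold it briefly, then execute
    the application function bound to the touched icon."""
    if touched_icon is None:
        return False                  # back to step S2: keep monitoring the finger
    show_feedback(touched_icon)       # change colour / flash / play a sound
    sleep(FEEDBACK_SECONDS)           # hold the feedback for the set time
    run_function(touched_icon)        # launch the application function
    return True
```

The injectable `sleep` parameter is simply a testing convenience; a real handset would drive the delay from its own timer.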

[Effect of the first embodiment]
As is clear from the above description, in the mobile phone according to the first embodiment, among the icons displayed on the main display unit 13, the icon that the user's finger approaches is enlarged, and when the user's finger touches the touch panel 20, the application function corresponding to the enlarged icon is executed.

  In the case of a mobile phone, the size of the main display unit 13 is small, and the user needs to select an icon corresponding to a desired application function from among a plurality of icons displayed on the small screen. At the time of this selection, if an icon on the main display unit 13 is touched and selected with a finger, another icon may be erroneously selected depending on the size of the finger.

  If the icon is instead touched using a stylus pen or the like, the desired icon can be selected almost reliably, but the user must then carry the stylus pen together with the mobile phone and also risks losing it.

  However, in the case of the mobile phone of the embodiment, a desired icon can be selected with a finger without using a special tool such as a stylus pen. Moreover, since the icon the finger approaches is enlarged, the user can reliably touch the desired icon. For this reason, even when a large number of icons are displayed on a small display unit, the icon corresponding to the application function the user desires can be selected reliably.

  In addition, since each icon is displayed at the same position every time, the user comes to remember each icon's position through use of the mobile phone, and can select a desired icon simply by touching its display position. For this reason, the icon corresponding to a desired application function can be selected intuitively.

In the description of the first embodiment, the touch panel 20 provided on the surface of the main display unit 13 detects whether or not the user's finger has touched the main display unit 13. However, whether or not the user's finger has touched the main display unit 13 may instead be detected based on the images captured by the two proximity detection camera units 14 and 15. In this case, both the distance between the user's finger and the main display unit 13 and whether or not the user's finger has touched the main display unit 13 can be determined from the captured images, so the touch panel 20 can be dispensed with.
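The patent does not give the formula the position detection unit 38 uses for the calculations of FIGS. 5 and 6. If the two camera units form a standard stereo pair, the finger-to-display distance could in principle be recovered by triangulation; the sketch below assumes a pinhole stereo model with a known baseline and focal length, none of which are stated in the patent:

```python
# Generic stereo-triangulation sketch under an assumed pinhole geometry.
# The patent describes distance detection from two camera images but
# specifies no model; baseline and focal length here are hypothetical.

def finger_distance(disparity_px, baseline_mm, focal_px):
    """Estimate the finger-to-display distance from the pixel disparity
    between the two camera images: Z = B * f / d (pinhole stereo)."""
    if disparity_px <= 0:
        raise ValueError("finger not visible in both camera images")
    return baseline_mm * focal_px / disparity_px
```

For example, with a 50 mm baseline and a 500 px focal length, a 100 px disparity corresponds to a finger roughly 250 mm away; the disparity grows as the finger approaches, so comparing the estimated distance against the distance α reproduces the step S3 test.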

[Second Embodiment]
Next, a mobile phone according to a second embodiment of the present invention will be described. In the mobile phone according to the second embodiment, when the user's finger approaches an icon, the icon is enlarged and the plurality of functions or characters assigned to it are displayed in a divided manner.

  The second embodiment is different from the first embodiment only in this point. For this reason, only the difference between the two will be described below, and redundant description will be omitted.

  The flowchart of FIG. 7 shows the flow of this icon enlargement/division process. The processing shown in this flowchart is executed by the control unit 32 based on the input processing program stored in the ROM 31 shown in FIG. 2. When icons each assigned a plurality of characters or symbols are displayed on the main display unit 13 as shown in FIG. 8A, for example, the control unit 32 starts the processing of the flowchart of FIG. 7 based on this input processing program.

  The example shown in FIG. 8A displays icons in upper, middle, and lower rows. In this example, the symbols ", @ ." are assigned to one icon in the upper row, the characters "ABC" are assigned to the upper center icon, and the characters "DEF" are assigned to the icon at the right end of the upper row.

  When the control unit 32 has displayed the icons on the main display unit 13 as described above, it activates the proximity detection camera units 14 and 15. In step S11, the control unit 32 starts taking in the detection outputs of "distance between the user's finger and the main display unit 13" and "position on the main display unit 13 that the user's finger approaches", which the position detection unit 38 detects based on the images captured by the proximity detection camera units 14 and 15. The process proceeds to step S12 at the timing when a detection output indicating that the user's finger has approached the main display unit 13 is taken in from the position detection unit 38.

  In step S12, the control unit 32 starts monitoring the relative positional relationship between the user's finger and the main display unit 13 based on the detection output of "distance between the user's finger and the main display unit 13" detected by the position detection unit 38, and the process proceeds to step S13.

  In step S13, the control unit 32 compares the detection output of "distance between the user's finger and the main display unit 13" detected by the position detection unit 38 with a threshold value stored in advance, thereby determining whether or not the distance between the user's finger and the main display unit 13 is equal to or less than the predetermined distance α (the threshold value). Until this is detected, the process returns to step S12 and the determination is repeated; at the timing when the distance between the user's finger and the main display unit 13 is detected to be equal to or less than the predetermined distance α, the process proceeds to step S14.

  In step S14, since the distance between the user's finger and the main display unit 13 is equal to or less than the predetermined distance α, the control unit 32 detects, based on the detection output of "position on the main display unit 13 that the user's finger is approaching" detected by the position detection unit 38, which of the icons currently displayed on the main display unit 13 the user's finger is approaching. The characters or symbols assigned to that icon are then enlarged and displayed in a divided manner as shown in FIG.

  FIG. 8B shows an example in which the control unit 32 has detected the user's finger approaching the upper center icon. In this case, the control unit 32 turns each of the characters "ABC" assigned to that icon into its own icon, enlarges these icons, and displays them in a divided manner. This makes it easy for the user to select a desired character or symbol via the enlarged, divided icons.
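As a rough illustration of step S14 and FIG. 8B, splitting one multi-character icon into enlarged per-character icons might look like the sketch below; the `split_icon` name, the dictionary layout, and the scale factor are invented, since the patent specifies the behavior but no code:

```python
# Hypothetical sketch of the enlargement/division step: one icon labelled
# e.g. "ABC" is replaced by one enlarged icon per assigned character.

def split_icon(characters, base_size, scale=2.0):
    """Return one enlarged icon per character assigned to the original
    icon, as in step S14 / FIG. 8B (names and layout are illustrative)."""
    enlarged = base_size * scale
    return [{"label": ch, "size": enlarged} for ch in characters]
```

Each element of the returned list would then be drawn as its own touch target, so the user selects "A", "B", or "C" directly rather than cycling through the shared key.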

  When an icon corresponding to the character the user desires to input exists among the icons displayed in this enlarged and divided manner, the user touches that icon. In step S15, the control unit 32 determines whether or not the user has touched the touch panel 20 by checking whether a detection output indicating the touch has been supplied from the touch panel control unit 39. The process proceeds to step S16 at the timing when this detection output is supplied from the touch panel control unit 39.

  When the character the user desires to input is not among the icons displayed in this enlarged and divided manner, the user either moves the finger to the display position of another icon or moves it well away from the display surface of the main display unit 13. Accordingly, in step S15 the control unit 32 also determines, based on the detection output of "position on the main display unit 13 that the user's finger is approaching", whether the user's finger has moved substantially away from the display position of the icon, or whether the distance between the main display unit 13 and the finger has become equal to or greater than the distance α.

  When it is determined that the user's finger has moved substantially away from the display position of the icon or from the display surface of the main display unit 13, the process returns to step S12, and the relative positional relationship between the user's finger and the main display unit 13 continues to be monitored. As a result, each time the user's finger moves to the display position of another icon, the characters or symbols assigned to that icon are enlarged and displayed as described above.

  Next, in step S16, the control unit 32 performs display control and/or audio output control indicating that the character or symbol corresponding to the icon has been selected by the user: for example, as shown in FIG. 8C, it changes the display color of the icon touched by the user, lights (flashes) the touched icon at high brightness, and/or generates a predetermined sound via the speaker unit 12. The process then proceeds to step S17. FIG. 8C shows an example in which the user has touched the icon corresponding to the character "A" among the characters "ABC" displayed in enlarged division.

In step S17, after the display control and/or audio output control of step S16 has continued for a predetermined time, for example one second, the control unit 32 performs display control to enter the character corresponding to the touched icon into, for example, the mobile mail input screen displayed on the main display unit 13, and ends the execution of the routine shown in the flowchart.

[Effect of the second embodiment]
As is apparent from the above description, the mobile phone according to the second embodiment displays, on the main display unit 13, a plurality of icons each assigned a plurality of characters or symbols, and when the user's finger approaches one of them, the characters or symbols assigned to that icon are enlarged and displayed in a divided manner. As a result, the same effects as in the first embodiment described above can be obtained, together with the following effects.

  That is, by enlarging and displaying the icons corresponding to the individual characters and symbols, the small display area of the mobile phone is used effectively and it becomes easier for the user to select a desired character or symbol. Moreover, because these icons are displayed enlarged, erroneous input can be prevented.

  In addition, before the user touches the main display unit 13 with a finger, the characters and symbols the user is expected to input are listed, and a character or symbol is input when the user touches it. For this reason, the input of a desired character or symbol can be completed with only one touch operation, greatly reducing the number of operations needed for input.

In the description of the second embodiment, a plurality of characters or symbols is assigned to each icon, and the characters or symbols assigned to an icon are enlarged and displayed in a divided manner when the user's finger approaches. Alternatively, a plurality of functions may be assigned to each icon, and when the user's finger approaches, the characters or icons indicating the functions assigned to that icon may be enlarged and displayed in a divided manner.

[Modification 1]
In the description of each embodiment above, the relative positional relationship between the main display unit 13 and the user's finger is detected based on the images captured by the two proximity detection camera units 14 and 15. Instead of these two camera units, a capacitive touch panel may be provided as the touch panel 20, and the relative positional relationship between the main display unit 13 and the user's finger may be detected based on the current value detected by the capacitive touch panel.

  In the case of a capacitive touch panel, as shown in FIG. 9, the detected current value increases as the user's finger approaches the touch panel. The control unit 32 therefore holds, as a threshold S, the current value detected when the distance between the user's finger and the touch panel equals the fixed distance α mentioned above, and holds, as a threshold L, the current value detected when the user's finger touches the touch panel.

When the current value detected from the touch panel reaches the threshold S, the control unit 32 determines that the distance between the user's finger and the touch panel has become the fixed distance α and performs the icon enlargement process or the icon enlargement/division process described above. When the current value reaches the threshold L, the control unit 32 determines that the user's finger has touched the touch panel and executes the application function corresponding to the touched icon, or performs the input processing of the character or symbol corresponding to the touched icon. The same effects as in each of the embodiments can thereby be obtained.
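The two-threshold scheme of this modification can be summarized as a simple classifier over the detected current. In the sketch below the numeric threshold values are invented for illustration; the patent only says that threshold S corresponds to the distance α and threshold L to contact:

```python
# Sketch of Modification 1: classify the capacitive panel's current value
# against the stored thresholds S (finger at distance alpha) and L
# (finger touching). Threshold values are hypothetical; per FIG. 9 the
# current rises monotonically as the finger approaches.

THRESHOLD_S = 40   # current at distance alpha (assumed units)
THRESHOLD_L = 90   # current at contact (assumed units)


def classify_current(current):
    if current >= THRESHOLD_L:
        return "touch"        # execute the icon's function / input the character
    if current >= THRESHOLD_S:
        return "proximity"    # enlarge (or enlarge-and-divide) the icon
    return "idle"             # keep monitoring
```

Checking L before S matters: a touching finger exceeds both thresholds, so the stronger condition must take precedence.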

[Modification 2]
Further, instead of the proximity detection camera units 14 and 15, an optical sensor may be provided for the main display unit 13, and the relative positional relationship between the main display unit 13 and the user's finger may be detected based on the light-blocking position detected by the optical sensor.

  That is, in this example, as shown in FIG. 10, a first infrared sensor is provided at a position of, for example, 5 mm from the display surface of the main display unit 13 to detect that the user's finger has come close to the main display unit 13, and a second infrared sensor is provided at a position of, for example, 1 mm from the display surface to detect that the user's finger has touched the main display unit 13.

  When the user's finger approaches to, for example, 5 mm from the display surface of the main display unit 13, a detection output indicating that the infrared beam from the first infrared sensor has been blocked is obtained; when the finger approaches to, for example, 1 mm from the display surface, a detection output indicating that the infrared beam from the second infrared sensor has been blocked is obtained.

  When the detection output indicating that the infrared beam from the first infrared sensor has been blocked is obtained, the control unit 32 determines that the distance between the user's finger and the main display unit 13 has become the fixed distance α, and performs the icon enlargement process or the icon enlargement/division process described above. When the detection output indicating that the infrared beam from the second infrared sensor has been blocked is obtained, the control unit 32 determines that the user's finger has touched the display surface of the main display unit 13, and executes the application function corresponding to the touched icon or performs the input processing of the character or symbol corresponding to the touched icon. The touch panel 20 and the touch panel control unit 39 shown in FIG. 2 can thereby be omitted, and the same effects as in each of the embodiments described above can be obtained.
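The dual-sensor logic of Modification 2 reduces to checking which infrared beams are blocked. A minimal sketch follows, with invented names; the 5 mm / 1 mm geometry is taken from the example above:

```python
# Sketch of Modification 2: the first IR sensor sits ~5 mm above the
# display surface, the second ~1 mm above it; which beams the finger
# blocks determines the state. Names are illustrative.

def classify_ir(first_blocked, second_blocked):
    if second_blocked:
        return "touch"        # finger within ~1 mm: treat as contact
    if first_blocked:
        return "proximity"    # finger within ~5 mm: enlarge the icon
    return "idle"             # keep waiting
```

A finger at 1 mm blocks both beams, so the second sensor is tested first, mirroring the precedence of the capacitive thresholds in Modification 1.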

In this example, the infrared sensors detect both that the user's finger has come close to the display surface of the main display unit 13 and that it has touched the display surface. Alternatively, the infrared sensor may detect only that the user's finger is close to the display surface, while whether the finger has touched the display surface is determined by providing a touch panel on the display surface of the main display unit 13 and detecting contact with this touch panel.

  Lastly, in the description of each of the above-described embodiments, the present invention is applied to a mobile phone, but the present invention may also be applied to other portable terminal devices such as PHS phones (PHS: Personal Handyphone System) and PDA devices (PDA: Personal Digital Assistant).

  Moreover, each of the above-described embodiments is disclosed merely as an example of the present invention. For this reason, the present invention is not limited to the above-described embodiments, and it goes without saying that various changes in design and the like can be made without departing from the scope of the technical idea according to the present invention.

FIG. 1 is a diagram showing the appearance of a mobile phone according to a first embodiment to which the present invention is applied.
FIG. 2 is a block diagram of the mobile phone of the first embodiment.
FIG. 3 is a flowchart showing the flow of the icon enlargement process of the mobile phone of the first embodiment.
FIG. 4 is a schematic diagram showing how an icon is enlarged by the mobile phone of the first embodiment.
FIG. 5 is a diagram for explaining the process of calculating the position of the user's finger on the main display unit in the mobile phone of the first embodiment.
FIG. 6 is a diagram for explaining the process of calculating the distance between the main display unit and the user's finger in the mobile phone of the first embodiment.
FIG. 7 is a flowchart showing the flow of the icon enlargement/division process of the mobile phone of a second embodiment to which the present invention is applied.
FIG. 8 is a schematic diagram showing how an icon is enlarged and divided by the mobile phone of the second embodiment.
FIG. 9 is a diagram for explaining a modification that detects the proximity and touch of the user's finger using a capacitive touch panel.
FIG. 10 is a diagram for explaining a modification that detects the proximity and touch of the user's finger using infrared sensors.

Explanation of symbols

  1 upper housing, 2 lower housing, 3 hinge unit, 4 key input unit, 5 camera unit, 6 shooting key, 7 rotation operation unit, 8 microphone unit, 9 clear key, 10 character type change key, 11 whip antenna, 12 speaker unit, 13 main display unit, 14 proximity detection camera unit, 15 proximity detection camera unit, 16 sub display unit, 17 earphone jack, 20 touch panel, 31 ROM, 32 control unit, 33 communication circuit, 34 display unit, 35 operation unit, 36 memory, 37 camera control unit, 38 position detection unit, 39 touch panel control unit, 40 control line, 41 data line

Claims (5)

  1. An input processing program that causes a computer to function as:
    icon display means for arranging and displaying one or a plurality of icons on display means based on display information of the icons;
    proximity determination means for determining, based on the detection output from distance detection means for detecting the distance between selection means for selecting an icon by touching the icon displayed on the display means and the display surface of the display means, whether or not the distance between the selection means and the display surface of the display means is equal to or less than a predetermined distance;
    proximity icon detection means for detecting, when the proximity determination means determines that the distance between the selection means and the display surface of the display means is equal to or less than the predetermined distance, the icon displayed at the position on the display surface that the selection means approaches, based on the detection output from position detection means for detecting the position on the display surface of the display means that the selection means approaches and on the display information of the icons displayed on the display means by the icon display means; and
    icon display changing means for, when a plurality of functions, or a plurality of characters or symbols, are assigned to the icon that the selection means approaches, detected by the proximity icon detection means, dividing that icon into icons corresponding to the respective functions or the respective characters and symbols assigned to it, enlarging the divided icons, and displaying them on the display means.
  2. The input processing program according to claim 1, further causing a computer to function as:
    input processing means for, when contact of the selection means is detected by contact detection means for the icons displayed in an enlarged or enlarged-and-divided manner on the display surface of the display means, executing and controlling the function corresponding to the icon for which the contact of the selection means is detected, or performing input processing of the character or symbol corresponding to the icon for which the contact of the selection means is detected.
  3. A portable terminal device comprising:
    icon display means for arranging and displaying one or a plurality of icons on display means based on display information of the icons;
    distance detection means for detecting the distance between selection means for selecting an icon by touching the icon displayed on the display means and the display surface of the display means;
    proximity determination means for determining, based on the detection output from the distance detection means, whether or not the distance between the selection means and the display surface of the display means is equal to or less than a predetermined distance;
    position detection means for detecting, when the proximity determination means determines that the distance between the selection means and the display surface of the display means is equal to or less than the predetermined distance, the position on the display surface of the display means that the selection means approaches;
    proximity icon detection means for detecting the icon displayed at the position on the display surface that the selection means approaches, based on the detection output from the position detection means and the display information of the icons displayed on the display means by the icon display means; and
    icon display changing means for, when a plurality of functions, or a plurality of characters or symbols, are assigned to the icon that the selection means approaches, detected by the proximity icon detection means, dividing that icon into icons corresponding to the respective functions or the respective characters and symbols assigned to it, enlarging the divided icons, and displaying them on the display means.
  4. The portable terminal device according to claim 3, further comprising:
    contact detection means for detecting contact of the selection means with the icons displayed in an enlarged or enlarged-and-divided manner on the display surface of the display means; and
    input processing means for executing and controlling the function corresponding to the icon for which the contact of the selection means is detected by the contact detection means, or performing input processing of the character or symbol corresponding to the icon for which the contact of the selection means is detected.
  5. An input processing method comprising:
    a step in which icon display means arranges and displays one or a plurality of icons on display means based on display information of the icons;
    a step in which proximity determination means determines, based on the detection output from distance detection means for detecting the distance between selection means for selecting an icon by touching the icon displayed on the display means and the display surface of the display means, whether or not the distance between the selection means and the display surface of the display means is equal to or less than a predetermined distance;
    a step in which, when the proximity determination means determines that the distance between the selection means and the display surface of the display means is equal to or less than the predetermined distance, proximity icon detection means detects the icon displayed at the position on the display surface that the selection means approaches, based on the detection output from position detection means for detecting the position on the display surface of the display means that the selection means approaches and on the display information of the icons displayed on the display means by the icon display means; and
    a step in which, when a plurality of functions, or a plurality of characters or symbols, are assigned to the icon that the selection means approaches, detected by the proximity icon detection means, icon display changing means divides that icon into icons corresponding to the respective functions or the respective characters and symbols assigned to it, enlarges the divided icons, and displays them on the display means.
JP2005051876A 2005-02-25 2005-02-25 Input processing program, portable terminal device, and input processing method Active JP4479962B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005051876A JP4479962B2 (en) 2005-02-25 2005-02-25 Input processing program, portable terminal device, and input processing method


Publications (3)

Publication Number Publication Date
JP2006236143A JP2006236143A (en) 2006-09-07
JP2006236143A5 JP2006236143A5 (en) 2008-03-27
JP4479962B2 true JP4479962B2 (en) 2010-06-09

Family

ID=37043698

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005051876A Active JP4479962B2 (en) 2005-02-25 2005-02-25 Input processing program, portable terminal device, and input processing method

Country Status (1)

Country Link
JP (1) JP4479962B2 (en)


Families Citing this family (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4900824B2 (en) * 2007-09-18 2012-03-21 トヨタ自動車株式会社 Input display device
JP4990753B2 (en) 2007-12-28 2012-08-01 パナソニック株式会社 Electronic device input device, input operation processing method, and input control program
JP4979570B2 (en) 2007-12-28 2012-07-18 パナソニック株式会社 Electronic device input device, input operation processing method, and input control program
JP5127547B2 (en) * 2008-04-18 2013-01-23 株式会社東芝 Display object control device, display object control program, and display device
KR101502002B1 (en) * 2008-05-21 2015-03-12 엘지전자 주식회사 Mobile terminal using of proximity touch and wallpaper controlling method therefor
KR101498039B1 (en) * 2008-06-02 2015-03-03 엘지전자 주식회사 a mobile telecommunication device and a method of displaying characters using the same
KR101495350B1 (en) 2008-06-24 2015-02-24 엘지전자 주식회사 Portable terminal capable of sensing proximity touch
KR101493089B1 (en) 2008-07-10 2015-02-12 주식회사 케이티 Method For Providing User Interface In Touch Input Recognizing Apparatus And Touch Input Recognizing Device Performing The Same
US9658765B2 (en) * 2008-07-31 2017-05-23 Northrop Grumman Systems Corporation Image magnification system for computer interface
JP4752887B2 (en) * 2008-09-12 2011-08-17 ソニー株式会社 Information processing apparatus, information processing method, and computer program
WO2010069271A1 (en) * 2008-12-19 2010-06-24 华为终端有限公司 Touch screen input method, device and communication terminal
JP5287403B2 (en) * 2009-03-19 2013-09-11 ソニー株式会社 Information processing apparatus, information processing method, and program
US8111247B2 (en) * 2009-03-27 2012-02-07 Sony Ericsson Mobile Communications Ab System and method for changing touch screen functionality
KR100929306B1 (en) 2009-05-27 2009-11-27 박창규 Input apparatus and method
EP2438504A1 (en) * 2009-06-05 2012-04-11 Dassault Systemes SolidWorks Corporation Predictive target enlargement
JP5532300B2 (en) * 2009-12-24 2014-06-25 ソニー株式会社 Touch panel device, touch panel control method, program, and recording medium
CN102792250B (en) * 2010-03-05 2015-09-02 联想创新有限公司(香港) Mobile terminal
JP5492627B2 (en) * 2010-03-25 2014-05-14 株式会社ジャパンディスプレイ Information display device and information display method
JP5625599B2 (en) * 2010-08-04 2014-11-19 ソニー株式会社 Information processing apparatus, information processing method, and program
US20120050007A1 (en) * 2010-08-24 2012-03-01 Babak Forutanpour Methods and apparatus for interacting with an electronic device application by moving an object in the air over an electronic device display
JP2013061680A (en) * 2010-10-14 2013-04-04 Nikon Corp Display device
JP5304848B2 (en) 2010-10-14 2013-10-02 株式会社ニコン Projector
KR101646616B1 (en) 2010-11-30 2016-08-12 삼성전자주식회사 Apparatus and Method for Controlling Object
CN102043584A (en) * 2010-12-07 2011-05-04 中兴通讯股份有限公司 Input method and device applied to digital terminal
JP5617603B2 (en) * 2010-12-21 2014-11-05 ソニー株式会社 Display control apparatus, display control method, and program
US9354804B2 (en) 2010-12-29 2016-05-31 Microsoft Technology Licensing, Llc Touch event anticipation in a computing device
JP5776964B2 (en) * 2011-03-04 2015-09-09 日本電気株式会社 Information display device, display control method, and program
JP2012190183A (en) * 2011-03-09 2012-10-04 Sony Corp Image processing device, method, and program
JP2012190184A (en) 2011-03-09 2012-10-04 Sony Corp Image processing device, method, and program
JP5654118B2 (en) * 2011-03-28 2015-01-14 富士フイルム株式会社 Touch panel device, display method thereof, and display program
JP2012226691A (en) * 2011-04-22 2012-11-15 Ntt Docomo Inc Display control device and display control method
JP2012248066A (en) 2011-05-30 2012-12-13 Canon Inc Image processing device, control method of the same, control program and imaging apparatus
JP5155427B2 (en) * 2011-06-08 2013-03-06 株式会社コナミデジタルエンタテインメント Game device, game device control method, and program
US20140111430A1 (en) * 2011-06-10 2014-04-24 Nec Casio Mobile Communications, Ltd. Input device and control method of touch panel
JP5810874B2 (en) * 2011-12-06 2015-11-11 株式会社日本自動車部品総合研究所 display control system
US20130335360A1 (en) * 2012-01-03 2013-12-19 Aviv Ron Touch screen interaction methods and apparatuses
JP2015084124A (en) * 2012-02-06 2015-04-30 パナソニック株式会社 Information processing device
CN103874976B (en) * 2012-02-14 2018-05-18 松下电器产业株式会社 Electronic equipment
JP5828800B2 (en) 2012-04-23 2015-12-09 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America Display device, display control method, and program
US9904457B2 (en) * 2012-04-25 2018-02-27 Nokia Technologies Oy Causing display of a three dimensional graphical user interface with dynamic selectability of items
JP5972692B2 (en) * 2012-07-11 2016-08-17 NTT Docomo, Inc. User interface device, user interface method and program
JP5620440B2 (en) * 2012-08-09 2014-11-05 Panasonic Intellectual Property Corporation of America Display control apparatus, display control method, and program
JP5798532B2 (en) * 2012-08-23 2015-10-21 NTT Docomo, Inc. User interface device, user interface method and program
CN102937866B (en) * 2012-11-12 2015-05-13 Xiaomi Inc. Screen control method and terminal
JP6024466B2 (en) 2013-01-16 2016-11-16 Fujitsu Limited Information processing apparatus, information processing method, and information processing program
JP5666641B2 (en) 2013-03-13 2015-02-12 Panasonic Intellectual Property Corporation of America Information terminal
WO2014192204A1 (en) 2013-05-28 2014-12-04 Kyocera Document Solutions Inc. Display apparatus and image forming apparatus
JP6244712B2 (en) * 2013-07-23 2017-12-13 Fujitsu Limited Image processing apparatus, image processing method, and image processing program
CN107071092A (en) * 2016-07-16 2017-08-18 Ye Dake A smart mobile phone

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101770335B (en) 2008-12-26 2012-08-08 兄弟工业株式会社 Inputting apparatus
US8271900B2 (en) 2008-12-26 2012-09-18 Brother Kogyo Kabushiki Kaisha Inputting apparatus
CN105814530A (en) * 2013-12-05 2016-07-27 三菱电机株式会社 Display control device, and display control method
CN105814530B (en) * 2013-12-05 2018-11-13 三菱电机株式会社 Display control unit and display control method

Also Published As

Publication number Publication date
JP2006236143A (en) 2006-09-07

Similar Documents

Publication Publication Date Title
US9575646B2 (en) Modal change based on orientation of a portable multifunction device
AU2008201540B2 (en) List scrolling and document translation, scaling, and rotation on a touch-screen display
US9891819B2 (en) Apparatus and method for inputting character using touch screen in portable terminal
TWI470531B (en) A mobile terminal and a method of controlling a mobile terminal including a touch input device
EP2172836B1 (en) Mobile terminal and user interface of mobile terminal
US7978176B2 (en) Portrait-landscape rotation heuristics for a portable multifunction device
AU2008100010A4 (en) Portable multifunction device, method, and graphical user interface for translating displayed content
US8274536B2 (en) Smart keyboard management for a multifunction device with a touch screen display
KR101287791B1 (en) Input device
KR101482125B1 (en) Mobile terminal and operation method thereof
US10313505B2 (en) Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
CN101627617B (en) Portable electronic device supporting application switching
EP2225629B1 (en) Insertion marker placement on touch sensitive display
DE102008000001B4 (en) Integrated hardware and software user interface
US8063872B2 (en) Portable electronic device with auto-dim timers
US8493334B2 (en) Input device, storage medium, information input method, and electronic apparatus
US7899499B2 (en) Mobile telecommunication handset having touch pad
US20140340327A1 (en) Portable multifunction device, method, and graphical user interface for interpreting a finger gesture
US8477139B2 (en) Touch screen device, method, and graphical user interface for manipulating three-dimensional virtual objects
US9207855B2 (en) Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
JP6369704B2 (en) Information processing apparatus, program, and information processing method
US20050085215A1 (en) Method and related apparatus for emergency calling in a touch screen mobile phone from a touch screen and keypad lock active state
EP2058964A2 (en) Mobile terminal and method for converting broadcast channel of a mobile terminal
KR101012300B1 (en) User interface apparatus of mobile station having touch screen and method thereof
EP2447857A1 (en) Communication device and electronic device

Legal Events

Date Code Title Description
RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20071009

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20080212

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20080212

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20090930

A711 Notification of change in applicant

Free format text: JAPANESE INTERMEDIATE CODE: A711

Effective date: 20091002

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20091027

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20100310

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20100310

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130326

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150