JP2004355606A - Information processor, information processing method, and program - Google Patents

Information processor, information processing method, and program

Info

Publication number
JP2004355606A
JP2004355606A
Authority
JP
Japan
Prior art keywords
function
contact
display
information
button
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2004029872A
Other languages
Japanese (ja)
Other versions
JP2004355606A5 (en)
Inventor
Tatsushi Nashida
Haruo Oba
Junichi Rekimoto
Carsten Schwesig
シュヴェージグ カーステン
晴夫 大場
純一 暦本
辰志 梨子田
Original Assignee
Sony Corp
ソニー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2003037370
Priority to JP2003127408
Application filed by Sony Corp
Priority to JP2004029872A
Publication of JP2004355606A5
Publication of JP2004355606A
Application status is Pending

Abstract

PROBLEM TO BE SOLVED: To provide an information processor, an information processing method, and a program in which input operability is improved by allowing a user to easily recognize the functions assigned to an input means before performing input operations, even when the input means for performing input operations and the display are integrated and made small.
SOLUTION: An information processor 1 has a display screen 10 and keyboards 20L/R. The keyboards are constituted such that a plurality of buttons 21L/R incorporating touch sensors are arrayed within a range that the user can reach with the thumbs while holding both sides of the main body with the left and right hands, and a function of inputting a numerical value or a character is assigned to each button. Information relating to the functions assigned to the respective buttons is displayed on the screen as a soft keyboard 30. When the user touches a button of the keyboard with a thumb, the touch sensor detects the contact and only the corresponding soft button of the soft keyboard is highlighted. By depressing the button 21L/R in this state, the function assigned to that button can be selected.
COPYRIGHT: (C)2005,JPO&NCIPI

Description

  The present invention relates to an information processing device, an information processing method, and a program suitable for a small and portable device in which a display screen and an input unit such as a keyboard are integrated.

  When inputting information to an information processing apparatus, a user often operates buttons, a keypad, and the like (hereinafter, hardware buttons, keypads, and the like are collectively referred to simply as buttons).

  For example, when inputting information to a personal computer (PC), the user operates a keyboard composed of a plurality of buttons, and when inputting a telephone number or other information to a mobile phone, the user operates number buttons and operation buttons to which a plurality of functions are assigned.

  During such an operation, the user visually recognizes a mark such as a symbol or a character printed on the surface of the button and estimates the function assigned to that button.

  Conventionally, as an electronic device having such an operation unit, Japanese Patent Application Laid-Open Publication No. H11-163873 discloses a handheld game machine that improves operability while achieving downsizing. The technology described in Patent Document 1 addresses the problem that, if the device is made smaller after the number of operation means has been increased to diversify the operation content, the interval between the operation switches becomes small and operability deteriorates. It therefore provides a first operation unit that the user can touch with the left and right thumbs and a second operation unit that can be touched with the left and right index or middle fingers while viewing the game content displayed on the display unit. A handheld game machine is thus disclosed in which operability is improved because the thumbs rest on the first operation unit and the index or middle fingers rest on the second operation unit while the gripped state is maintained.

  On the other hand, as information processing devices in which an input operation is performed with an attached pen instead of physical buttons such as the above-described keyboard, there are tablet computers and pen computers. These devices are portable computers that the user can operate while holding them in the hands. Characters can be entered by writing them on the screen with the attached pen and having them recognized, or by tapping characters on an on-screen soft keyboard.

JP 2001-212374 A

  However, in a device operated by the physical operation buttons described above, it is difficult for the user to visually recognize the symbols printed on the surfaces of the buttons in the following cases (1) to (5).

(1) When the surroundings are dark (for example, when the user operates the buttons of a mobile phone at night, or operates the buttons of a remote controller of a television (TV) receiver in a dark room)
(2) When a plurality of functions are assigned to one button and a plurality of symbols corresponding to those functions are printed on the surface of that one button (when the function corresponding to the button changes depending on the mode)
(3) When the button is arranged at a position where it cannot be seen (for example, when the button is provided on the back side of the device)
(4) When the user's finger is placed on the button
(5) When a symbol that appropriately represents the function assigned to the button is not printed on its surface (when a symbol whose meaning is difficult to understand is printed on the surface of the button)

  Therefore, when the symbol assigned to a button cannot be recognized, the function assigned to that operation button cannot be estimated. That is, even if, as in the technique described in Patent Document 1, operation sections corresponding to the thumbs and forefingers are provided to improve operability and reduce size, the operation sections cannot be visually recognized when, for example, the surroundings are dark. In that case the user can recognize the function assigned to an operation unit only after performing an operation with it, and cannot estimate the function before operating it, so operability is poor.

  In the method of writing characters on the screen with a pen and having them recognized, as in a tablet computer or pen computer, instead of inputting with physical buttons, the pen has to be removed from the main body and operated every time characters are input, which is time-consuming, tiring, and prone to misrecognition. In addition, when inputting with a soft keyboard, the user must pay visual attention while tapping the keyboard, so the feeling of fatigue is greater and the operability is poorer than input by button operation.

  Furthermore, if physical buttons such as a normal keyboard are attached to a tablet computer, the overall size becomes extremely large, or a folding form such as that of a notebook PC has to be adopted, which impairs the portability of a display-integrated device.

  The present invention has been proposed in view of such a conventional situation, and an object of the present invention is to provide an information processing apparatus, an information processing method, and a program with improved input operability, in which the user can easily recognize the functions assigned to the input means before performing an input operation even when the input means and the display are integrated and made small.

  In order to achieve the above-described object, an information processing apparatus according to the present invention includes an input means to which one or more functions are assigned according to a contact position, a contact detection means that detects a physical contact with the input means, a display means that displays function information relating to the one or more functions assigned to the input means, and a display control means that reflects the contact detection result of the contact detection means on the display of the function information, wherein the input means selects the function when it is pressed in a state where the result of the contact detection is reflected on the display of the function information.

  In the present invention, the apparatus has a contact detection means and an input means, the functions of the input means are displayed on the screen as function information, and the contact detection result is reflected on the function information, so that the user can confirm the function currently being touched before selecting it and can therefore perform function selection reliably.

  Further, the input means may have a plurality of buttons each provided with the contact detection means, and a function can be assigned to each button. A plurality of buttons can thus be provided even in a small space; for example, the same key layout as a keyboard used for a PC can be configured.

  Furthermore, the plurality of buttons of the input means can be arranged at intervals such that two or more buttons are touched simultaneously by one finger, and because a function is not selected unless the button is pressed in a state where the contact is reflected on the display, the buttons can be made small enough for a finger to cover several of them. Even if a button is hidden by the user's own finger, its function can be recognized from the function information on the display means, and the function can be selected accurately.

  Furthermore, when a contact is detected by the contact detection means, the display control means displays the information relating to the function corresponding to the contact position brighter or larger than the information relating to the other functions, thereby providing the user with feedback on the contact.

  In addition, the input means may be separated into portions placed where the user can perform input operations with the thumbs while holding both sides of the main body with the left and right hands, so that operation input is possible while the device is held in both hands.

  Further, a screen contact position detection means provided on the display screen of the display means may detect the contact position on the display screen touched by the user, the display means may display operation information for receiving an operation input from the user, and the function corresponding to the display item of the operation information displayed at the contact position detected by the screen contact position detection means can be selected. Operability is further improved by using this together with the input means.

  An information processing method according to the present invention includes a display step of displaying, on a display screen, function information on one or more functions assigned according to the contact position of an input means, a contact detection step of detecting a physical contact with the input means, a display control step of reflecting the contact detection result on the display of the function information, and a selection step of selecting the function when the input means is pressed while the contact detection result is reflected on the display of the function information.

  A program according to the present invention causes a computer to execute the above-described information processing.

  According to the information processing apparatus of the present invention, the apparatus includes an input means to which one or more functions are assigned according to the contact position, a contact detection means that detects a physical contact with the input means, a display means that displays function information relating to the one or more functions assigned to the input means, and a display control means that reflects the contact detection result of the contact detection means on the display of the function information, and the function is selected when the input means is pressed in a state where the contact is reflected on the display of the function information. Information on the input means is therefore displayed on the screen, and the user's contact with the input means is reflected on that display, so the user can recognize the function of the input means simply by touching it, without looking at it, and the input operation is easy. The input means therefore does not need to carry markings indicating its functions, and the input means and the display can be integrated and made compact.

  Further, according to the information processing method of the present invention, the user's contact position on the input means, to which different functions are assigned according to the contact position, is detected, display information reflecting the contact is shown on the display screen, and the function is selected when the input means is pressed in the state where the contact is reflected. Further, according to the program of the present invention, the computer of such an information processing apparatus can be made to execute the information processing described above.

  Hereinafter, specific embodiments to which the present invention is applied will be described in detail with reference to the drawings. In this embodiment, the present invention is applied to a portable display / keyboard integrated information processing apparatus.

  FIG. 1 is a schematic diagram illustrating an information processing apparatus according to the present embodiment. As shown in FIG. 1, the information processing apparatus 1 includes a display screen (display) 10 arranged substantially at the center, and a left keyboard 20L and a right keyboard 20R as operation input means arranged separately on the left and right sides of the display screen.

  The left and right keyboards 20L / R are composed of a button group composed of a plurality of buttons, and different functions are assigned to the respective buttons 21L / R for inputting different characters or numerical values. Information on the function of each of these buttons 21L / R is displayed on the display screen 10 as a soft keyboard (function information) 30. In the present embodiment, the display screen 10 has a function information display unit 11 and an input result display unit 12 that displays input results such as characters and numerical values input by the keyboard 20L / R.

  The information processing apparatus 1 has a rectangular display screen 10 whose vertical direction is the short side (horizontally long), and the left keyboard 20L and the right keyboard 20R are provided near the centers of both sides of the main body with the display screen 10 interposed therebetween.

  In the present embodiment, the left and right keyboards 20L / R are composed of 20 buttons (input keys) 21L / R of 4 × 5 in the vertical and horizontal directions. With both sides of the main body held by the user, they are arranged at positions where they can be touched by the left thumb 40L and the right thumb 40R of the user.

  Each button 21L/R of the left and right keyboards 20L/R has a built-in touch sensor (not shown), which is a contact detection means for detecting physical contact with a living body (finger). The touch sensor detects contact of the user's thumb 40L/R with the key top of each button 21L/R, and the contact detection result is reflected (fed back) on the soft keyboard 30 of the display screen 10. That is, it is detected which of the 20 buttons 21L/R the user's thumb has touched, and the soft button 31L/R corresponding to the function assigned to that button 21L/R is displayed in a color different from that of the other soft buttons or with a higher luminance (hereinafter referred to as highlight display). Thereby, by looking at the display screen 10, the user can see which button 21L/R his or her finger is currently resting on and what function is assigned to that button, without looking at the keyboard 20L/R.
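
  The touch-highlight-press cycle described above can be summarized in the following minimal sketch. It is an illustration only, not the patent's implementation; the class, button identifiers, and the redraw hook are assumptions introduced for the example.

```python
# Hedged sketch of the touch -> highlight -> press -> select cycle described above.

class SoftKeyboardController:
    def __init__(self, functions):
        # functions: assumed mapping from hardware button id to its assigned
        # function, e.g. {"L00": "a", "L01": "b", ...}
        self.functions = functions
        self.highlighted = None          # button id currently highlighted

    def on_touch(self, button_id):
        """Touch sensor reports contact: highlight only the matching soft button."""
        if button_id in self.functions:
            self.highlighted = button_id
            self.redraw()

    def on_release(self, button_id):
        """Contact lost: remove the highlight."""
        if self.highlighted == button_id:
            self.highlighted = None
            self.redraw()

    def on_press(self, button_id):
        """Mechanical press: select the function only while it is highlighted."""
        if button_id == self.highlighted:
            return self.functions[button_id]   # e.g. append the character to the input result
        return None                            # press without reflected contact selects nothing

    def redraw(self):
        # Placeholder for the display control unit: draw the soft keyboard and
        # render the highlighted soft button brighter or in a different color.
        pass
```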

  The display of the soft keyboard 30 and the highlight display of the soft keyboard 30 by touch are controlled by a display control unit (not shown).

  Here, the buttons constituting the keyboards 20L/R are arranged in a range in which all of them can be touched by moving only the thumbs, without changing the position of the hands, while the device is held in both hands; the buttons are placed at such small intervals, for example 3 mm vertically and horizontally, that a thumb can cover several buttons at once. When a group of buttons is packed into such a narrow range, it can be difficult to touch or press an individual button. In such a case, when contact with two or more buttons is detected, the button located closest to the center of the range in which contact is detected is assumed to be the one being touched, and contact is attributed to that single button. Then, by highlighting only the information relating to the function assigned to that selected button, the contact is reflected in the function information of only one button even when the thumb rests on several buttons, so all the buttons can be packed into a small area that can be covered by thumb movement alone.
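
  A sketch of the resolution rule just described follows; the coordinates and button names are illustrative assumptions, and the "center" is taken as the mean position of the buttons reporting contact.

```python
# Hedged sketch: when the thumb covers several adjacent buttons, treat only the
# button closest to the center of the contacted region as "touched".

def resolve_contact(touched, positions):
    """touched: list of button ids whose sensors report contact.
    positions: dict mapping button id -> (x, y) center of that button in mm."""
    if not touched:
        return None
    # Center of the contacted region = mean of the touched buttons' centers.
    cx = sum(positions[b][0] for b in touched) / len(touched)
    cy = sum(positions[b][1] for b in touched) / len(touched)
    # Pick the touched button nearest to that center.
    return min(touched, key=lambda b: (positions[b][0] - cx) ** 2 +
                                      (positions[b][1] - cy) ** 2)

# Example: a thumb covering three buttons of an assumed 3 mm-pitch row.
grid = {"B1": (0, 0), "B2": (3, 0), "B3": (6, 0)}
print(resolve_contact(["B1", "B2", "B3"], grid))   # -> "B2", the middle button
```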

  Such contact detection of the user's finger can be realized by, for example, incorporating a later-described proximity sensor or the like in each button 21L / R.

  Then, the keyboard 20L/R functions as an input means for selecting a function assigned to a button 21L/R: when the soft button 31L/R corresponding to the function assigned to the button 21L/R currently being touched is highlighted, pressing that button selects the function. Here, when a button 21L/R constituting the keyboard 20L/R is pressed down, its key top moves up and down, for example, so that the user can feel that the button has been pressed. Also, since the buttons are densely arranged as described above, several buttons may be pressed at the same time by the thumb; however, a function is not selected unless the button is pressed in a state where its contact is reflected on the display, so only the one desired function that has been highlighted in advance can be selected.

  Next, an input method in the information processing apparatus according to the present embodiment will be described. First, the user grips the information processing device 1 with both hands and places the left and right thumbs on the keyboards 20L/R. When a touch sensor provided on a button 21L/R of the keyboard 20L/R detects contact by the user, information on the functions assigned to the buttons 21L/R of the keyboards 20L/R is automatically displayed on the display screen as the soft keyboard 30. Alternatively, a separate display button or the like for displaying the soft keyboard 30 may be provided and operated to display the soft keyboard 30.

  Then, when the user moves the contact position of the left and right thumbs 40L/R on the keyboards 20L/R, the contact position, that is, the button detecting the contact, moves, and each button notifies the display control unit when it detects contact.

  The display control unit switches the display of the soft button of the soft keyboard corresponding to the button that has detected the contact to the highlight display 32. By looking at the highlight display 32 of the soft keyboard 30, the user can recognize the function of the button 21L/R he or she is touching. At this point the user's contact is only reflected on the soft keyboard 30, and the function assigned to the button 21L/R, such as input of a predetermined alphabetic character, is not yet selected. In other words, by touching the button 21L/R and having the contact reflected on the soft keyboard 30, the character assigned to that button is provisionally selected.

  When the user presses the touched button 21L/R in a state where the desired function is highlighted 32, the function is selected. That is, the function assigned to the touched button 21L/R is temporarily selected by reflecting the contact on the soft keyboard 30, and the function is definitively selected by pressing the button 21L/R in that state. For example, when a function of inputting an alphabetic character or a numerical value is assigned to each button of the keyboards 20L/R, the user can, by repeating such operations, type a sentence such as "Hello." without looking at the buttons 21L/R constituting the keyboards 20L/R, and have it displayed in the input result display unit 12 on the display screen 10.

  Normally, if the keyboard is small and separated to the left and right, it is difficult to visually recognize the key top labels of the buttons 21L/R of the keyboards 20L/R, a finger may cover several buttons so that the label of the button to be operated becomes invisible, and providing two operation input units on the left and right forces the line of sight to move back and forth, making it difficult to recognize the function assigned to each button 21L/R of the keyboard 20. In contrast, in the present embodiment, the information on each function of the keyboards 20L/R is displayed on the display screen 10 as the soft keyboard 30 so that the user can recognize it without looking at the keyboards 20L/R. Then, by detecting the user's contact and reflecting it on the display of the soft keyboard 30, the user can recognize the function assigned to a button 21L/R from the soft keyboard 30, which shows the highlight display 32 corresponding to the button that has detected the contact, regardless of whether the keyboard is small or separated into left and right parts.

  Thus, the user can grasp a function merely by placing a finger on the keyboard 20L/R and touching it, so the keyboard 20L/R can be formed integrally with the display screen 10 to make a small portable information processing device. For example, with the left/right separated miniaturized keyboards 20L/R of the present embodiment, input can be performed with the two thumbs while the device remains gripped in both hands, and because the touched button is highlighted, the function can be recognized easily even when the surroundings are dark.

  Next, a modified example of the present embodiment will be described. FIGS. 2A and 2B are schematic diagrams illustrating an information processing apparatus according to a modification of the first embodiment. As described above, the number of buttons that can be arranged on the left and right keyboards may be limited by the size of the buttons or of the information processing device. In this modification, therefore, an example is described in which left and right button operations are combined so that, for character input, a greater number of character types than the number of buttons can be entered.

  Conventionally, various keyboards and keypads for inputting characters with a small number of buttons have been developed, some of which input characters by combining a plurality of buttons. For example, in TAGTYPE (a two-handed thumb high-speed Japanese input system developed by Leading Edge Design), a total of ten buttons, five on each side, are provided, and a Japanese kana character is entered by pressing its row and its column alternately. That is, the "row" is selected by the first input, at which point the rows from "a" to "wa" are allocated to the ten buttons so that the desired row can be selected. By the second input the "column" is selected and the character is entered; at this point each column from "a" to "o" is assigned to both the left and right sides, so the same character can be entered by pressing either the left or the right button. However, this type of keyboard requires the user to learn the button layout.

  On the other hand, in the present invention, the keyboard (each button) detects the user's contact and reflects it on the display, so the user can know in advance the command indicated by a button, and learning the button arrangement is unnecessary.

  As shown in FIG. 2A, an information processing apparatus 51 according to the present modification includes a display screen 60 provided at the center thereof, and a left keyboard 70L and a right keyboard 70R separately disposed on the left and right sides of the display screen 60. The left and right keyboards 70L / R are composed of a total of 16 buttons 71L / R (a keypad with a touch sensor), each having 8 keys in 2 rows and 4 columns.

  The left and right keyboards 70L/R can each distinguish nine states: which of the eight buttons the user's finger 40L/R, such as a thumb, is touching, or that no button is being touched. Therefore, when one of the eight right buttons 71R is pressed, 8 × 9 = 72 patterns can be distinguished depending on how the left keyboard 70L is being touched (nine states), and likewise 72 patterns when one of the eight left buttons 71L is pressed, depending on how the right keyboard 70R is being touched, for a total of 144 distinguishable input patterns. In general, when there are N buttons on each of the left and right sides, the number of input types is (N + 1) × N × 2.

  As described above, the information processing device 51 can reflect contact with a button on the display screen 60 before the button is pressed, so the user can recognize what function is assigned to the button. For example, as shown in FIG. 2B, assume that when the left thumb 40L touches, for example, the button 71L at the lower left corner of the left keyboard 70L, the right soft keyboard 80R is displayed on the right side of the display screen 60. The right soft keyboard 80R is composed of eight soft buttons 81R; the function information (commands) assigned to the right buttons 71R is switched according to which left button 71L is touched, and the display of the right soft keyboard 80R is switched accordingly.

  In FIG. 2B, eight soft buttons 81R, "A" to "H", are displayed as the right soft keyboard 80R. In this state, when the user touches, for example, the second right button 71R from the lower right, the corresponding soft button 81R ("G") is highlighted, and when that right button 71R is pressed in this state, "G" is input. Then, by shifting the left finger 40L and changing which left button 71L is touched, the soft keyboard 80R displayed on the display screen 60 is switched and other characters can be input. Similarly, a different left soft keyboard 80L is displayed on the left side of the display screen 60 according to the position at which the right buttons 71R are touched.
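
  The chord scheme above can be illustrated with a lookup table: the left button being touched selects which character set the eight right buttons carry, and pressing a right button enters the highlighted character. The layout and character assignments below are assumptions for illustration only, not the patent's actual key map.

```python
# Hedged sketch of the two-handed chord input described above.

LEFT_STATES = [None, "L1", "L2", "L3", "L4", "L5", "L6", "L7", "L8"]   # None = no left contact
RIGHT_BUTTONS = ["R1", "R2", "R3", "R4", "R5", "R6", "R7", "R8"]

# One eight-character set per left-hand state (9 states x 8 right buttons = 72
# inputs; the mirror-image table for pressing left buttons gives 144 in total).
CHAR_TABLE = {
    "L1": list("ABCDEFGH"),      # e.g. FIG. 2B: left thumb on the lower-left button
    "L2": list("IJKLMNOP"),
    "L3": list("QRSTUVWX"),
    # ... the remaining left states would map to further character sets
}

def right_soft_keyboard(left_touch):
    """Characters shown on the right soft keyboard for the current left-hand state."""
    return CHAR_TABLE.get(left_touch, [" "] * 8)

def enter(left_touch, right_pressed):
    """Character entered when a right button is pressed in the given left-hand state."""
    return right_soft_keyboard(left_touch)[RIGHT_BUTTONS.index(right_pressed)]

print(enter("L1", "R7"))   # pressing the seventh right button while touching L1 enters "G"
```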

  In this modification, for example, when contact with a right button 71R is detected, the function information currently assigned to the left keyboard 70L is displayed on the display screen 60 as the left soft keyboard 80L, and when contact with a left button 71L is detected, the function information assigned to the button currently touched by the user's finger 40L is highlighted before the button is pressed. The user can therefore know in advance, before actually pressing a button 71L/R, which character (function) is assigned to which button; even a beginner who has not learned the button arrangement can see the key arrangement on the soft keyboard displayed on the screen simply by moving a finger over the buttons, so operability is very high. A user who has learned the key arrangement can perform high-speed character input without relying on the screen display.

  Further, the display of the soft keyboards 80L/R on the screen can be set to turn off automatically when no finger has touched a button for a certain period of time. This makes it possible to configure a screen interface that displays the keys only during character input, without any special switching command, which is convenient when this information processing device is used in combination with other applications.
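
  As a minimal sketch of this auto-hide behavior, the display can be driven by a simple idle timer; the timeout value and the tick-based polling are assumptions, not specified in the text.

```python
# Hedged sketch: show the soft keyboard while any button reports contact and
# hide it once no contact has been reported for an assumed timeout.

import time

HIDE_AFTER_S = 3.0   # assumed idle period before the soft keyboard disappears

class SoftKeyboardVisibility:
    def __init__(self):
        self.visible = False
        self.last_contact = None

    def on_contact(self):
        self.visible = True
        self.last_contact = time.monotonic()

    def tick(self):
        """Called periodically by the display control unit."""
        if (self.visible and self.last_contact is not None and
                time.monotonic() - self.last_contact > HIDE_AFTER_S):
            self.visible = False   # hide the soft keyboard automatically
```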

  Next, a second embodiment of the present invention will be described. FIG. 3 is a schematic diagram illustrating the information processing apparatus according to the present embodiment. As shown in FIG. 3, the information processing apparatus 101 according to the present embodiment has, as in the first embodiment, a display screen 110 and small keyboards 120L/R arranged at positions separated to the left and right with the display screen 110 interposed therebetween, and the keyboards 120L/R are composed of a plurality of buttons 121L/R each having a built-in touch sensor for detecting contact of the user's fingers 40L/R.

  Here, the information processing apparatus 101 according to the present embodiment has a plurality of function modes (menus): in addition to the text input mode for inputting text as shown in FIG. 1, it has a remote control mode (hereinafter referred to as the remote controller mode). When the function mode is the remote control mode, for example, information on the channel selection functions assigned to the buttons 121L of the left keyboard 120L is displayed on the display screen 110 as the soft keyboard 130, and a channel can be selected with the buttons 121L of the left keyboard 120L. Note that nothing is displayed when a button other than the buttons corresponding to the channel numbers is touched.

  In such a case, when the user touches the button 121L, the corresponding channel 131a of the soft keyboard 130 is highlighted, and by pressing the corresponding button 121L in the highlighted state, the channel can be selected.

  Here, in the present embodiment, not only is the channel 131a highlighted as information relating to the function corresponding to the button 121L whose contact has been detected, but a function information content display window 131b is also provided on the screen 110, in which the content of the function assigned to the touched button, for example character information or an image explaining the function, can be displayed in more detail than on the soft keyboard 130. Specifically, when a channel selection function is assigned as in the present embodiment, character information describing the broadcast station can be displayed, or the video currently being broadcast on that channel can be displayed temporarily.

  Also, for example, when the function mode is switched and the device is used as a mobile phone, the buttons are given the function of inputting a telephone number; in such a case, the function information content display window 131b may simply show an enlarged display of the number assigned to the button whose contact has been detected. In this way, even when different functions are assigned to a button in each mode and functions are attached to buttons dynamically, the user can recognize whether a button corresponds to the desired function (command) before pressing it, that is, before the function is selected and executed.

  In the present embodiment, as in the first embodiment, left/right separated keyboards composed of buttons with touch sensors are mounted, the function corresponding to each button of the keyboard is displayed on the screen, contact with the keyboard is detected, and the corresponding portion of the soft keyboard on the screen is highlighted to give the user feedback on the contact, so that the user can recognize the function assigned to each button. Accordingly, it is not necessary to mark the function of each button with a symbol or the like, the keyboard (each button) can be made small, and the keyboard can therefore be mounted on a small portable information processing device. Further, in the present embodiment, operability can be improved still further by attaching function information to buttons dynamically and displaying the function of the touched button in detail.

  Next, a third embodiment of the present invention will be described. In this embodiment, a small keyboard is mounted as in the first and second embodiments, and a touch panel is further provided on the display screen to further improve operability.

  FIGS. 4A and 4B are schematic diagrams showing an information processing apparatus according to the third embodiment of the present invention. As shown in FIG. 4A, the information processing apparatus 201 has, as in the first and second embodiments, a function information display section 211 on a display screen 210, in which a soft keyboard 230 is displayed as information relating to the functions assigned to the keyboard, and the function information display section 211 also functions as touch panels 240L/R that receive operations from the user. Operation information for receiving operation input from the user is displayed at positions on the display screen 210 that do not overlap the soft keyboard 230. The touch panels 240L/R are provided with a screen contact position detection means (not shown) for detecting the position at which the user touches the display screen 210, and the function corresponding to the display item of the operation information displayed at the contact position detected by the screen contact position detection means can be selected.

  In the example shown in FIG. 4A, a function for inputting a numerical value, an alphabetic character, or the like is assigned to each button 221L/R of the left and right keyboards 220L/R, and the soft keyboard 230 indicating these functions is displayed on the screen 210. The touch panels 240L/R are provided at positions on the screen 210 that the left and right thumbs can reach, and are given a plurality of display items for selecting functions, such as a conversion function unit 241a for converting hiragana input to kanji and a deletion function unit 241b for deleting input information.

  Then, the user can combine the input from each button 221L / R of the keyboard 220L / R and the input from the touch panel 240L / R. By touching the display item, conversion candidates and commands are displayed, and the corresponding conversion candidates and commands can be selected.

  Further, as shown in FIG. 4B, information on the functions assigned to the buttons 321L of the left keyboard 320L may be displayed as a soft keyboard 330, and a scroll bar 340 for adjusting parameters related to the selected function may be displayed on the touch panel at a position that the right thumb can reach.

  For example, different functions such as volume and channel are assigned to the buttons 321L of the left keyboard 320L. When the left keyboard 320L is operated, the soft button on the screen corresponding to the volume function is highlighted 331a, and the button 321L is pressed to select the volume function, the scroll bar 340 switches to the function of adjusting the volume, and the volume can be adjusted by sliding the right thumb 40R along the scroll bar 340. Likewise, when a channel function is selected by touching a button on the keyboard and pressing it while the soft button corresponding to the TV remote control function is highlighted, the scroll bar switches to the channel changing function, and the channel can be switched by moving the finger along the scroll bar while touching it.

  Also in the present embodiment, the information relating to the function assigned to the button whose contact has been detected may not only be highlighted but may also be displayed as character information 331b or displayed enlarged.

  Next, a fourth embodiment of the present invention will be described. In the present embodiment, the user's contact with the information processing device is detected, and the function mode of the keyboard is automatically switched based on the detected contact. Switching between the display of the soft keyboard 30 of the information processing apparatus 1 shown in FIG. 1 and the display of the soft keyboard 130 of the information processing apparatus 101 shown in FIG. 3, or between the keyboard display and the parameter adjustment display of FIG. 4, may be performed by providing a switching button and operating it; however, the switching can also be performed by automatic mode switching as in the present embodiment.

  In other words, the function mode is switched according to whether neither hand, only one of the left and right hands, or both hands are touching the device, and accordingly the functions of the left and right input means, the function information (soft keyboard) displayed on the display screen, and the operation information (touch panel) for performing input operations from the screen are switched.

  For example, as shown in FIG. 5A, when neither the user's hand nor thumb 40L/R is in contact with the left or right keyboard 420L/R, nothing is displayed on the display screen 410. As shown in FIG. 5B, when contact of the user's hands (thumbs) 40L/R is detected on both the left and right keyboards 420L/R, the text input mode is assumed, a soft keyboard 430a similar to that of the first embodiment is displayed, and input operations can be performed using the left and right keyboards 420L/R.

  Also, when, as shown in FIG. 5(c), contact of only the left hand (left thumb) 40L is detected and contact of the right hand (right thumb) 40R is not detected, or when, as shown in FIG. 5(d), contact of only the right hand (right thumb) 40R is detected and contact of the left hand (left thumb) 40L is not detected, the device may enter menu selection mode 1 or menu selection mode 2, respectively, and display the mode-specific soft keyboard 430b or 430c.

  The function mode automatically returns to a preset normal mode when a certain time elapses without a finger touching the device, and switches automatically when a finger makes contact. The mode can thus be switched automatically by detecting whether the user's thumbs are touching both keyboards, as shown in FIG. 5B, or only one keyboard, as shown in FIGS. 5C and 5D. By switching the mode by contact in this way, no special mode switching command for designating the text input mode or the like is required.
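
  The contact-based mode selection just described amounts to a small decision table. The sketch below mirrors the panels of FIG. 5; the mode names and the exact mapping are assumptions drawn from the description above.

```python
# Hedged sketch of contact-based function mode switching (cf. FIG. 5).

def select_mode(left_touching: bool, right_touching: bool) -> str:
    if left_touching and right_touching:
        return "text_input"        # FIG. 5(b): soft keyboard 430a displayed
    if left_touching:
        return "menu_selection_1"  # FIG. 5(c): soft keyboard 430b displayed
    if right_touching:
        return "menu_selection_2"  # FIG. 5(d): soft keyboard 430c displayed
    return "idle"                  # FIG. 5(a): nothing displayed

# After a period with no contact the device would fall back to a preset normal
# mode; an idle timer like the one sketched earlier can drive that transition.
```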

  Next, a fifth embodiment of the present invention will be described. In the present embodiment, a characteristic finger movement of the user on the keyboard or the touch panel is detected, a function is assigned to this finger movement (movement pattern), and operability is thereby improved further.

  As shown in FIG. 6, the information processing apparatus 501 includes keyboards 520L/R composed of a plurality of buttons with built-in touch sensors at positions that the left and right thumbs can reach, information relating to the functions assigned to the keyboards 520L/R is displayed as a soft keyboard 530 on the screen 510, and the user's contact with the keyboards 520L/R is fed back to the soft keyboard 530 on the screen 510. In this state, a function can be selected by pressing a button 521L/R of the keyboards 520L/R. In addition, the apparatus has touch panels 540L/R for receiving the user's operations on the screen, enabling various operations.

  Further, in the present embodiment, a movement pattern such as moving the thumb vertically, horizontally, or diagonally while it remains in contact with the key tops of the keyboard 520L/R is also detected as a characteristic movement. Since each button constituting the keyboards 520L/R has a built-in touch sensor, such movements of the user can be detected. By assigning a desired function to each such movement, a characteristic movement such as moving the finger up and down is recognized as a gesture command, and the function assigned to that command can be selected. Information on this function may also be displayed on the screen.
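
  One simple way such movement patterns could be turned into gesture commands is to compare the start and end of the contact trace, as in the sketch below. The thresholds, direction names, and command mapping are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch: classify a thumb trace on the key tops as a coarse gesture.

def classify_gesture(trace):
    """trace: list of (x, y) contact positions sampled while the thumb stays on
    the key tops. Returns a coarse direction usable as a gesture command."""
    if len(trace) < 2:
        return None
    dx = trace[-1][0] - trace[0][0]
    dy = trace[-1][1] - trace[0][1]
    if abs(dx) < 2 and abs(dy) < 2:            # too small to be a deliberate gesture
        return None
    if abs(dx) > 2 * abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    if abs(dy) > 2 * abs(dx):
        return "swipe_down" if dy > 0 else "swipe_up"
    return "swipe_diagonal"

# Assumed example of assigning functions to the recognized movements.
GESTURE_COMMANDS = {"swipe_up": "volume_up", "swipe_down": "volume_down"}
```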

  Similarly, not only characteristic finger movements on the keyboard but also movements on the touch panels 540L/R, which have a contact position detection means for detecting the contact position on the screen, can be used: if the user moves a finger vertically, horizontally, diagonally, or the like on the touch panel, the movement is detected as a characteristic movement and recognized as a gesture command, so that a function can be selected in the same manner.

  Also, a function of moving the mouse cursor in accordance with such a movement may be assigned to one keyboard, and functions corresponding to right click and left click may be assigned to predetermined buttons on the other keyboard, so that while the mouse cursor displayed on the screen is moved along the trajectory of the user's finger on one keyboard, pressing the other keyboard selects the function corresponding to a right click or left click of a mouse. In this case as well, by displaying on the screen information indicating the function of the touched button before the right-click or left-click function is executed, the user can recognize the function of the button he or she is touching before selecting it.

  Next, a sixth embodiment of the present invention will be described. In the above-described first to fifth embodiments, keyboards composed of a plurality of buttons are arranged on the left and right; however, by providing a position sensor or the like on a single pad, that single pad can also be used as a keyboard. FIG. 7 is a perspective view schematically showing the keypad in the present embodiment. As shown in FIG. 7, the keyboard (keypad) 820 is formed in a three-layer structure of a tactile feedback generator (piezo actuator or vibration actuator) 821 as a vibration generating means for transmitting a click feeling to the user, a pressure sensor 822, and a finger position sensor 823.

  The keypad 820 as the input means in the present embodiment detects the user's contact position with the finger position sensor 823. Therefore, different functions (commands) can be assigned based on the contact position, and there is no need to provide a separate button for each command. The pressure sensor 822 detects whether or not the user has pressed down at a predetermined pressure or more while the contact position of the user's finger is detected, and this is reflected in the input operation. As in the above-described embodiments, the user can recognize from the soft keyboard displayed on the display screen which position his or her finger is touching and what function is assigned to that position. In addition, since contact with the keypad 820 is reflected on the display, the user can select a desired function by looking at the display screen even without knowing which function is assigned to which position on the keypad 820.

  FIG. 8 is a flowchart showing the operation of such a keypad 820. As shown in FIG. 8, the finger position sensor 823 detects whether or not the user's finger is touching the pad (step S1); if the finger is touching, the screen display is updated according to the finger position (step S2). Then, the pressure sensor 822 detects whether or not the pressure is equal to or higher than a predetermined pressure (step S3); if it is, the input operation for the finger position detected by the finger position sensor 823 is accepted (step S4), a click feeling is generated by the tactile feedback device 821 (step S5), and the process returns to step S1.
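
  The flow of FIG. 8 can be summarized as the loop below. The sensor and actuator interfaces, the callback, and the pressure threshold are assumptions standing in for the hardware described above; this is a sketch, not the patent's implementation.

```python
# Hedged sketch of the keypad operation loop of FIG. 8 (steps S1-S5).

PRESS_THRESHOLD = 1.0   # assumed pressure threshold (arbitrary units)

def keypad_loop(position_sensor, pressure_sensor, haptics, display, functions, on_input):
    """functions: assumed mapping from detected contact area -> assigned function."""
    while True:
        pos = position_sensor.read()                   # S1: is a finger touching the pad?
        if pos is None:
            continue
        display.highlight(functions[pos])              # S2: update the screen for this position
        if pressure_sensor.read() >= PRESS_THRESHOLD:  # S3: pressed at or above the threshold?
            on_input(functions[pos])                   # S4: accept the input operation
            haptics.click()                            # S5: drive the actuator for a click feeling
```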

Here, assuming that the finger position sensor 823 of the keypad 820 can detect N areas, N types of input are possible when only one finger is in contact and no other finger is touching the pad. If the finger position sensor 823 can detect a plurality of contact positions at the same time, bringing one finger into contact with any position on the keypad and another finger into contact with any other position makes it possible to distinguish N × (N − 1) further states. Therefore, a total of N + N × (N − 1) = N² inputs are possible.

  Here, even in the case of a keypad 820 that detects the contact position with a position sensor as in the present embodiment, not only a single contact but also a plurality of contacts can be detected to identify the input state, just as with the keyboards 20L/R composed of a plurality of buttons 21L/R shown in FIG. 1. For example, as shown in FIG. 9A, the keypad (keyboard) 870 of a standard mobile phone or the like has twelve buttons 871 in a 3 × 4 layout; by providing a touch sensor on each button 871, or a position sensor that individually detects contact with the area of each of the twelve buttons 871, 144 types of characters can be input. That is, all of the alphabetic characters (uppercase and lowercase), numbers, and symbols can be input.

  At this time, a soft keypad 880 as shown in FIG. 9B, showing the function information that can be input with each button, is displayed on the display screen, so the user does not need to memorize all the combinations in advance and can select the characters to be input using this soft keypad 880 as a guide. Further, as the user learns the combinations of positions, input can be performed without relying on the soft keypad 880 as a guide.

In the example of the soft keypad 880 shown in FIG. 9(b), the characters that can be entered when a finger is placed on the "6" button 871 or on the "7" button 871 of FIG. 9(a) are shown as a left character 881L and a right character 881R, respectively. For example, when the finger 40R is placed on the "6" button 871, the left characters 881L of the soft buttons 881 of the soft keypad 880 shown in FIG. 9(b) are presented as the characters that can be input. If, in this state, the finger 40L touches the "7" button 871, the contact is reflected on the display and the letter "M" assigned to that button is highlighted; pressing that button while "M" is highlighted enters the letter "M". Similarly, when the finger 40L is placed on the "7" button 871, the right characters 881R of the soft buttons 881 of the soft keypad 880 are presented; if, in this state, the finger 40R touches the "6" button 871, the letter "L" assigned to that button is highlighted, and pressing it in this state enters the letter "L".

  Further, whether the keyboard is composed of a plurality of buttons 21L/R as shown in FIG. 1 or is a keypad 820 that detects the finger contact position with the position sensor 823 as shown in FIG. 7, the determination of which button or position the finger is touching can treat not only single contacts but also simultaneous contacts with a plurality of buttons or positions as distinct states. For example, as shown in FIGS. 10A and 10B, in the case of a keypad 920 composed of six 2 × 3 buttons, if the state in which a single button 921 is touched and the state in which two adjacent buttons 921a and 921b are touched simultaneously are treated as different states, a total of 13 states can be recognized and used for input: the 6 states A, B, C, D, E, and F, and the 7 states AB, CD, EF, AC, BD, CE, and DF. Thus, even if the number of buttons or positions assigned to functions is small, many types of input are possible.
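
  The 13-state count above can be checked by enumerating the single-button contacts and the adjacent pairs; the sketch below assumes the grid layout A B / C D / E F, which is an assumption consistent with the listed pairs.

```python
# Hedged sketch verifying the 13 input states of the 2 x 3 keypad described above.

GRID = [["A", "B"], ["C", "D"], ["E", "F"]]   # assumed layout: A B / C D / E F

singles = [b for row in GRID for b in row]                # 6 single-button states
pairs = []
for r in range(3):
    pairs.append(GRID[r][0] + GRID[r][1])                 # horizontal neighbours
for r in range(2):
    for c in range(2):
        pairs.append(GRID[r][c] + GRID[r + 1][c])         # vertical neighbours

print(singles)                       # ['A', 'B', 'C', 'D', 'E', 'F']
print(pairs)                         # ['AB', 'CD', 'EF', 'AC', 'BD', 'CE', 'DF']
print(len(singles) + len(pairs))     # 13 recognisable input states
```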

  Next, a seventh embodiment of the present invention will be described. Conventionally, there are input devices for information processing apparatuses that comprise a liquid crystal touch panel and a keypad mounted on the upper surface of the liquid crystal touch panel. In the present embodiment, the present invention is applied to such an input device. FIG. 11A is a schematic diagram illustrating an information processing apparatus according to the present embodiment, and FIG. 11C is a cross-sectional view schematically illustrating an operation button portion of the keypad.

  As shown in FIG. 11A, the information processing apparatus 951 has a keypad 970 as a detachable input means mounted on a liquid crystal touch panel (a display item of operation information) 990 on a display screen 960, and input operations can be performed by pressing the operation buttons 971 provided on the keypad 970. When an operation button 971 of the keypad 970 is pressed, an operation position signal is generated on the touch panel 990 arranged below it, and the operation button code assigned to the section corresponding to that operation position is output. That is, the operation buttons 971 of the keypad 970 are arranged so as to press the display screen 960 at the positions corresponding to the touch panel 990 displayed on the display screen 960, and are thereby assigned the functions indicated by the display items. The user's contact with an operation button 971 is detected, and this contact detection result is reflected on a soft keypad 980 serving as the display of function information indicating the display items. Then, when an operation button 971 is pressed in a state where the contact detection result for that button is reflected on the display of the soft keypad 980, the operation button 971 comes into contact with the display screen, and the function corresponding to the display item at the position corresponding to the contact position is selected.

  In such a keypad 970, normally, as shown in FIG. 11B, the operation buttons move up and down perpendicular to the liquid crystal touch panel surface when the user presses the key top 973; the key top 973 is supported only at part of its periphery, or is urged by a spring in the direction away from the liquid crystal touch panel 990, so that it can be pressed down. In contrast to such a conventional keypad 970, in the present embodiment, as shown in FIG. 11C, a touch sensor electrode 972 for detecting the user's contact is attached to the key top 973 of each operation button 971, so that a keypad 970 with a touch sensing function can be configured. Accordingly, it is possible to recognize which operation button 971 of the keypad 970 the user's finger is touching by means of the touch sensor provided independently for each operation button 971.

  If the keypad 970 is formed of, for example, a light-transmissive material, the highlighted display of the function assigned to the operation button 971 whose contact has been detected can be seen even though the display screen 960 is covered by the keypad 970, and the user can recognize the function assigned to the operation button 971 before pressing it.

  Alternatively, as in the first embodiment, a soft keypad 980 showing the functions corresponding to the keypad 970 may be displayed on the display screen 960 in a region other than the region where the keypad 970 is mounted, that is, outside the liquid crystal touch panel 990, so that the soft keypad 980 reflects contact with the operation buttons 971 of the keypad 970.

  Further, a processing device having a display screen that receives an input from such an input device may be prepared, and the touch of the operation button 971 of the keypad 970 may be displayed on the display screen of the processing device. That is, the keypad portion may be separated from the display. In any case, the input operation can be performed by detecting the contact of the user with the operation button 971 and pressing the operation button 971 in a state where the detection result is reflected on the display.

  For example, this separate keypad can be used as an input device for changing channels or the like on a television monitor in a living room. In this case, when the user's finger touches the keypad, a soft keyboard consisting of a group of soft buttons indicating TV channels or the like is displayed on the television monitor, and the user's contact is reflected on the television monitor, for example by highlighting the corresponding soft button. The user can select a function by holding the keypad in the hand, touching it with a finger, and pressing a button while the desired soft button is highlighted. The user can thus operate the keypad and select a desired function without looking at it, and therefore without taking his or her eyes off the television monitor.

  Further, the above-described embodiments can be modified in various ways without departing from the gist of the present invention. For example, in the above-described first embodiment, the keyboard is described as having a plurality of buttons arranged in a two-dimensional array; however, since the user's contact is fed back to the screen, the input operation can be performed without looking at the keyboard, so a one-dimensional array of one column or one row may also be used. That is, as shown in FIG. 12, the keyboard 620 may be a single row of a plurality of buttons 621. Further, the keyboard is not limited to two keyboards provided within reach of the right and left thumbs; for example, keyboards may be provided at the top and bottom, or four keyboards may be provided at the top, bottom, left, and right so that the device can be used both vertically and horizontally.

  Further, as shown in FIG. 13, other buttons 621b may be arranged concentrically around one button 621a, whereby the movement of the finger during character input operations can be reduced and operability is further improved.

  Furthermore, the keyboard has been described as being assigned functions of inputting characters or numbers, a telephone number, a channel, or the like; however, instead of a button, a jog dial that can be pressed and rotated may, for example, be provided so as to protrude slightly from the surface. In this case as well, a plurality of functions are assigned to the jog dial; when the jog dial is touched, information on these functions is displayed on the display screen, the jog dial is rotated until the function to be selected is highlighted, and the jog dial is then pressed in this state to select it.

  Next, an eighth embodiment of the present invention will be described. In the present embodiment, the keypad of the above-described sixth embodiment is provided with an operation unit, such as a jog dial, that receives a rotation operation. In conventional portable information processing apparatuses such as mobile phones and portable music players, cross keys, buttons, jog dials, and the like are used as input means. Since these are mechanical parts, they have problems in terms of durability, weight, and size. In addition, because the intended use of a mechanical part is determined in advance, a button cannot, for example, serve as a substitute for a jog dial, nor a jog dial for a button.

  On the other hand, the keypads 820 and 920 described above, for example, detect the contact of the finger and the contact position with an electrostatic position sensor, and detect a pushing or pressing operation by the user's finger with a pressure sensor. When a pressing operation occurs, a vibration actuator serving as vibration generation means is driven, so that a click sensation can be generated as if a mechanical button were pressed. In the present embodiment, in addition to providing such virtual buttons according to the contact position on the keypad, an operation part (hereinafter referred to as a rotation operation part) is provided on which a rotation operation like that of a jog dial can be performed virtually (hereinafter, such an operation is referred to as a contact operation); functions are assigned to these contact positions and/or contact operations, and operability is improved by forming them integrally with a touch panel.
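  As a rough illustrative sketch only, a contact position reported by such a position sensor might be classified into a virtual button or the annular rotation operation part as follows; the geometry, the thresholds, and the actuator call are assumptions for illustration:

import math

# Assumed geometry of the virtual controls (units are arbitrary sensor units).
CENTER = (50.0, 50.0)
BUTTON_RADIUS = 8.0                    # center button 1021a
RING_INNER, RING_OUTER = 20.0, 30.0    # annular rotation operation part

def classify_contact(x, y):
    """Map a raw (x, y) contact position to a virtual control name."""
    dx, dy = x - CENTER[0], y - CENTER[1]
    r = math.hypot(dx, dy)
    if r <= BUTTON_RADIUS:
        return "center_button"
    if RING_INNER <= r <= RING_OUTER:
        return "rotation_part"
    return None    # contact outside any virtual control

def on_press(control, drive_actuator):
    """When the pressure sensor reports a press, emit a click-like vibration."""
    if control is not None:
        drive_actuator()               # hypothetical vibration actuator driver
        print("select function assigned to", control)

if __name__ == "__main__":
    c = classify_contact(52.0, 73.0)   # a point that falls on the ring
    print("touched:", c)
    on_press(c, lambda: print("click vibration"))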

  Here, a press is detected as a single event for the entire apparatus, but the contact of the user's finger detected by the electrostatic position sensor can virtually constitute a plurality of buttons by combining contact positions, and a virtual jog dial (rotation operation unit) can be constituted by detecting a contact operation. In this case, if the button and the rotation operation unit are formed with a shape that can be confirmed by touching the surface of the electrostatic position sensor with a finger, for example a concave or convex portion, the user can be assisted in confirming the positions of the button and the jog dial.

  FIG. 14A is an enlarged schematic view showing an input unit as a keypad in which buttons and a rotation operation unit are formed integrally, and FIGS. 14B to 14D are schematic diagrams for explaining how the input unit is operated. For example, as shown in FIG. 14A, the input unit 1020 is configured such that four buttons 1021b to 1021e, each circular when viewed from above, are arranged concentrically around a circular button 1021a, and an annular rotation operation unit 1030 is provided along their outer periphery. Here, in FIGS. 14A to 14D, the buttons 1021a to 1021e and the rotation operation unit 1030 are assumed to be formed in a convex shape so that the user can confirm them by touch. In the present embodiment, the buttons and the rotation operation unit are described as being formed in a convex shape, but they may also be formed in a concave shape, as long as the contact position can be detected. The operation unit may have any shape, such as circular, rectangular, or rod-like, as long as a contact operation such as the user's contact position and contact direction can be detected.

  Different functions are assigned to the buttons 1021a to 1021e and to the rotation operation unit 1030. An electrostatic position sensor (not shown) serving as contact detection means detects contact with the buttons 1021a to 1021e, that is, the contact position on the input unit 1020, and display control means (not shown) reflects the detection result on the display of the function information relating to the function assigned to the touched button.

  That is, as a method of operating each of the buttons 1021a to 1021e, as shown in FIG. 14B, a finger is brought into contact with, for example, the position of the button 1021c; this contact is reflected on the display of the function information (not shown) on the display screen for the function assigned to the button 1021c, putting that function in a temporarily selected state, and a pushing operation is then performed in this state. Thus, the function assigned to the corresponding button can be permanently selected.

  In addition, the electrostatic position sensor detects a contact operation, such as sliding a finger along the circumferential direction of the rotation operation unit 1030, and/or the contact position, and the display control means reflects the contact detection result, such as the direction of contact and from which position and by how much the finger has been slid, on the function information on the display screen (not shown) for the function assigned to the rotation operation unit 1030.

  That is, as a method of operating the rotation operation unit 1030, as shown in FIG. 14C, a finger is slid along a guide on the ring indicating the rotation operation unit. In this case as well, the electrostatic position sensor detects the user's contact operation on the rotation operation unit 1030, and the display control means reflects the contact detection result on the display of the function information, thereby establishing a temporary selection. Specifically, in accordance with the contact position detection result for the rotation operation unit 1030, the display control means selects a function mode set in advance for the rotation operation unit 1030, for example a title selection mode for selecting a desired title from a plurality of titles or a volume adjustment mode for adjusting the volume. It then selects a function (hereinafter referred to as a selection function) that sets or adjusts a parameter in the selected function mode, for example selecting a title from a title list in accordance with a contact operation detection result such as the amount by which the finger is slid on the rotation operation unit 1030, or selecting an amount of increase or decrease from the current volume in accordance with a contact operation detection result such as the amount and direction of the slide. The selected result is then reflected on the display of the function information: in the title selection mode, the titles are scrolled and displayed in sequence according to the amount of the slide, and in the volume adjustment mode, the selected volume level is displayed, so that the contact operation detection result is reflected in the function information. Then, when the rotation operation unit 1030 is pressed while the function mode and the selection function are reflected on the display, the desired title or the desired volume level can be permanently selected. When only one function mode is assigned to the rotation operation unit 1030, the contact is reflected on the display of the function information for that function mode regardless of which position of the rotation operation unit 1030 the user touches.
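  A minimal sketch of how a circumferential slide might be turned into title scrolling or volume adjustment is given below; the mode names, the step size of 30 degrees, and the title list are assumed for illustration and are not specified by the disclosure:

# Sketch of turning a circumferential slide on the rotation operation part into
# either title scrolling or volume adjustment (mode names and step sizes assumed).

TITLES = ["Song A", "Song B", "Song C", "Song D"]

class RotationState:
    def __init__(self, mode):
        self.mode = mode          # "title_select" or "volume"
        self.title_index = 0
        self.volume = 5           # arbitrary initial level

    def on_slide(self, degrees):
        """degrees > 0 : clockwise slide, degrees < 0 : counter-clockwise."""
        steps = int(degrees / 30)            # assume 30 degrees per step
        if self.mode == "title_select":
            self.title_index = (self.title_index + steps) % len(TITLES)
            print("temporarily selected title:", TITLES[self.title_index])
        elif self.mode == "volume":
            self.volume = max(0, min(10, self.volume + steps))
            print("temporarily selected volume:", self.volume)

    def on_press(self):
        """Pressing the ring confirms (permanently selects) the shown value."""
        if self.mode == "title_select":
            print("play", TITLES[self.title_index])
        else:
            print("set volume to", self.volume)

if __name__ == "__main__":
    s = RotationState("title_select")
    s.on_slide(95)    # slide roughly a quarter turn clockwise
    s.on_press()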

  In the present embodiment, the function mode and the selection function are selected by pressing the rotation operation unit 1030 in a state where the contact is reflected on the display of the function information; alternatively, a scroll bar having such a selection function (a function of selecting or adjusting a parameter relating to a predetermined function mode) may be displayed on the display screen.

  Further, different functions or function modes can be assigned to the rotation operation unit 1030 according to the contact position. For example, as shown in FIG. 14D, the annular rotation operation unit 1030 formed along the outer circumference of the concentrically arranged buttons 1021b to 1021e is divided into four parts, and four function modes can be assigned in accordance with the contact positions. For example, a function mode for adjusting the volume is assigned to the right-hand part of the rotation operation unit 1030 near the button 1021c, and a function mode for selecting a title is assigned to the lower part of the rotation operation unit 1030 near the button 1021d, so that the function mode can be switched according to the contact position. A function (selection function) for adjusting or selecting a parameter in each function mode may be made selectable in accordance with a contact operation on the rotation operation unit, or a separate button for selecting the selection function may be provided.
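  Purely as an illustration, dividing the ring into four angular sectors and mapping the first touch position to a function mode could be sketched as follows; the sector boundaries and the two extra mode names are assumptions:

import math

# Sketch of assigning a different function mode to each quarter of the ring,
# based only on where the finger first touches (sector boundaries are assumed).

SECTOR_MODES = ["volume", "title_select", "skip", "repeat"]   # right, bottom, left, top (assumed)

def mode_for_contact(x, y, center=(0.0, 0.0)):
    """Return the function mode for a touch at (x, y) on the ring."""
    angle = math.degrees(math.atan2(y - center[1], x - center[0])) % 360.0
    # Quadrants centred on the right (0 deg), bottom (90 deg, y grows downward
    # in screen coordinates), left (180 deg), and top (270 deg) parts of the ring.
    sector = int(((angle + 45.0) % 360.0) // 90.0)
    return SECTOR_MODES[sector]

if __name__ == "__main__":
    print(mode_for_contact(10.0, 1.0))    # near the right-hand part -> "volume"
    print(mode_for_contact(-1.0, 12.0))   # near the lower part -> "title_select"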

  With such an input unit, the user can recognize what function is assigned to the part of the rotation operation unit 1030 currently being touched without looking at his or her hand. By pressing the rotation operation unit 1030 in this temporarily selected state, the press (pressing operation) of the rotation operation unit 1030 is detected, and the function can be selected. In addition, by driving the vibration actuator in accordance with the movement of the finger on the rotation operation unit 1030, it is possible to produce the sensation that the rotation operation unit is actually rotating like a jog dial.

  In addition, as described above, different functions can be set for the buttons 1021a to 1021e, and different functions may also be set depending on the manner of contact, such as touching one of the buttons 1021a to 1021e or touching two of them at the same time. When any of these various functions is touched, the result of the touch is reflected in the function information on the display screen, so that the user does not need to look at his or her hand or memorize which function is assigned to which button.

  FIG. 15 shows another example of the rotation operation unit. FIG. 15 shows a case where the input unit 1100 has a button 1121a, four buttons 1121b to 1121e arranged concentrically around the button 1121a, and an annular virtual rotation operation unit 1130 provided at the same radial position as the four buttons 1121b to 1121e. That is, the four buttons 1121b to 1121e are arranged on the circumference of the rotation operation unit 1130. Since the buttons 1121b to 1121e are arranged on the rotation operation unit 1130, the size of the input unit 1100 can be further reduced. When one of the buttons 1121b to 1121e formed on the rotation operation unit 1130 is touched, the contact is reflected in the function information, and the function can be selected by pressing the button in this reflected state. In addition, when the finger is slid on the rotation operation unit 1130 and the contact operation is reflected in the function information, the function assigned to that contact operation can be selected by pressing the rotation operation unit 1130.

  As described above, virtual buttons and a rotation operation unit are formed on a touch panel, different functions, or function modes and selection functions, are assigned to contact positions and contact operations on the touch panel, and, when a contact is detected, the function information of the function assigned to the contact position or contact operation is displayed; the function is then selected by pressing the touch panel in that state. In this way, the operability and function selectability of the input unit can be improved. Further, by forming concave or convex portions on the touch panel to provide the virtual buttons and the rotation operation unit, the user's operation can be assisted.

  Next, an example of a button constituting the keyboard serving as the input means described in the above embodiments will be described, together with a specific example using this button. As described above, the information input device described in Japanese Patent Application No. 2002-023700, filed earlier by the present inventors, can be applied as a button capable of detecting the state in which the user's thumb is in contact with the keytop.

  FIG. 16 is a cross-sectional view illustrating a configuration example of an information input device 701 to which the present invention is applied. The information input device 701 includes a button 711 and a proximity sensor 712.

  The button 711 is provided with a pressing portion 711a that is pressed (operated) by a living body (a user's finger or the like) in the case 711b so as to be slidable vertically in the drawing.

  A spring 711c is provided between the pressing portion 711a and the case 711b, and a contact 711d is provided in the case 711b below the pressing portion 711a.

  When the user's finger or the like is not in contact with the pressing portion 711a (in the initial state), as shown in FIG. 16, the pressing portion 711a is urged upward in the figure by the spring 711c, and its lower end is located away from the contact 711d. That is, at this time, the contact 711d is OFF.

  Now, when the user presses down the pressing portion 711a with his or her finger against the urging force of the spring 711c, the contact 711d is turned ON. Thereafter, when the user releases the finger from the pressing portion 711a, the pressing portion 711a is pushed upward by the restoring force of the spring 711c and returns to its original position (returns to the initial state), and the contact 711d is turned OFF.

  As described above, the button 711 supplies a manual input by the user to the information processing apparatus as information indicating the ON state of the contact 711d. In other words, the button 711 detects an input based on physical contact with the living body (an operation in which the user presses the pressing unit 711a with his / her finger or the like) as the ON state of the contact 711d.

  In this example, the button 711 is configured such that the contact 711d is turned ON when the pressing portion 711a is pressed and turned OFF when the pressing portion 711a returns to its original position (initial state); however, the relationship between the position of the pressing portion 711a and the ON or OFF state of the contact 711d is not particularly limited. For example, the button 711 may be configured such that the contact 711d is in the OFF state when the pressing portion 711a is pressed and is turned ON when the pressing portion 711a returns to its original position (initial state).

  In this case, the button 711 detects an input based on physical contact with the living body (an operation in which the user presses the pressing unit 711a with his / her finger or the like) as the OFF state of the contact 711d.
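  A small sketch of how firmware might treat either polarity of the contact 711d as a "press", as described above, is shown below; the sensor-reading callable and the event names are hypothetical:

# Sketch: the contact 711d may report ON or OFF when pressed, depending on how
# the button is built, so the polarity is made configurable (names hypothetical).

class ContactButton:
    def __init__(self, read_contact, pressed_level=True):
        """
        read_contact  : callable returning the raw contact state (True = ON)
        pressed_level : raw level that means "pressed" for this hardware
        """
        self.read_contact = read_contact
        self.pressed_level = pressed_level
        self._was_pressed = False

    def poll(self):
        """Return 'press', 'release', or None, by comparing with the last state."""
        pressed = (self.read_contact() == self.pressed_level)
        event = None
        if pressed and not self._was_pressed:
            event = "press"
        elif not pressed and self._was_pressed:
            event = "release"
        self._was_pressed = pressed
        return event

if __name__ == "__main__":
    samples = iter([False, True, True, False])   # simulated raw contact levels
    btn = ContactButton(lambda: next(samples), pressed_level=True)
    for _ in range(4):
        print(btn.poll())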

  The proximity sensor 712 is disposed between the flat plate portion 711e of the pressing portion 711a and the spring 711c. When a living body (such as the user's finger) contacts the pressing portion 711a or approaches it sufficiently, the proximity sensor 712 functions as contact detection means that detects the approaching living body as an input based on the proximity of the living body, generates a detection signal (hereinafter referred to as approach information) for the detected living body, and supplies it to the information processing apparatus.

  The configuration of the proximity sensor 712 is not limited as long as it can detect a living body approaching it and can be disposed near the button 711 (the pressing portion 711a); in this example, the configuration shown in FIG. 17 (a configuration disclosed by the present applicant in Japanese Patent Application No. 2001-151499) is used.

  That is, FIG. 17 is a block diagram illustrating a configuration example of the proximity sensor 712. The proximity sensor 712 includes linear transmission electrodes 722-1 to 722-3; an oscillator 721 that supplies them with an AC current of a predetermined frequency (for example, 100 Hz) for transmission; linear reception electrodes 723-1 to 723-4 that receive the AC current from the transmission electrodes 722-1 to 722-3 by electrostatic action; a receiver 724 that receives the AC current flowing through the reception electrodes 723-1 to 723-4; and a processor 725 to which the output of the oscillator 721 and the output of the receiver 724 are input.

  The proximity sensor 712 is also provided, as necessary, with switches 726-1 to 726-3 between the oscillator 721 and the transmission electrodes 722-1 to 722-3, respectively, and with switches 727-1 to 727-4 between the reception electrodes 723-1 to 723-4 and the receiver 724, respectively. These switches 726-1 to 726-3 and 727-1 to 727-4 are turned on at predetermined timings (for example, at the timing when the oscillator 721 outputs the alternating current). The receiver 724 comprises an AM demodulator including a band-pass filter (BPF) 724a that passes only AC current in a predetermined frequency band, an amplifier 724b, and a detector 724c, and an A/D converter 724d that performs analog-to-digital conversion (A/D conversion) of the detection output from the AM demodulator.
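  Only as an illustrative sketch of such a receiver chain (band-pass filter, detection, and sampling of the intensity), the following Python fragment may help; the sample rate, filter style, and attenuation values are assumptions, not measured values:

import numpy as np

# Sketch of the receiver chain described above: band-pass filter the received
# current, detect (rectify and average) it, then sample its intensity.

FS = 10_000.0        # sample rate [Hz] (assumed)
F_CARRIER = 100.0    # transmission frequency [Hz], as in the example above

def bandpass(signal, low, high):
    """Crude FFT-based band-pass filter standing in for BPF 724a."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    spectrum[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

def detect_intensity(received):
    """Amplify/rectify/smooth like amplifier 724b and detector 724c, then
    return one intensity value, as the A/D converter 724d would."""
    filtered = bandpass(received, F_CARRIER * 0.5, F_CARRIER * 1.5)
    envelope = np.abs(filtered)          # rectification
    return float(envelope.mean())        # smoothed intensity sample

if __name__ == "__main__":
    t = np.arange(0, 0.2, 1.0 / FS)
    carrier = np.sin(2 * np.pi * F_CARRIER * t)
    no_finger = 1.00 * carrier           # full coupling through capacitor Ca
    finger = 0.70 * carrier              # weakened coupling near a finger (assumed)
    print("intensity, no finger:", round(detect_intensity(no_finger), 3))
    print("intensity, finger   :", round(detect_intensity(finger), 3))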

  The reception electrodes 723-1 to 723-4 are arranged so as to be substantially orthogonal to the transmission electrodes 722-1 to 722-3, forming respective intersections. At these intersections, the electrodes are not in contact with each other.

  In other words, as shown in FIG. 18, at the intersection of the transmission electrode 722 and the reception electrode 723, a circuit equivalent to the capacitor Ca for storing electric charges is substantially formed.

  Therefore, when the alternating current oscillated and output by the oscillator 721 is supplied to the transmission electrode 722, an alternating current flows to the opposing reception electrode 723 through the intersection (capacitor Ca) by electrostatic induction.

  That is, when the oscillator 721 applies an AC voltage to the transmission electrode 722, an AC current is generated at the reception electrode 723 on the basis of the capacitive coupling of the capacitor Ca between the transmission electrode 722 and the reception electrode 723, and this AC current is supplied to the receiver 724.

  The receiver 724 supplies the intensity of the supplied alternating current (the alternating current input via the capacitor Ca) to the processor 725 as digital data (a reception signal). The intensity of the alternating current input to the receiver 724 via the capacitor Ca depends only on the capacitance of the capacitor Ca. In addition, the capacitance of the capacitor Ca keeps a static, fixed value unless the transmission electrode 722 or the reception electrode 723 is deformed. Therefore, as long as the same AC voltage is applied to the transmission electrode 722, the intensity of the AC current input to the receiver 724 via the capacitor Ca has a constant value.

  However, when a living body (such as a fingertip of a user) approaches an intersection between the transmission electrode 722 and the reception electrode 723, an equivalent circuit at the intersection becomes as shown in FIG.

  That is, since the living body can be regarded as a virtual ground point (earth), a series circuit, in which a virtual capacitor Cb1 formed between the living body and the transmission electrode 722 and a virtual capacitor Cb2 formed between the living body and the reception electrode 723 are connected in series, is connected in parallel with the above-described capacitor Ca formed between the transmission electrode 722 and the reception electrode 723.

  Therefore, when an AC voltage is applied to the transmission electrode 722 side, the intensity of the AC current received by the reception electrode 723 via the capacitor Ca and supplied to the receiver 724 is weakened by the amount of current that flows to ground (the living body) via the capacitor Cb1.

  As described above, the capacitance of the capacitor Ca is static and keeps a fixed value unless the transmission electrode 722 or the reception electrode 723 is deformed; the capacitances of the capacitors Cb1 and Cb2, however, vary with the position of the living body, becoming larger as the living body approaches the transmission electrode 722 or the reception electrode 723.

  Utilizing this phenomenon, the processor 725 uses the reception signal, which has been demodulated by the AM demodulator (BPF 724a, amplifier 724b, and detector 724c) of the receiver 724 and further converted into a digital signal by the A/D converter 724d, to determine whether or not a living body is approaching the intersection between the electrodes, or to measure how closely the living body is approaching (the distance between the living body and the intersection).

  Therefore, as shown in FIG. 17, when each of the plurality of buttons 711-1 to 711-12 arranged in a matrix is placed above one of the intersections and the user's finger touches a button 711-i (i is any value from 1 to 12), the processor 725 determines that a living body is approaching the button 711-i (the intersection located below it), that is, detects that the living body is approaching, and supplies approach information (a detection signal) indicating this to the information processing device.
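  For illustration only, locating the touched intersection in such a 4-by-3 matrix by scanning one transmission electrode at a time might be sketched as follows; the measurement callable, the baseline table, and the drop threshold are assumptions:

# Sketch of locating which of the 12 buttons (4 rows x 3 columns) a finger is
# over, by driving one transmission electrode at a time and reading the four
# reception electrodes.

N_TX, N_RX = 3, 4
DROP_THRESHOLD = 0.15   # fractional drop in received intensity meaning "finger near"

def scan(measure, baseline):
    """
    measure(tx, rx)  : hypothetical function returning the received intensity at
                       the (tx, rx) intersection while electrode tx is driven.
    baseline[tx][rx] : intensities recorded with no finger present.
    Returns the (row, column) of the nearest intersection, or None.
    """
    best, best_drop = None, DROP_THRESHOLD
    for tx in range(N_TX):
        for rx in range(N_RX):
            drop = (baseline[tx][rx] - measure(tx, rx)) / baseline[tx][rx]
            if drop > best_drop:
                best, best_drop = (rx, tx), drop   # row = rx index, column = tx index
    return best

if __name__ == "__main__":
    baseline = [[1.0] * N_RX for _ in range(N_TX)]
    # Simulate a finger over the intersection of tx electrode 1 and rx electrode 2.
    fake = lambda tx, rx: 0.6 if (tx, rx) == (1, 2) else 1.0
    print("finger near button at row, column:", scan(fake, baseline))   # -> (2, 1)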

  In the example of FIG. 17, the buttons 711-1 to 711-12 are arranged in a matrix of 4 rows and 3 columns, so four reception electrodes 723-1 to 723-4 and three transmission electrodes 722-1 to 722-3 are arranged; however, the numbers of transmission electrodes 722 and reception electrodes 723 are not limited. It is preferable, though, that one intersection of a transmission electrode 722 and a reception electrode 723 is always placed below each button, as described later.

  Further, in the example of FIG. 17, the angle between the transmission electrode 722 and the reception electrode 723 at the intersection is approximately 90 °, but this angle is not limited. That is, as long as the transmission electrode 722 and the reception electrode 723 are arranged so as not to contact each other and to form an intersection, the arrangement method is not limited.

  Next, an outline of an operation example of an information processing device to which the information input device 701 is applied will be described. Details of this operation example will be described later as an operation example of the information processing device 731 in FIG. 20A.

  As described above, the information input device 701 includes the button 711, which detects an input based on physical contact of a living body as the ON state (or the OFF state) of the contact 711d, and the proximity sensor 712, which detects an input based on the proximity of a living body (such as the user's finger) to the button 711 (the approach of the living body).

  Thus, with an information processing apparatus to which the information input device 701 is applied, the user can use the technique called a "tool tip", which is normally used as part of a mouse interface, by means of a physical (hardware) button.

  That is, the "tool tip" is a technique that enables the following operations (a) to (c), and by using the "tool tip" the user can obtain information on the functions assigned to soft buttons or icons on the screen before causing the information processing apparatus to execute those functions.

(A)
When the user operates the mouse to place the mouse cursor on a soft button or icon on the screen, the information processing apparatus displays information on the function assigned to that soft button or icon (for example, the name of the function) in a pop-up on the screen.

(B)
When the user presses a mouse button (left-clicks), the information processing device executes a function assigned to the soft button or icon.

(C)
When the user operates the mouse to move the mouse cursor to another position without pressing the mouse button (without left-clicking), the information processing apparatus does not execute the function assigned to the soft button or icon.

  Then, the information processing device to which the information input device 701 is applied can execute operations corresponding to the above (a) to (c), for example the following operations (A) to (C); a minimal code sketch of this flow is given after the list.

(A)
When the user places his or her finger on the pressing portion 711a of the button 711 (when the finger touches it), the proximity sensor 712 detects the contact and inputs approach information to the information processing device. Based on the input approach information, the information processing apparatus displays a tooltip on the screen showing information (for example, the function name) on the function assigned to the button 711.

(B)
When the user presses the pressing unit 711a with the finger (when the button 711 is operated), the information processing device executes a function assigned to the button 711.

(C)
If the user does not press the pressing portion 711a (does not operate the button 711) and releases his or her finger from it (moves the finger to another position), the information processing device does not execute the function assigned to the button 711.
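  The sketch referred to above follows; the function table, the display calls, and the event names are assumptions used only to illustrate operations (A) to (C) for a single button:

# Sketch of the hardware "tool tip" behaviour (A)-(C) above for one button.

FUNCTIONS = {"button_711": ("Redial", lambda: print("redialling last number..."))}

def on_event(event, button="button_711"):
    name, action = FUNCTIONS[button]
    if event == "approach":        # (A) finger rests on the pressing portion
        print(f"tooltip: '{name}'")  # show the function name on the screen
    elif event == "press":         # (B) pressing portion pushed down
        action()                     # execute the assigned function
    elif event == "leave":         # (C) finger moved away without pressing
        print("tooltip hidden, nothing executed")

if __name__ == "__main__":
    for ev in ["approach", "leave", "approach", "press"]:
        on_event(ev)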

  Next, a specific example of the information processing apparatus of the present invention will be described. FIG. 20A is a block diagram illustrating a configuration example of an information processing device 731 in this specific example.

  A CPU (Central Processing Unit) 741 executes various processes according to a program stored in a ROM (Read Only Memory) 742 or a program loaded from a storage unit 748 to a RAM (Random Access Memory) 743.

  The RAM 743 also appropriately stores data necessary for the CPU 741 to execute various processes.

  The CPU 741, the ROM 742, and the RAM 743 are mutually connected via a bus 744. The input / output interface 745 is also connected to the bus 744.

  The input/output interface 745 is connected, as an input unit, to the information input device 701 of FIG. 16 described above. That is, the input/output interface 745 is connected to the button 711 and to the proximity sensor 712, which is disposed near the button 711 and detects the user's contact with the button 711.

  The input / output interface 745 is also connected to an output unit 747 including a display, a storage unit 748 including a hard disk, and a communication unit 749 including a modem, a terminal adapter, and the like. The communication unit 749 performs communication processing such as communication via a network or wireless communication. Note that the storage unit 748 may be omitted as necessary.

  Further, a drive 750 is connected to the input/output interface 745 as necessary; a magnetic disk 761, an optical disk 762, a magneto-optical disk 763, a semiconductor memory 764, or the like is mounted as appropriate, and a computer program read from it is installed in the storage unit 748 as needed.

  Although not shown, the information processing device 731 is provided with a block that executes various functions, for example, a telephone communication function of a mobile phone, as needed, in addition to the above-described blocks.

  When the keypad portion is used as a device (remote controller) separated from the display (display device), the configuration is as shown in FIG. 20B. That is, as shown in FIG. 20B, the information processing device 751 is a device in which an information input device 752 and a display device 753 are separated. The information input device 752 is provided with a transmitting unit 713 that receives the output data from the button 711, to which a predetermined function is assigned, and from the proximity sensor 712, which detects the user's contact with the button 711, and transmits the data to the display device 753. The display device 753 has a receiving unit 750 that receives data, such as a contact detection result, transmitted from the transmitting unit 713 of the information input device 752; the CPU 741, for example, functions as display control means for controlling the display of the output unit 747 based on the received data. That is, as described above, when the receiving unit 750 receives a contact detection signal indicating that the proximity sensor 712 has detected contact, the CPU 741 reflects the function of the touched button 711 in the display information; when pressing of the button 711 is detected in this state and the receiving unit 750 receives a signal notifying the press, the CPU 741 performs control so that the function is selected. The other configurations are the same as those of the information processing apparatus in which the information input device and the display device are integrated, shown in FIG. 20A.
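  As a sketch only, the separated configuration of FIG. 20B might exchange simple touch and press notifications as follows; the message format and the in-memory queue standing in for the link between the transmitting unit 713 and the receiving unit 750 are assumptions:

from queue import Queue

# Sketch of the separated configuration: the keypad side sends contact / press
# notifications, and the display side reflects or selects functions on reception.

link = Queue()   # stands in for the transmitting unit 713 -> receiving unit 750 path

def keypad_side(events):
    """Information input device 752: forward sensor output as simple messages."""
    for kind, button in events:
        link.put({"kind": kind, "button": button})

def display_side(functions):
    """Display device 753: the CPU reflects touches and selects on presses."""
    highlighted = None
    while not link.empty():
        msg = link.get()
        if msg["kind"] == "touch":
            highlighted = msg["button"]
            print("display: highlight", functions[highlighted])
        elif msg["kind"] == "press" and msg["button"] == highlighted:
            print("display: execute", functions[highlighted])
            highlighted = None

if __name__ == "__main__":
    funcs = {7: "channel up", 8: "channel down"}   # assumed assignments
    keypad_side([("touch", 7), ("press", 7)])
    display_side(funcs)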

  Next, an example of a function execution process of the information processing device 731 will be described with reference to the flowchart of FIG. 21.

  Now, it is assumed that the power of the information processing device 731 is turned on (the CPU 741 and the like have been activated) and a predetermined initial screen is displayed on the display (output unit 747).

  At this time, in step S11, the CPU 741 determines whether or not it is in the first state, in which the living body (finger 771) is in contact with (approaching) the button 711 as shown in FIG. 23.

  Now, as shown in FIG. 22, assuming that the user's finger 771 is not touching the button 711, the proximity sensor 712 does not detect a living body (does not output approach information), so the CPU 741 determines (recognizes) that it is not in the first state, returns to step S11, and again determines whether or not it is in the first state. That is, the CPU 741 constantly monitors whether or not a living body has touched the button 711, and repeats this processing until a living body comes into contact with the button 711 (until the proximity sensor 712 detects the living body).

  Thereafter, as shown in FIG. 23, when the user places his or her finger 771 on the button 711, the proximity sensor 712 detects the living body (finger 771) and inputs approach information (a detection signal) to the input/output interface 745.

  Therefore, in step S11, the CPU 741 acquires this approach information via the bus 744, thereby determining (recognizing) the first state, and in step S12 displays information regarding the function assigned to the button 711 on the display (output unit 747) via the bus 744 and the input/output interface 745.

  In step S13, the CPU 741 determines whether or not it is in the second state, in which the living body (finger 771) operates (presses) the button 711 as shown in FIG. 24; if it is not in the second state, the process returns to step S11 and the subsequent processes are repeated.

  For example, when the user releases the finger 771 from the button 711 (returns to the state of FIG. 22), the CPU 741 erases the display of the information on the function assigned to the button 711, and continues the processing until the user places the finger 771 on the button 711 again and the first state of FIG. 23 is entered.

  If the first state in FIG. 23 remains, the CPU 741 repeats the processing of steps S11 to S13. That is, the CPU 741 keeps the information on the function assigned to the button 711 displayed on the display.

  Now, as shown in FIG. 24, when the user presses the button 711 with his or her finger 771, the button 711 inputs a signal corresponding to the pressing operation (information indicating that the contact 711d is ON) to the input/output interface 745.

  Therefore, in step S13, the CPU 741 obtains this signal (the information indicating that the contact 711d is in the ON state) via the bus 744, thereby determining (recognizing) the second state, and in step S14 executes the function assigned to the button 711.

  In step S15, the CPU 741 determines whether or not the power of the information processing apparatus has been turned off. When it is determined that the power has been turned off, the CPU 741 terminates the process; otherwise, the process returns to step S11 and the subsequent processes are repeated.
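  Purely as an illustration of steps S11 to S15 above, the polling loop might be sketched as follows; the sensor-polling callables, the simulated readings, and the display calls are assumptions:

# Sketch of the function execution process of FIG. 21 (steps S11 to S15).

def run(poll_sensors, power_off, execute, show_info, hide_info):
    """
    poll_sensors() -> (approach, pressed) from proximity sensor 712 / contact 711d
    power_off()    -> True once the apparatus is switched off
    """
    while not power_off():                 # S15: stop when powered off
        approach, pressed = poll_sensors()
        if not approach:                   # S11: not yet the first state
            hide_info()
            continue
        show_info()                        # S12: display the function information
        if pressed:                        # S13: second state (button pressed)?
            execute()                      # S14: run the assigned function

if __name__ == "__main__":
    # Simulated sensor readings per polling cycle: (finger present, contact ON).
    samples = [(False, False), (True, False), (True, True), (False, False)]
    cycle = {"n": 0}

    def poll_sensors():
        s = samples[cycle["n"]] if cycle["n"] < len(samples) else (False, False)
        cycle["n"] += 1
        return s

    run(
        poll_sensors=poll_sensors,
        power_off=lambda: cycle["n"] > len(samples),
        execute=lambda: print("execute the function assigned to button 711"),
        show_info=lambda: print("show function information"),
        hide_info=lambda: print("hide function information"),
    )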

  As described above, by using the information processing device 731, the user can use a technique corresponding to the above-described “tool tip” without using a mouse.

  Further, in the above-described embodiments, hardware configurations have been described; however, the present invention is not limited to these, and arbitrary processing may also be realized by causing a CPU (Central Processing Unit) to execute a computer program. In this case, the computer program can be provided by being recorded on a recording medium, or can be provided by being transmitted via the Internet or another transmission medium.

FIG. 1 is a schematic diagram illustrating an information processing device according to a first embodiment of the present invention.
(a) and (b) are schematic diagrams showing the information processing apparatus in a modification of the first embodiment of the present invention.
FIG. 9 is a schematic diagram illustrating an information processing device according to a second embodiment of the present invention.
(a) and (b) are schematic diagrams showing the information processing apparatus in the third embodiment of the present invention.
FIG. 14 is a schematic diagram illustrating an information processing device according to a fourth embodiment of the present invention.
FIG. 14 is a schematic diagram illustrating an information processing device according to a fifth embodiment of the present invention.
It is a perspective view schematically showing the keypad in a sixth embodiment of the present invention.
It is a flowchart showing the input operation on the keypad.
(a) is a schematic diagram illustrating a keypad input method, and (b) is a schematic diagram illustrating a display example of a soft keyboard.
It is a schematic diagram explaining another example of the input method using the contact position of a keypad or the buttons of a keyboard.
(a) is a schematic diagram illustrating an information processing apparatus according to a seventh embodiment of the present invention, (c) is a cross-sectional view schematically illustrating an operation button portion, and (b) is a cross-sectional view schematically showing a conventional operation button portion.
FIG. 9 is a schematic diagram illustrating a modification of the input unit of the information processing device according to the embodiment of the present invention.
FIG. 14 is a schematic diagram illustrating another modification of the input unit of the information processing device according to the embodiment of the present invention.
(a) is an enlarged schematic diagram showing an input unit formed by integrating a button and a rotary operation unit, and (b) to (d) are schematic diagrams for explaining its operation method.
FIG. 15 is an enlarged view of an input unit formed by integrating a button and a rotary operation unit, and is a schematic diagram showing an input unit different from FIG. 14.
FIG. 4 is a cross-sectional view schematically illustrating a specific example of an input unit of the information processing apparatus to which the present invention is applied.
It is a block diagram showing the proximity sensor in the example of the present invention.
FIG. 16 is an equivalent circuit diagram at an intersection between a transmission electrode and a reception electrode of the proximity sensor shown in FIG. 15.
FIG. 16 is an equivalent circuit diagram at the intersection of the proximity sensor shown in FIG. 15 when a living body approaches the intersection of the transmission electrode and the reception electrode.
It is a block diagram showing the information processor in the example of the present invention.
FIG. 14 is a block diagram illustrating another information processing device according to a specific example of the present invention.
5 is a flowchart illustrating an operation of the information processing apparatus according to the specific example of the present invention.
It is a schematic diagram showing the initial state of the button of the information processing apparatus in the specific example of this invention.
It is a schematic diagram showing the first state of the button of the information processing apparatus in the specific example of this invention.
It is a schematic diagram showing the second state of the button of the information processing apparatus in the specific example of this invention.

Explanation of reference numerals

1, 51, 101, 201, 301, 401, 501, 901 Information processing device, 10, 60, 110, 210, 310, 410, 510, 610 Display screen, 11, 111, 211 Function information display section, 12, 212 Input result display section, 20L / R, 70L / R, 120L / R, 220L / R, 320L / R, 420, 520L / R, 620 keyboard, 21L / R, 71L / R, 121L / R, 221L / R, 321 L / R, 521 L / R button, 30, 80 L / R, 130, 230, 330, 430, 930 Soft keyboard, 31 L / R, 81 L / R, 131 L / R, 231 L / R, 331 L / R, 431 L / R Soft button, 40L / R thumb, 240, 340, 540, 940 Touch panel

Claims (19)

  1. An information processing apparatus comprising:
    input means to which one or more functions are assigned according to a contact position;
    contact detection means for detecting physical contact with the input means;
    display means for displaying function information on the one or more functions assigned to the input means; and
    display control means for reflecting a contact detection result of the contact detection means on the display of the function information,
    wherein the function is selected when the input means is pressed in a state where the contact detection result is reflected on the display of the function information.
  2.   The information processing apparatus according to claim 1, wherein the input unit has a plurality of buttons having the contact detection unit, and a function is assigned to each button.
  3.   The information processing apparatus according to claim 2, wherein the plurality of buttons of the input unit are arranged at intervals such that two or more buttons can be touched simultaneously by one finger.
  4.   The information processing apparatus according to claim 1, wherein, when a plurality of contact positions are detected by the contact detection unit, the display control unit reflects different function information on the display according to the combination of the contact positions.
  5.   The information processing apparatus according to claim 1, wherein, when a contact is detected by the contact detection means, the display control means displays information on the function corresponding to the contact position brighter than information on other functions.
  6.   The information processing apparatus according to claim 1, wherein, when a contact is detected by the contact detection unit, the display control unit displays information on the function corresponding to the contact position larger than information on other functions.
  7.   The information processing apparatus according to claim 1, wherein the apparatus is a portable device, and the input means is provided separately at positions where the user can perform input operations with his or her thumbs in a state where the user holds both sides of the main body with the left and right hands, respectively.
  8.   The information processing apparatus according to claim 7, further comprising mode switching means for switching the mode by detecting a state in which only one of the user's left and right thumbs is in contact with the input means or a state in which both are in contact.
  9. Screen contact position detection means provided on the display screen of the display means, detecting a physical contact position on the display screen,
    The display means displays operation information on a function for receiving an operation input from a user,
    The information processing apparatus according to claim 1, wherein a function corresponding to a display item of the operation information displayed at the contact position detected by the screen contact position detecting means is selected.
  10. The information processing apparatus according to claim 9, wherein the function assigned to the input means has an adjustable parameter,
    the operation information includes information for adjusting the parameter of the function assigned to the input means, and
    the parameter of the selected function is adjusted by moving the contact position in the operation information on the display screen while the function is selected by the input means.
  11. The input means is mounted on a display screen of the display means, has the contact detection means, and has one or more operation buttons for pressing and operating the display screen at a position corresponding to a display item of the operation information. And
    The function indicated by the display item is assigned to the operation button,
    The display control means reflects the contact detection result for the operation button on the display of the function information indicating the display item,
    The information processing apparatus according to claim 9, wherein, when the operation button is pressed and touches the display screen in a state where the contact detection result for the operation button is reflected in the display of the function information, the function corresponding to the display item at the position corresponding to the contact position is selected.
  12. The information processing apparatus according to claim 1, wherein one or more functions are assigned according to a movement pattern of a finger touching the input means, and, when the movement pattern is detected by the contact detection means, the assigned function is selected.
  13. The information processing apparatus according to claim 1, further comprising a vibration generation unit configured to generate a vibration when the input unit is pressed.
  14. The input means has a circular operation unit to which one or more functions are assigned,
    The contact detection unit detects a contact position and / or a circumferential contact operation on the operation unit,
    The information processing apparatus according to claim 13, wherein the display control unit reflects a contact detection result of the contact detection unit on the display of the function information.
  15. The information processing apparatus according to claim 14, wherein the vibration generation unit generates a vibration when the contact detection unit detects a circumferential contact operation on the operation unit.
  16. Different modes are assigned to the operation unit according to the contact position, and a selection function of selecting a parameter in the mode is assigned according to a circumferential angle range and a direction,
    The contact detection unit detects a contact position and a circumferential contact operation on the operation unit,
    The information processing apparatus according to claim 14, wherein the display control means selects a mode and a selection function based on the contact position detection result and the contact operation detection result, respectively, and reflects the selected mode and selection function on the function information.
  17. An input device having the input unit and the contact detection unit, and a display device having the display unit and the display control unit and separated from the input device, are provided,
    The input device has transmission means for transmitting a contact detection result by the contact detection means to the display control means,
    The information processing apparatus according to claim 1, wherein the display device includes a receiving unit that receives a contact detection result from the transmitting unit and outputs the result to the display control unit.
  18. A display step of displaying function information on one or more functions assigned according to the contact position of the input means on a display screen;
    A contact detection step of detecting physical contact with the input means,
    A display control step of reflecting the contact detection result on the display of the function information;
    A selecting step of selecting the function when the input unit is pressed in a state where the result of the contact detection is reflected in the display of the function information.
  19. A program for causing a computer to execute a predetermined operation,
    A display step of displaying function information on one or more functions assigned according to the contact position of the input means on a display screen;
    A contact detection step of detecting physical contact with the input means,
    A display control step of reflecting the contact detection result on the display of the function information;
    A selecting step of selecting the function when the input unit is pressed in a state where the result of the contact detection is reflected on the display of the function information.

JP2004029872A 2003-02-14 2004-02-05 Information processor, information processing method, and program Pending JP2004355606A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2003037370 2003-02-14
JP2003127408 2003-05-02
JP2004029872A JP2004355606A (en) 2003-02-14 2004-02-05 Information processor, information processing method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2004029872A JP2004355606A (en) 2003-02-14 2004-02-05 Information processor, information processing method, and program

Publications (2)

Publication Number Publication Date
JP2004355606A5 JP2004355606A5 (en) 2004-12-16
JP2004355606A true JP2004355606A (en) 2004-12-16

Family

ID=34068884

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004029872A Pending JP2004355606A (en) 2003-02-14 2004-02-05 Information processor, information processing method, and program

Country Status (1)

Country Link
JP (1) JP2004355606A (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006070531A1 (en) * 2004-12-27 2006-07-06 Pioneer Corporation User interface system, user interface device, and control method for electronic equipment
JP2007058569A (en) * 2005-08-24 2007-03-08 Sharp Corp Electronic equipment
JP2007133806A (en) * 2005-11-14 2007-05-31 Ntt Docomo Inc Terminal and control program for terminal
JP2007233552A (en) * 2006-02-28 2007-09-13 Kenwood Corp Device with indicator
WO2007116465A1 (en) * 2006-03-31 2007-10-18 Mitsubishi Denki Kabushiki Kaisha Car operation panel for elevator
WO2008020538A1 (en) * 2006-08-18 2008-02-21 Kyocera Corporation Portable electronic device and method for controlling same
JP2008046971A (en) * 2006-08-18 2008-02-28 Kyocera Corp Portable electronic apparatus and control method thereof
JP2008052581A (en) * 2006-08-25 2008-03-06 Kyocera Corp Portable electronic device and method for controlling display of same
WO2008103018A1 (en) * 2007-02-23 2008-08-28 Tp-I Co., Ltd Virtual keyboard input system using pointing apparatus in digial device
JP2008544361A (en) * 2005-06-14 2008-12-04 メルファス インコーポレイテッド User contact based digital device control apparatus and method including visual input feedback
JP2009503663A (en) * 2005-07-27 2009-01-29 ノキア コーポレイション Method for controlling software functions, electronic device, and computer program product
JP2009099067A (en) * 2007-10-18 2009-05-07 Sharp Corp Portable electronic equipment, and operation control method of portable electronic equipment
JP2010086411A (en) * 2008-10-01 2010-04-15 Canon Inc Information processing apparatus, and information processing method
JP2010086064A (en) * 2008-09-29 2010-04-15 Toshiba Corp Information processor, character input method, and program
JP2010129041A (en) * 2008-12-01 2010-06-10 Denso Corp Input device
JP2010165149A (en) * 2009-01-15 2010-07-29 Victor Co Of Japan Ltd Electronic apparatus and operation control method using touch sensor
WO2011055587A1 (en) * 2009-11-04 2011-05-12 日本電気株式会社 Mobile terminal and display method
JP2011133579A (en) * 2009-12-22 2011-07-07 Olympus Corp Microscope controller and microscope system provided with microscope controller
JP2012079097A (en) * 2010-10-01 2012-04-19 Kddi Corp Information apparatus with key input unit disposed on surface invisible during use, input method and program
WO2012109452A2 (en) * 2011-02-10 2012-08-16 Research In Motion Limited Portable electronic device and method of controlling same
US8310449B1 (en) * 2008-12-23 2012-11-13 Lockheed Martin Corporation Touch interface device, system, and method
JP2013012255A (en) * 2012-10-19 2013-01-17 Jvc Kenwood Corp Electronic device, control method, and program
JP2013047968A (en) * 2012-10-17 2013-03-07 Jvc Kenwood Corp Electronic device, control method, and program
KR101402274B1 (en) * 2007-01-26 2014-06-02 삼성전자주식회사 Apparatus and method for displaying
JP2014135076A (en) * 2014-03-19 2014-07-24 Jvc Kenwood Corp Electronic device, control method, and program
JP2014149843A (en) * 2014-03-19 2014-08-21 Jvc Kenwood Corp Electronic device, control method, and program
US9116616B2 (en) 2011-02-10 2015-08-25 Blackberry Limited Portable electronic device and method of controlling same
JP2016015181A (en) * 2015-10-29 2016-01-28 Kddi株式会社 User interface device, program, and function starting method capable of starting different function according to degree of pressing force
US9372623B2 (en) 2010-04-30 2016-06-21 Nec Corporation Information processing terminal and operation control method for same
US10248312B2 (en) 2008-09-29 2019-04-02 Microsoft Technology Licensing, Llc Glow touch feedback for virtual input devices
US10379626B2 (en) 2012-06-14 2019-08-13 Hiroyuki Ikeda Portable computing device

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7804495B2 (en) 2004-12-27 2010-09-28 Pioneer Corporation User interface system, user interface apparatus, and method of controlling electronic device
WO2006070531A1 (en) * 2004-12-27 2006-07-06 Pioneer Corporation User interface system, user interface device, and control method for electronic equipment
JP2008544361A (en) * 2005-06-14 2008-12-04 メルファス インコーポレイテッド User contact based digital device control apparatus and method including visual input feedback
US8462122B2 (en) 2005-06-14 2013-06-11 Melfas, Inc. Apparatus for controlling digital device based on touch input interface capable of visual input feedback and method for the same
JP2014041655A (en) * 2005-06-14 2014-03-06 Melfas Inc Digital equipment control device and method for user contact base including visual input feedback
JP2012069165A (en) * 2005-06-14 2012-04-05 Melfas Inc Mobile terminal
JP2009503663A (en) * 2005-07-27 2009-01-29 ノキア コーポレイション Method for controlling software functions, electronic device, and computer program product
JP2007058569A (en) * 2005-08-24 2007-03-08 Sharp Corp Electronic equipment
JP2007133806A (en) * 2005-11-14 2007-05-31 Ntt Docomo Inc Terminal and control program for terminal
US8422661B2 (en) 2005-11-14 2013-04-16 Ntt Docomo, Inc. Terminal and control program of terminal
JP2007233552A (en) * 2006-02-28 2007-09-13 Kenwood Corp Device with indicator
WO2007116465A1 (en) * 2006-03-31 2007-10-18 Mitsubishi Denki Kabushiki Kaisha Car operation panel for elevator
JP4657171B2 (en) * 2006-08-18 2011-03-23 京セラ株式会社 Portable electronic device and control method thereof
US8704771B2 (en) 2006-08-18 2014-04-22 Kyocera Corporation Portable electronic apparatus and control method thereof
JP2008046971A (en) * 2006-08-18 2008-02-28 Kyocera Corp Portable electronic apparatus and control method thereof
KR101035814B1 (en) 2006-08-18 2011-05-20 교세라 가부시키가이샤 Portable electronic device and method for controlling same
WO2008020538A1 (en) * 2006-08-18 2008-02-21 Kyocera Corporation Portable electronic device and method for controlling same
JP2008052581A (en) * 2006-08-25 2008-03-06 Kyocera Corp Portable electronic device and method for controlling display of same
KR101402274B1 (en) * 2007-01-26 2014-06-02 삼성전자주식회사 Apparatus and method for displaying
WO2008103018A1 (en) * 2007-02-23 2008-08-28 Tp-I Co., Ltd Virtual keyboard input system using pointing apparatus in digial device
JP2010521022A (en) * 2007-02-23 2010-06-17 ティーピーアイ カンパニー リミテッド Virtual keyboard input system using a pointing device used in digital equipment
JP2009099067A (en) * 2007-10-18 2009-05-07 Sharp Corp Portable electronic equipment, and operation control method of portable electronic equipment
US10248312B2 (en) 2008-09-29 2019-04-02 Microsoft Technology Licensing, Llc Glow touch feedback for virtual input devices
JP2010086064A (en) * 2008-09-29 2010-04-15 Toshiba Corp Information processor, character input method, and program
JP2010086411A (en) * 2008-10-01 2010-04-15 Canon Inc Information processing apparatus, and information processing method
JP2010129041A (en) * 2008-12-01 2010-06-10 Denso Corp Input device
US8310449B1 (en) * 2008-12-23 2012-11-13 Lockheed Martin Corporation Touch interface device, system, and method
JP2010165149A (en) * 2009-01-15 2010-07-29 Victor Co Of Japan Ltd Electronic apparatus and operation control method using touch sensor
US8687103B2 (en) 2009-01-15 2014-04-01 JVC Kenwood Corporation Electronic apparatus and method of operating electronic apparatus through touch sensor
US8570425B2 (en) 2009-01-15 2013-10-29 JVC Kenwood Corporation Electronic apparatus and method of operating electronic apparatus through touch sensor
US8339499B2 (en) 2009-01-15 2012-12-25 Victor Company Of Japan, Ltd. Electronic apparatus and method of operating electronic apparatus through touch sensor
CN102597929A (en) * 2009-11-04 2012-07-18 日本电气株式会社 Mobile terminal and display method
WO2011055587A1 (en) * 2009-11-04 2011-05-12 日本電気株式会社 Mobile terminal and display method
JP5681867B2 (en) * 2009-11-04 2015-03-11 レノボ・イノベーションズ・リミテッド(香港) Mobile terminal and display method
JP2011133579A (en) * 2009-12-22 2011-07-07 Olympus Corp Microscope controller and microscope system provided with microscope controller
US9372623B2 (en) 2010-04-30 2016-06-21 Nec Corporation Information processing terminal and operation control method for same
JP2012079097A (en) * 2010-10-01 2012-04-19 Kddi Corp Information apparatus with key input unit disposed on surface invisible during use, input method and program
WO2012109452A2 (en) * 2011-02-10 2012-08-16 Research In Motion Limited Portable electronic device and method of controlling same
US9116616B2 (en) 2011-02-10 2015-08-25 Blackberry Limited Portable electronic device and method of controlling same
WO2012109452A3 (en) * 2011-02-10 2012-10-11 Research In Motion Limited Portable electronic device and method of controlling same
US10379626B2 (en) 2012-06-14 2019-08-13 Hiroyuki Ikeda Portable computing device
JP2013047968A (en) * 2012-10-17 2013-03-07 Jvc Kenwood Corp Electronic device, control method, and program
JP2013012255A (en) * 2012-10-19 2013-01-17 Jvc Kenwood Corp Electronic device, control method, and program
JP2014149843A (en) * 2014-03-19 2014-08-21 Jvc Kenwood Corp Electronic device, control method, and program
JP2014135076A (en) * 2014-03-19 2014-07-24 Jvc Kenwood Corp Electronic device, control method, and program
JP2016015181A (en) * 2015-10-29 2016-01-28 Kddi株式会社 User interface device, program, and function starting method capable of starting different function according to degree of pressing force

Similar Documents

Publication Publication Date Title
US7170496B2 (en) Zero-front-footprint compact input system
US6947028B2 (en) Active keyboard for handheld electronic gadgets
CN101290540B (en) Integrated keypad system
CN103064629B (en) It is adapted dynamically mancarried electronic aid and the method for graphical control
CN101133385B (en) Hand held electronic device, hand held device and operation method thereof
US8514186B2 (en) Handheld electronic device and operation method thereof
US8669941B2 (en) Method and apparatus for text entry
US6639586B2 (en) Efficient entry of characters from a large character set into a portable information appliance
JP5243967B2 (en) Information input using sensors attached to fingers
CN1307518C (en) Information display input device and information display input method, and information processing device
EP1691263B1 (en) Display actuator
FI116425B (en) Method and apparatus for integrating an extensive keyboard into a small apparatus
EP1456740B1 (en) Using touchscreen by pointing means
US6741235B1 (en) Rapid entry of data and information on a reduced size input area
DE102008000001B4 (en) Integrated hardware and software user interface
US9304602B2 (en) System for capturing event provided from edge of touch screen
JP5323070B2 (en) Virtual keypad system
JP2009217814A (en) Selective rejection of touch contact in edge region of touch surface
US7898527B1 (en) Systems, methods and devices for efficient communication utilizing a reduced number of selectable inputs
JP3778277B2 (en) Information processing apparatus and method
JP2004213269A (en) Character input device
EP0464712A2 (en) Display/input control system for software keyboard in information processing apparatus having integral display/input device
KR101311338B1 (en) Electronic apparatus and method for symbol input
US20030197736A1 (en) User interface for character entry using a minimum number of selection keys
KR101136153B1 (en) User input device, method for recognizing user finger prints, and method for recognizing user touches using a transparent sensor grid panel which is able to recognize finger prints or mult-touch

Legal Events

Date Code Title Description
A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20070131

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20070131

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20080520

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20080717

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20080902