WO2012056864A1 - Input device, information apparatus provided with the input device, program for causing computer to function as input device, and method for using the input device to input characters - Google Patents


Info

Publication number
WO2012056864A1
Authority
WO
WIPO (PCT)
Prior art keywords
finger
user
desk
input device
key
Prior art date
Application number
PCT/JP2011/073203
Other languages
French (fr)
Japanese (ja)
Inventor
Wada Yoshihiro
Original Assignee
Wada Yoshihiro
Priority date
Filing date
Publication date
Application filed by Wada Yoshihiro filed Critical Wada Yoshihiro
Publication of WO2012056864A1 publication Critical patent/WO2012056864A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1662 Details related to the integrated keyboard
    • G06F 1/1673 Arrangements for projecting a virtual keyboard
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F 3/0426 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected, tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • This disclosure relates to information equipment, and more specifically to a technique for optically detecting finger movement and inputting information.
  • Virtual keyboards already in practical use include devices that realize key input by projecting a soft keyboard onto a desk with laser light and detecting the fingers touching the desk.
  • Patent Document 1 (Japanese Patent Laid-Open No. 6-83512) discloses a technique for detecting the movement of fingers with a camera without projecting a soft keyboard onto the desk.
  • Patent Document 2 (Japanese Patent Laid-Open No. 2003-288156) discloses a technique for detecting the three-dimensional position of a finger by providing two photographing means.
  • Patent Document 3 (Japanese translation of PCT publication No. 2004-500657) discloses a technique for detecting the three-dimensional position of a finger using a special three-dimensional sensor.
  • Non-Patent Document 1 (Hafiz Adnan Habib and Muid Mufti, "Real Time Mono Vision Gesture Based Virtual Keyboard System", IEEE Transactions on Consumer Electronics, Vol. 52, Issue 4, Nov. 2006) discloses a technique for detecting finger positions and keystroke operations by detecting the positions of the finger joints.
  • Patent Document 1: JP-A-6-83512; Patent Document 2: JP 2003-288156 A; Patent Document 3: Japanese translation of PCT publication No. 2004-500657.
  • Patent Document 1 discloses a technique for detecting the movement of fingers with a camera without projecting a soft keyboard onto the desk, but does not mention a technique for making the device portable.
  • Patent Document 2 discloses a technique for detecting the three-dimensional position of a finger with two cameras. It suggests that detection is possible with only one camera but does not mention specific means of realizing it.
  • Patent Document 3 discloses a technique for detecting the three-dimensional position of a finger using a special three-dimensional sensor.
  • However, this sensor is not easily obtained, and its resolution for the distance from the sensor to the finger is only about 1 cm.
  • Non-Patent Document 1 discloses a technique for detecting finger positions and keystroke actions by detecting the positions of the finger joints.
  • With this method, as on a normal hard keyboard, the fingertips must strike the desk while held almost vertically.
  • On a hard keyboard the impact of a keystroke is mitigated by the springiness of the key top, but when a key is struck directly on the desk the impact is transmitted straight to the joints, causing pain and increasing fatigue in the fingers and arms; the burden of long key-input sessions is therefore large.
  • Ergonomic hard keyboards have been developed that are less tiring for the fingers and arms even after long key input, but no such key input device has been developed for virtual keyboards.
  • An input device according to one aspect includes: a camera configured to continuously capture a front image of the fingers of a user of the input device when the input device is placed close to a desk; a finger position detection unit configured to detect the positions of the user's fingers from the captured front image; a home position detection unit configured to detect, from the positions of the user's fingers at the moment they touch the desk and come to rest, a home position on a virtual keyboard in which the keys assigned to each finger differ in distance from the camera and whose shape is unique to the user; a keystroke operation detection unit configured to detect that a finger of the user has struck a key on the virtual keyboard; and a key code generation unit configured to generate the key code on the virtual keyboard corresponding to the position where the keystroke occurred.
  • FIG. 3A is a diagram of a virtual keyboard when the fingers are held in a comfortable posture as shown in FIG. 2B, and FIG. 3B is an example in which the rows of key tops are arranged diagonally as on a normal hard keyboard; both are explanatory drawings of the present invention.
  • FIG. 5B is an image diagram of the front image of the fingers in which only the right hand has approached the camera by one row of the virtual keyboard of FIG. 5A.
  • FIG. 6B is an image diagram in which only the right hand has moved away from the camera by one row of the virtual keyboard of FIG. 6A. FIGS. 7 and 8 are functional block diagrams showing examples of the functional configuration of the present invention.
  • FIG. 1A is an explanatory diagram of the three-dimensional space notation used in the following description. The position of the camera 20 is taken as the origin, the plane of the desk as the XY plane, and the height direction above the desk as the Z axis; this fixes only the directions of the coordinate axes. In describing images taken by the camera 20, the X coordinate is represented by a pixel number, and the size of a finger on the image is represented by the number of pixels it occupies. An image of the fingers photographed by the camera 20 from the positional relationship of FIG. 1A is referred to as a front image of the fingers.
  • FIG. 1B is an explanatory diagram of the definitions of the row direction and column direction of the virtual keyboard used in the following description. The direction from the camera toward the distance (the depth direction) is the column direction.
  • FIG. 2A is an example of a system perspective view according to the present embodiment.
  • In FIG. 2A, the mobile terminal placed on the desk and the user's fingers are shown.
  • A portion of the desk image acquired by the camera 20, which is mounted on the side surface of the mobile terminal 10 and arranged close to the desk, is displayed on the finger image display unit 33 of the display 30.
  • By looking at the front image of the fingers, the user can check whether the left hand 50A and right hand 51A are within the shooting range of the camera 20, so that
  • the fingers of both hands fit between the straight lines 21 and 22, which bound the horizontal angle of view of the camera 20.
  • The light irradiation unit 40 illuminates the range between the straight lines 41 and 42 so that the contours of the fingers and nails can be detected more clearly.
  • On the keyboard display unit 32, the finger positions detected by the key input device are displayed superimposed on the virtual keyboard; by watching this, the user can strike keys more accurately.
  • The input character string display unit 31 displays the character string input by the user, assembled from the key codes generated so far by the key input device.
  • FIG. 2B is an example of a perspective view of a system in which a user holds a finger in a comfortable posture.
  • In FIG. 2B, the camera 20 is mounted on the same surface of the mobile terminal 10 as the display 30, so it can also serve as a self-portrait camera.
  • On a normal hard keyboard the key tops of each row are aligned horizontally. Consequently, if the user holds the fingers in such a relaxed posture, the differing lengths of the fingers prevent the keys from being struck correctly, and striking one key tends to press the surrounding keys at the same time.
  • The user of the system according to the present embodiment can choose freely, according to skill level, anything from holding the fingers as on a normal hard keyboard (FIG. 2A) to holding them in a fully relaxed posture (FIG. 2B).
  • FIG. 3A illustrates the virtual keyboard 60A for the case where the fingers are held in a comfortable posture as in FIG. 2B (note that the virtual keyboard is drawn only for explanation; nothing is actually displayed on the desk).
  • In this case, the distance from the camera to the keys of each row varies with the length of each finger. Since the user can freely set the shape of the virtual keyboard through the posture of the fingers, an ergonomic keyboard suited to that user can be configured.
  • Although FIG. 3A shows an integrated virtual keyboard, the left and right hands can be placed farther apart than on a normal hard keyboard, yielding a virtual keyboard split at the center and reducing the burden further. Since the thumbs basically press only the space key, a space can be input no matter where a thumb strikes the virtual keyboard.
  • FIG. 3B illustrates a virtual keyboard 60B in which a row of key tops are arranged obliquely like a normal hard keyboard.
  • In FIG. 3B, the key top of the space key is not drawn because a thumb keystroke is treated as the space key no matter where it lands.
  • Whether the user uses the virtual keyboard of FIG. 3A or that of FIG. 3B is determined by a setting the user enters into the key input device in advance, or by learning with the initial character string described later.
  • FIG. 4 is an explanatory diagram of this embodiment.
  • FIG. 4 illustrates the principle by which the virtual keyboard can be freely configured as described above and the principle by which the key input device detects changes in the distance between the camera and a finger (a specific method of calculating the distance change is described with reference to FIG. 14).
  • The left hand 50B and right hand 51B placed on the home position of the virtual keyboard 60C lie within the range of the straight lines 21 and 22 that bound the horizontal angle of view (viewing angle) of the camera 20, which is mounted on the side surface of the mobile terminal 10 and arranged close to the desk.
  • FIG. 4 shows an example in which the horizontal angle of view of the camera 20 is 90 degrees and the resolution is VGA, with 640 pixels in the horizontal direction.
  • The pixel numbers in the horizontal direction are expressed as 1 to 640.
  • FIG. 4 illustrates a case where the right hand 51B has moved from the home position to the key position 51E one row above.
  • For the right hand 51B at the home position, the pixel 25 capturing the left end of the four non-thumb fingers is pixel number 277, and the pixel 26 capturing the right end is pixel number 147.
  • After moving to the key position 51E one row above, the pixel 27 capturing the left end of the four non-thumb fingers is pixel number 282, and the pixel 28 capturing the right end is pixel number 137.
  • FIG. 4 shows an example of finger placement in which the fingers are held in a straight line, as on a normal hard keyboard, but by the detection principle for back-and-forth movement described above, even when the fingers are held at the home position in a relaxed posture the system according to the present embodiment can detect the movement distance by measuring the change in finger width relative to the finger width at that time.
  • One method of determining that the fingers are at the home position is, for example, to require that over 10 frames the fingers in the image remain approximately stationary, with the vertical movement, the horizontal movement, and the change in finger width each averaging less than one pixel (a sketch of such a stillness test follows below). At that time it is not necessary to distinguish whether the fingers are held in a straight line, as on a normal hard keyboard, or in a relaxed posture.
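  • As an illustration only (not part of the original disclosure), the following Python sketch shows one way such a stillness test could be implemented; the layout of the frame-history array is an assumption.

```python
import numpy as np

# Hypothetical history array of shape (frames, fingers, 3) holding
# (x, z, width) in pixels for each finger in each frame.
def at_home_position(history: np.ndarray, frames: int = 10) -> bool:
    """True if, over the last `frames` frames, every finger's horizontal and
    vertical position and width each changed by less than one pixel on
    average, i.e. the fingers are judged to be resting at the home position."""
    if len(history) < frames:
        return False
    recent = history[-frames:]
    drift = np.abs(np.diff(recent, axis=0)).mean(axis=0)  # per finger/feature
    return bool((drift < 1.0).all())
```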
  • The key point of the system according to the present embodiment is to determine the key on which a finger rests by measuring the distance of forward/backward movement from the change between the reference finger width and the current finger width.
  • The pitch of the virtual keyboard's key tops varies from user to user.
  • However, the average spacing between the fingers at the home position corresponds to the key top pitch, so the key input device can determine the horizontal pitch of the key tops.
  • Since each key top is substantially square, the device can likewise determine how far a finger has moved back and forth in terms of rows of the virtual keyboard assumed by the user (a sketch of this pitch estimate follows below).
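  • The following sketch (an illustration under assumed numbers, not from the original disclosure) estimates the key-top pitch as the average spacing between adjacent non-thumb fingers at the home position; since the key tops are taken to be square, the same value serves as the row depth.

```python
import numpy as np

def key_pitch_px(left_xs, right_xs):
    """Key-top pitch in pixels from the home-position X coordinates of the
    four non-thumb fingers of each hand."""
    gaps = np.concatenate([np.diff(np.sort(left_xs)), np.diff(np.sort(right_xs))])
    return float(np.mean(gaps))

# Example with hypothetical finger positions roughly 25 px apart:
print(key_pitch_px([100, 126, 151, 175], [300, 325, 349, 376]))  # ~25.2
```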
  • The angle α in FIG. 4 is the angle from pixel number 1 at the edge of the horizontal angle of view to the pixel number at which the finger appears. As FIG. 4 shows, once the horizontal pixel number is known, the angle α is determined; this relationship is used in the description of FIG. 14. The horizontal pixel numbers in FIG. 4 are assigned at equal angular intervals, ignoring the distortion caused by camera lens aberrations; the countermeasure for this distortion is also described with reference to FIG. 14.
  • FIG. 5A is an image diagram of a front image of fingers captured by the camera according to the present embodiment.
  • FIG. 5A shows a front image of the left hand 50C and the right hand 51C at the home position.
  • The fingers are posed as if placed on a normal hard keyboard.
  • In the right hand, the four fingers other than the thumb are surrounded by an outline 80A.
  • FIG. 5B is an image diagram in which only the right hand of FIG. 5A has approached the camera by one row of the virtual keyboard.
  • FIG. 5B shows the right hand 51D with its increased apparent size, together with its outline 80B.
  • So far, the change in apparent size caused by moving the fingers back and forth has been explained only in terms of the horizontal pixel count, but as FIGS. 5A and 5B show, the change is actually a change in area. Using this information makes it easier to identify the forward/backward movement distance.
  • FIG. 6A is also an image diagram of a front image of fingers captured by the camera according to the present embodiment.
  • The fingers are in a relaxed posture.
  • FIG. 6A shows finger images of the left hand 52C and the right hand 53C at the home position.
  • the widths 81A, 81B, and 81C indicate the width of the middle finger extracted from the contour of the finger image.
  • FIG. 6B is an image diagram in which only the right hand of FIG. 6A has moved away from the camera by one row of the virtual keyboard and the middle finger has struck a key.
  • FIG. 6B shows the right hand 53D, whose apparent size is smaller than that of the right hand 53C in FIG. 6A, together with the widths 81D, 81E, and 81F of the middle finger performing the keystroke operation.
  • In this way the change in finger width can be captured at several points rather than one, and measurement accuracy can be improved by averaging them.
  • the middle finger of the right hand performing the keystroke operation touches the desk, and the other fingers of the right hand are away from the desk.
  • the left hand that is not performing the keystroke operation remains touching the desk.
  • Even though the fingers of the hand not performing the keystroke remain resting on the desk in a relaxed posture, there is no problem in detecting the keystroke operation.
  • As in FIG. 6B, by taking a front image of the fingers with a camera placed close to the desk, the difference in height above the desk between the finger that struck the key and the other fingers can be analyzed (details are described with reference to FIGS. 16A and 16B).
  • Comparing the case where the fingers are held at the home position in a straight line, as on a normal hard keyboard (FIG. 5A), with the case where they are held at the home position in a relaxed posture (FIG. 6A), the images shot by the camera can hardly reveal whether the fingertips are aligned from side to side. Taking advantage of this fact, the two cases are treated identically: the position of each finger on the virtual keyboard is detected by calculating the relative change in size from the size of each finger and nail image at the home position. As a result, the individuality of the user's fingers is absorbed, and the device can function as an ergonomic keyboard that tires the fingers and arms less.
  • the keyboard according to the present embodiment can be arranged with both hands wider than a general ergonomic keyboard.
  • If depth information of the fingers is available as a 3D image, that front/rear position information can also be used.
  • FIG. 7 is a functional block diagram showing an example of the functional configuration of the present embodiment.
  • In FIG. 7, the key input device 12 is configured as an external device or a built-in module of the mobile terminal 10. Hand images taken by the camera 20 of the camera module 90, placed close to the desk, are preprocessed by a DSP (Digital Signal Processor) 29 inside the camera module, and then feature data such as contour candidates of fingers and nails are extracted by the feature extraction unit 70.
  • The key input processor 13 receives the feature data, such as finger contour candidates, from the DSP 29; all functions from the finger position detection unit 71 to the command determination unit 77 are executed as a software program.
  • The finger position detection unit 71 identifies each finger from the finger contour candidates, determines the position of each finger on the XZ plane, and calculates the size of each finger and nail. When, for example, all fingers have been substantially stationary on the desk for a predetermined time, the home position detection unit 72 stores the positions and sizes of the fingers at that time as the home position on the virtual keyboard. At the home position, when the virtual keyboard is a QWERTY keyboard, the fingers rest on the keys "A", "S", "D", "F", "J", "K", "L", and ";" (semicolon). Since the thumbs basically strike only the space key, their position on the virtual keyboard is not important. If the keys assigned to particular fingers must be changed, for example because of a finger injury, this can be handled by registering the change before using the key input device.
  • Using as references the position of each finger on the XZ plane of the image at the home position and the size of each finger or nail, the finger position detection unit 71 calculates the position of each finger on the virtual keyboard.
  • The keystroke operation detection unit 73 checks the height of each fingertip above the desk and determines whether a keystroke action has occurred. Even when another key is pressed with the right hand while the SHIFT key is held with the left hand, the keystroke operation detection unit 73 can make the determination by calculating the keystroke positions of both hands.
  • the key code generation unit 74 generates a key code corresponding to the position where the keystroke operation was performed, and sends the key code to the host processor 11 of the mobile terminal 10.
  • The detection results of the finger position detection unit 71 and the home position detection unit 72, together with the key codes generated by the key code generation unit 74, are used as input information for learning and determination by the learning unit 75 based on the initial character string, the learning unit 76 during character input, and the command determination unit 77.
  • the learning and determination results are fed back to the finger position detection unit 71, the home position detection unit 72, and the keystroke operation detection unit 73, and the detection parameters are adjusted. Thereby, detection accuracy improves.
  • The learning unit 75 based on the initial character string asks the user, before the key input device 12 is used, to input a predetermined or arbitrary character string or to perform fingertip operations,
  • and adjusts the detection parameters of the finger position detection unit 71, the home position detection unit 72, and the keystroke operation detection unit 73 based on the resulting keystroke position information and the size information of the fingers and nails. Even when the little finger is half hidden behind the ring finger in the front image, its size can be measured by having the learning unit 75 learn it, for example by asking the user to touch the little finger to the desk before and after placing the fingers at the home position.
  • The learning unit 76 during character input collects statistics on how far the coordinates of the striking finger calculated by the finger position detection unit 71 deviate from the coordinates of the centers of the key tops on the virtual keyboard, and corrects the key coordinates accordingly (a sketch of such a running correction follows below).
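  • As an illustration (not from the original disclosure), the following sketch keeps an exponential moving average of where keystrokes land and nudges the stored key-top centers toward it; the data structures and the smoothing rate are assumptions.

```python
class KeyCenterLearner:
    """Running correction of key-top centers from observed keystrokes."""

    def __init__(self, centers: dict, rate: float = 0.1):
        self.centers = dict(centers)  # key name -> (x, y) on the virtual keyboard
        self.rate = rate              # smoothing factor of the moving average

    def observe(self, key: str, hit_xy: tuple):
        """Shift a key's stored center a fraction of the way toward the
        position where the keystroke was actually detected."""
        cx, cy = self.centers[key]
        hx, hy = hit_xy
        self.centers[key] = (cx + self.rate * (hx - cx),
                             cy + self.rate * (hy - cy))

# Example: keystrokes for "F" repeatedly land 2 px right of the stored center.
learner = KeyCenterLearner({"F": (100.0, 50.0)})
for _ in range(10):
    learner.observe("F", (102.0, 50.0))
print(learner.centers["F"])  # x drifts from 100 toward 102
```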
  • the command determination unit 77 will be described with reference to FIG.
  • The host processor 11 receives the output of the key code generation unit 74 at the input key reception unit 78; if the output is a valid character code, the host processor 11 displays the corresponding character on the input character string display unit 31 of the display 30. The host processor 11 also receives the finger positions on the desk detected by the finger position detection unit 71 and shows the finger detection position display 62 superimposed on the keyboard display unit 32 of the display 30.
  • FIG. 8 is a functional block diagram showing another example of the functional configuration of the present embodiment.
  • FIG. 8 illustrates a case where all processing is performed by the processor 14 of the mobile terminal 10.
  • An image captured by the camera 20 that is mounted on the mobile terminal 10 and arranged close to the desk is captured by the processor 14 of the mobile terminal 10.
  • the processor 14 executes all the processes from the feature extraction unit 70 to the command determination unit 77 (for example, extraction of feature data such as nail contour candidates, determination of the validity of the input command, etc.).
  • the detection result is displayed on the display 30.
  • When the number of left-hand fingers placed in contact with the desk is less than the original number, the command determination unit 77 determines that a command to move the mouse pointer (hereinafter also referred to as a "mouse command"; described in detail with reference to FIG. 20) has been issued. The command determination unit 77 then feeds back to the finger position detection unit 71 and the home position detection unit 72 that a mouse command has been issued.
  • The finger position detection unit 71 and the home position detection unit 72 perform the following operations in addition to those described for the embodiment of FIG. 7.
  • The home position detection unit 72 stores the position and size of the single right-hand finger as the current position (home position) of the mouse pointer when only that finger touches the desk and comes to rest.
  • When the position or size of the single finger resting on the desk changes, the finger position detection unit 71 calculates the moving direction and distance from the home position and sends them to the mouse pointer control unit 79.
  • the mouse pointer control unit 79 holds the current position of the mouse pointer display 34 displayed superimposed on the input character string display unit 31.
  • the mouse pointer control unit 79 moves the display position of the mouse pointer display 34 based on the movement information sent from the finger position detection unit 71.
  • The user can thus issue a mouse command, move the mouse pointer to the position of an input error, and click (strike a key) to move the cursor there, making it possible to correct the error.
  • The mouse pointer control function can also be used, for example, to press buttons on the display screen or to scroll the screen even when no keys are being input.
  • The light irradiation unit 40 illuminates the fingers on the desk, sharpening the contours of the fingers and nails photographed by the camera 20 and enabling key input even in a dark environment.
  • The microphone 45 can be used, for example, for the user to tell the mobile terminal by voice that the fingers have been placed at the home position.
  • the speaker 46 can notify the user that the mobile terminal 10 has accepted the key input by outputting a click sound of the key input.
  • FIG. 9 is an explanatory diagram showing an example of a method for inputting a command with a finger stationary at the home position.
  • While FIG. 5A showed the fingers arranged for two-handed key input on the QWERTY keyboard,
  • FIG. 9 shows an image in which the left hand 50D is clenched while the right hand remains arranged for key input.
  • Giving the key input device an image with characteristics different from normal typing in this way indicates that a command is being input.
  • FIG. 10 is an explanatory diagram of the fingers placed at the home position on the virtual numeric keyboard. The fingers are posed as if placed on a normal hard keyboard.
  • The right hand 51F is placed at the home position on the virtual numeric keyboard 60D.
  • By clenching the left hand as shown in FIG. 9, the configuration of the virtual keyboard can be switched instantaneously from the virtual QWERTY keyboard of FIG. 3A to such a virtual numeric keyboard.
  • FIG. 11 is an explanatory diagram of a method for displaying the current detection position of the finger placed on the desk on the virtual keyboard displayed on the display.
  • In FIG. 11, a finger detection position display 62A and a finger center position display 63A indicating its center are displayed superimposed. If a key is struck while the position of the left index finger spans two or more key tops, as in the finger detection position display 62B, an ordinary hard keyboard would register a double keystroke.
  • In the key input device, however, only the F key, on which the finger center position display 63B rests, is input.
  • The virtual keyboard thus tolerates more positional deviation of the fingers than a normal hard keyboard, so erroneous input is less likely even when the detected finger position contains some error.
  • the range in which the camera 20 is shooting is indicated by shooting area displays 64A and 64B.
  • The user can check whether the range used by the virtual keyboard lies within the shooting area by looking at the shooting area displays 64A and 64B. As the user places the fingers farther from the camera 20 and closer together, the range used by the virtual keyboard fits more easily within the horizontal angle of view of the camera 20, and the usable range of the virtual keyboard expands.
  • FIG. 12 is an explanatory diagram of an example of a method for displaying a virtual keyboard, a finger image captured by the camera 20 and an input character string on the display 30.
  • FIG. 12 shows the input character string display unit 31, the keyboard display unit 32, and the finger image display unit 33 as presented on the display 30. In the present embodiment the finger image is displayed mirror-reversed left to right, so the user can easily picture the positions of the fingers on the virtual keyboard.
  • FIG. 13 is an explanatory diagram of an example of a method for measuring the position and size of a finger.
  • FIG. 13 shows, in isolation, the middle finger of the right hand that is striking the key in FIG. 6B.
  • The finger position detection unit 71 extracts the contour 82 of the middle finger. For example, when the light irradiation unit 40 illuminates the user's fingers from the front, the image from the camera 20 shows the fingers as bright against a dark background, so the finger position detection unit 71 can extract the finger contour by taking out the portions where the intensity changes sharply, using differentiation or a similar operation.
  • The finger position detection unit 71 takes ZP, the Z coordinate of the lowest point of the contour, as the Z coordinate of the right middle finger (which, because of the keystroke, is in contact with the desk).
  • Starting from the coordinate value ZP, the finger position detection unit 71 obtains the horizontal width 83 between the left and right contours of the finger at n predetermined heights Z1, Z2, ..., Zn above ZP, and determines the midpoint 84 of each width. It obtains n midpoints in this way and fits a regression line 85 through them by the least squares method. The point 86 where this line intersects the contour of the finger is taken as the X coordinate value XP of the middle finger on the image. The X coordinate is not simply taken as the lowest point of the contour because a fingertip performing a keystroke is flattened and may have many lowest points.
  • Next, through each of the n points where the regression line 85 crosses the n predetermined heights on the Z axis, a straight line is drawn perpendicular to the regression line 85.
  • The line segment 87 that this straight line cuts off within the contour 82 of the middle finger is the diameter of the middle finger at the height Zn.
  • The finger position detection unit 71 obtains n diameters in this way and takes their average value as the average diameter (in pixels) of the middle finger.
  • The finger position detection unit 71 thus determines the position and the size (here, the average diameter) of the middle finger.
  • The other fingers can be measured in the same manner, but what matters most is the coordinate of the finger performing the keystroke.
  • The finger position detection unit 71 can therefore reduce computation time by, for example, first obtaining the lowest contour point of every finger, determining from the Z-axis relationship of all the fingers which finger is performing a keystroke, and applying the above procedure only to that finger.
  • The X coordinate value XP of the middle finger obtained here is a horizontal pixel number on the image.
  • Likewise, the average diameter of the finger is the number of pixels the finger occupies on the image. It is therefore necessary to derive the finger's movement distance on the virtual keyboard from its home position on the basis of these pixel counts (a sketch of this measurement procedure follows below).
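  • The following sketch (illustrative; the contour arrays are assumed to have been extracted already) reproduces the measurement of FIG. 13: midpoints of the horizontal widths are fitted with a least-squares line, the line's intersection with the fingertip gives XP, and the widths, corrected for the tilt of the centerline, give the average diameter.

```python
import numpy as np

def finger_position_and_diameter(zs, left, right, zp):
    """XP and average diameter (pixels) of one finger.

    zs: the n predetermined heights Z1..Zn above the lowest contour point;
    left, right: X coordinates of the finger contour at those heights;
    zp: Z coordinate of the lowest contour point (desk contact)."""
    zs, left, right = map(np.asarray, (zs, left, right))
    mid = (left + right) / 2.0                 # midpoints 84 of the widths 83
    a, b = np.polyfit(zs, mid, 1)              # regression line 85: x = a*z + b
    xp = a * zp + b                            # point 86: finger X coordinate
    # Segment 87: width measured perpendicular to the tilted centerline.
    diameters = (right - left) * np.cos(np.arctan(a))
    return float(xp), float(diameters.mean())

# Example: a finger tilted slightly to the right as Z increases.
zs = np.array([5.0, 10.0, 15.0])
print(finger_position_and_diameter(zs, zs * 0.1 + 100, zs * 0.1 + 124, 0.0))
```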
  • FIG. 14 is an explanatory diagram showing an example of a method for calculating the horizontal movement distance from the home position of the finger on the virtual keyboard.
  • FIG. 14 illustrates a state in which the left hand 52D at the home position has moved on the virtual keyboard diagonally toward the camera, to the position of the left hand 52E (only the middle finger, which performs the keystroke, is shown).
  • Let the horizontal angle of view of the camera be the angle θ and the number of pixels in the horizontal direction be N (not shown).
  • Let the actual average diameter of the left middle finger be D mm (not shown),
  • and let the average diameter of the middle finger image be d pixels (not shown).
  • Let the average diameter (in pixels) of the left middle finger image at the home position P1 be d1,
  • and let the angle from the left end of the horizontal angle of view be α1. Suppose the left middle finger moves to the keying position P2 on the desk, where its average diameter (in pixels) is d2 and the angle from the left end of the horizontal angle of view is α2.
  • As shown in FIG. 4, the angle from the left end of the horizontal angle of view corresponds to the horizontal pixel number. Therefore, if a conversion table between angle and horizontal pixel number, including correction of the distortion caused by the lens aberrations of the camera 20, is stored in advance in a memory (not shown) of the key input device, the angle can be obtained immediately from the pixel number (a sketch of such a table follows below).
  • The horizontal pixel numbers are integers corresponding to the pixel count, while the angles corresponding to them include fractional values. Since the finger position is obtained from the regression line 85 as described for FIG. 13, the pixel number itself is also obtained with sub-pixel precision.
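  • The following sketch (illustrative) builds such a table for an ideal, distortion-free camera with the parameters assumed in this description (θ = 90 degrees, N = 640); a calibrated table would instead store angles corrected for lens aberrations. Interpolation supports the sub-pixel pixel numbers mentioned above.

```python
import math
import numpy as np

THETA = math.pi / 2          # assumed horizontal angle of view (90 degrees)
N = 640                      # assumed number of horizontal pixels

# Table entry k-1: angle from the left end of the view to pixel center k.
PIXELS = np.arange(1, N + 1)
ANGLE_TABLE = THETA * (PIXELS - 0.5) / N

def angle_of_pixel(x: float) -> float:
    """Angle for a (possibly fractional) horizontal pixel number."""
    return float(np.interp(x, PIXELS, ANGLE_TABLE))

print(math.degrees(angle_of_pixel(320.25)))  # ~45 degrees at mid-view
```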
  • The angles α1 and α2 are thus obtained from the horizontal pixel numbers of the fingers on the image (obtained as in FIG. 13). Therefore, if the distances L1 and L2 from the camera are also obtained, the movement vector V of the left middle finger from the home position is determined.
  • The distances L1 and L2 can be calculated approximately as the distance L given by the following Equation (1), where D is the actual average finger diameter, d the average diameter of the finger image in pixels, θ the horizontal angle of view, and N the number of horizontal pixels (the details of Equation (1) are described with reference to FIG. 15):

    L = D / sin(θ × d / N)   ... (1)
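  • Combining Equation (1) with the two angles gives the movement vector V. The following sketch is an illustration under the stated assumptions (D = 15 mm); the coordinate frame, taking the left edge of the angle of view as the X axis, is a convention chosen here.

```python
import math

D_MM = 15.0                  # assumed actual average finger diameter
THETA, N = math.pi / 2, 640  # assumed camera parameters

def to_xy(alpha: float, d_px: float):
    """Desk-plane position (mm) of a finger seen at angle alpha with an
    image diameter of d_px pixels, using Equation (1) for the distance."""
    L = D_MM / math.sin(THETA * d_px / N)
    return L * math.cos(alpha), L * math.sin(alpha)

def movement_vector(alpha1, d1, alpha2, d2):
    """Displacement from home position P1 to keying position P2."""
    (x1, y1), (x2, y2) = to_xy(alpha1, d1), to_xy(alpha2, d2)
    return x2 - x1, y2 - y1
```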
  • In Equation (1), the actual average diameter of the fingers is expressed as the average diameter D, but the value of D cannot be measured by the camera directly.
  • One method is to have the user set in the key input device in advance the key interval (mm) of the virtual keyboard the user assumes. Since the average interval (in pixels) between the fingers at the home position corresponds to the key interval (mm), the actual finger diameter (mm) can be determined from the average diameter (in pixels) of each finger on the image.
  • Solving Equation (1) for d gives Equation (2):

    d = (N / θ) × arcsin(D / L)   ... (2)

    Let us use Equation (2) to calculate backward how much the average diameter (in pixels) of the finger image changes when a finger moves one row on the virtual keyboard.
  • Suppose the actual average finger diameter D is 15 mm, the horizontal angle of view θ of the camera is π/2 (that is, 90 degrees), the number N of horizontal pixels is 640, and the distance L between finger and camera at the home position is 250 mm. Then the average diameter d of the finger image is about 24.5 pixels.
  • If the key pitch of the virtual keyboard is 19 mm and the finger approaches the camera 20 by one row, the distance L becomes 231 mm and the average diameter d of the finger image becomes about 26.5 pixels. Moving one row on the virtual keyboard therefore changes the average diameter of the finger image by about two pixels.
  • A difference of two pixels can be detected reliably even when the influence of noise is taken into account.
  • Accuracy can be improved further by raising the frame rate and using the average diameter of the finger image measured while the finger is moving, before and after the keystroke. (The following sketch checks the numbers of this example.)
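  • The following sketch (illustrative) verifies the arithmetic of the example above with Equation (2).

```python
import math

D, THETA, N = 15.0, math.pi / 2, 640   # assumptions stated above

def image_diameter_px(L_mm: float) -> float:
    """Average finger-image diameter in pixels at camera distance L (Eq. 2)."""
    return (N / THETA) * math.asin(D / L_mm)

d_home = image_diameter_px(250.0)   # finger at the home position
d_near = image_diameter_px(231.0)   # one 19 mm row closer to the camera
print(round(d_home, 1), round(d_near, 1))  # -> 24.5 26.5 (about 2 px apart)
```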
  • FIG. 15 is an explanatory diagram of Equation (1).
  • FIG. 15 shows an isosceles triangle whose base is the actual average finger diameter D, located at a distance L from the camera.
  • The angle α subtended by the finger can be approximated from the horizontal angle of view θ, the number N of horizontal pixels, and the average diameter d (in pixels) of the finger image by Equation (3):

    α ≈ θ × d / N   ... (3)

  • The chord subtended by this angle at distance L is the segment A of Equation (4):

    A = L × sin(θ × d / N)   ... (4)

    Replacing the actual average diameter D of the finger by the approximating segment A and solving for L yields Equation (1). Although FIG. 15 draws the average diameter D large, it is in fact much smaller than the distance L, so the error introduced by substituting A for D is small.
  • As described above, the relative change of a finger's position on the horizontal plane of the desk is calculated from two parameters: the X coordinate of the finger on the XZ plane obtained from the camera image, and the size of the finger. Even when the mobile terminal 10 is arranged so that the camera 20 is close to the desk, the camera 20 is in practice slightly above it, so when a finger moves from the home position toward the camera 20, the Z coordinate of the fingertip on the image drops slightly. This change in the Z coordinate can therefore also be used in calculating the relative change of the finger's position on the horizontal plane.
  • Nevertheless, it is desirable to arrange the mobile terminal 10 with the camera 20 close enough to the desk that the change in the image Z coordinate caused by horizontal finger movement is minimized.
  • FIG. 16A and FIG. 16B are explanatory diagrams of an example of a keystroke operation detection method. All the fingertips of the right hand 53C in the home position in FIG. 16A are in contact with the desk.
  • the keystroke detection unit 73 stores the height of the fingertip of each finger at that time on the Z axis and uses it for the determination.
  • The height of the middle finger at the home position is represented by ZH, and two thresholds, (ZH + a) and (ZH + b), are defined relative to it. These thresholds are used when the middle finger strikes a key one row of the virtual keyboard farther from the camera.
  • The keystroke operation detection unit 73 determines that a key has been struck when the lowest point of the middle finger is below the threshold (ZH + a) and, at the same time, all fingers other than the middle finger are above the threshold (ZH + b).
  • The right hand 53D of FIG. 16B, which has moved from the home position and performed a keystroke operation, satisfies this condition.
  • The key top of a hard keyboard sinks several millimeters when pressed, and users accustomed to this habitually lift the fingers other than the one pressing a key.
  • The keystroke operation detection unit 73 exploits this habit with the two thresholds to increase detection accuracy. The two thresholds are set for each finger and for each row of the virtual keyboard.
  • As described above, the mobile terminal 10 is arranged with the camera 20 close to the desk. Therefore, even when the fingers move back and forth on the virtual keyboard, the change on the Z axis in the image is very small, and it is easy to detect keystroke operations from the height of each finger on the Z axis at the home position. (A sketch of the two-threshold test follows below.)
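  • The following sketch (illustrative; the calibration offsets a and b are assumptions) implements the two-threshold test: the striking finger must be below (ZH + a) while every other finger is above (ZH + b).

```python
def is_keystroke(z: dict, striking: str, zh: float, a: float, b: float) -> bool:
    """z maps finger names to the height (pixels above the desk) of each
    finger's lowest contour point; zh is the home-position height."""
    if z[striking] >= zh + a:              # striking finger must be near the desk
        return False
    return all(height > zh + b             # every other finger must be lifted
               for name, height in z.items() if name != striking)

# Example: middle finger on the desk, the others lifted well clear.
print(is_keystroke({"index": 18.0, "middle": 1.0, "ring": 16.0},
                   "middle", zh=2.0, a=1.5, b=8.0))  # -> True
```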
  • FIG. 17 is an example of a hardware block diagram that implements the functions of the functional block diagram of FIG. 8.
  • The functions from the feature extraction unit 70 to the mouse pointer control unit 79 of FIG. 8 are realized by a software program 18 stored in a ROM (Read Only Memory) 17 of the processor 14 and are read and executed by the DSP 15 and the CPU (Central Processing Unit) 16. Information obtained during execution is stored in a RAM (Random Access Memory) 19.
  • the ROM 17 can be externally attached.
  • FIG. 18 is a diagram illustrating an example of a method for detecting the desk edge 102 and a method for measuring the area of the finger.
  • An image 100 is an image taken by the camera 20 arranged close to the desk.
  • In state (A), after activating the virtual keyboard, the user 101 once lowers his hands off the desk. As a result, the desk edge 102 appears in the image without being blocked by the hands.
  • The desk edge 102 can be detected by differentiating the image in the Z-axis direction and extracting the line segment where the absolute value of the derivative is large.
  • In state (B), the desk edge 102 is partially hidden by the fingers, producing portions that the differential calculation cannot detect. In other words, a finger is present at every pixel position where the desk edge 102 is no longer visible, so the finger position detection unit 71 can detect the fingers easily (a sketch of this detection follows below).
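  • As an illustration (not from the original disclosure), the following sketch differentiates a grayscale frame along the Z axis to locate the desk edge and then reports the columns where the edge gradient is weak, i.e. where a finger blocks it.

```python
import numpy as np

def find_desk_edge_row(img: np.ndarray) -> int:
    """Row index with the strongest vertical intensity change (the desk edge)."""
    dz = np.diff(img.astype(np.float32), axis=0)   # derivative along Z
    return int(np.abs(dz).sum(axis=1).argmax())

def occluded_columns(img: np.ndarray, edge_row: int, thresh: float) -> np.ndarray:
    """Columns where the desk edge is no longer visible, i.e. finger positions."""
    dz = np.abs(np.diff(img.astype(np.float32), axis=0))[edge_row]
    return np.where(dz < thresh)[0]
```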
  • State (C) is an enlarged image of the broken line rectangle in state (A) and represents the desk edge 102 and the desk top 104.
  • State (D) is an enlarged image of the broken-line rectangle in state (B), showing the user's fingers 106.
  • The mobile terminal 10 detects the desk edge 102.
  • The mobile terminal 10 then calculates parameters such as the finger clipping threshold only over the desk portion of the image delimited by the detected desk edge 102, which improves detection accuracy.
  • The finger image also becomes a comparatively easy pattern-recognition target when, as in state (E) of FIG. 18, only the fingertips appearing below the image of the desk edge 102 are detected.
  • FIG. 19 shows an example of a method for obtaining the nail area.
  • Mobile terminal 10 obtains maximum width 109 and maximum height 110 of nail 108 from the contour of the image of nail 108 of finger 106.
  • Each of the maximum width 109 and the maximum height 110 can be used as a feature quantity determining the size of the finger or nail image, and the product of the maximum width 109 and the maximum height 110 can also be used as a substitute for the nail area.
  • FIG. 20 exemplifies the procedure of a method for inputting a command with fingers.
  • FIG. 9 illustrates an example in which a command is input by grasping a hand.
  • FIG. 20 illustrates a method other than that of FIG. 9; only the front image of the fingers 103 and the desk edge 102 are drawn, and the desk edge 102 is shown as a straight line regardless of whether it is hidden by the fingers 103.
  • From the home position in state (A), the right hand is lifted temporarily as in state (B), and the index finger strikes a key as in state (C). Meanwhile the left hand may be lifted, or may remain stationary at the home position as in state (C).
  • Next, the user moves the right hand while the left thumb is lifted and strikes a key as in state (E), thereby giving the key input device an image different from that of state (C).
  • This series of operations can be given to the command determination unit 77 as a single command.
  • The mobile terminal 10 can thereby realize, for example, the same function as pressing a shift key. As a result, fast key input with few errors and less left-hand movement than normal hard keyboard operation can be realized.
  • In state (F), the user of the mobile terminal 10 gives a command with the left thumb and index finger lifted.
  • This example shows a command to move the mouse pointer (hereinafter also referred to as a "mouse command"). On the right hand only the index finger is in contact with the desk, and its position is associated with the position of the mouse pointer currently displayed on the display screen.
  • When the right hand is moved as in state (G), the moving distance and direction of the index finger are calculated by the method described with reference to FIG. 13, and the position of the mouse pointer on the display screen is moved accordingly.
  • A function equivalent to a mouse click can be realized by lifting the right index finger off the desk in state (G) and returning it to the desk. Using this function, buttons on the display screen can be pressed and the screen scrolled, for example, even when no keys are being input.
  • FIG. 21A and FIG. 21B are examples of a flowchart of the present embodiment.
  • The camera 20 starts continuous shooting of the desk (step S0).
  • the user once removes his / her hand from the desk, and the portable terminal 10 detects the desk edge 102 (step S1).
  • When the desk edge is detected, the mobile terminal 10 notifies the user and starts detecting the positions of the fingers on the desk (step S2).
  • This position detection covers not only the position of each finger on the image but also its size, and continues until the virtual keyboard program ends.
  • In step S3, the mobile terminal 10 waits for the user's fingers to come to rest.
  • When it detects that the fingers are stationary (step S3: YES),
  • the mobile terminal 10 measures the position and size of each finger at the home position and stores them in the RAM 19 (step S4).
  • After step S4, the mobile terminal 10 starts detecting the user's various operations.
  • If all fingers are again stationary (step S5: YES), the mobile terminal 10 re-detects the home position (step S6). If they are not stationary (step S5: NO), the mobile terminal 10 checks whether an end command is indicated (step S7).
  • If the end command is indicated (step S7: YES), the mobile terminal 10 ends finger position detection on the desk, stops the continuous shooting of the desk by the camera 20 (step S8), and terminates the virtual keyboard (step S9).
  • If no end command is indicated (step S7: NO), the mobile terminal 10 checks whether the mouse command described with reference to FIG. 20 is indicated (step S10). If the mouse command is indicated (step S10: YES), the mobile terminal 10 checks whether the single finger corresponding to the mouse pointer is stationary on the desk (step S18). If it is not stationary (step S18: NO), the mobile terminal 10 returns control to step S5. If it is stationary (step S18: YES), the mobile terminal 10 takes the finger's position as the current position (home position) of the mouse pointer displayed on the screen of the display 30 and stores the finger's position and size on the image in the RAM 19 (step S19). If the mouse command is released (step S20: NO), the mobile terminal 10 returns control to step S5.
  • While the mouse command continues (step S20: YES), the mobile terminal 10 checks in step S21 whether the user's finger has moved. If it has moved (step S21: YES), the mobile terminal 10 compares the position and size of the finger after the movement with the position and size stored for the home position in the RAM 19, calculates the finger's moving direction and distance, and moves the mouse pointer displayed on the screen of the display 30 accordingly (step S22).
  • If the detected finger operation is not a mouse command (step S10: NO), the mobile terminal 10 checks whether a keystroke operation has occurred (step S11). If there is no keystroke operation (step S11: NO), the mobile terminal 10 returns control to step S5. If there is a keystroke operation (step S11: YES), the mobile terminal 10 compares the position and size of the finger on the image with those at the home position stored in step S4 or step S6, calculates the finger's moving direction and distance on the desk, and obtains the position of the struck key on the virtual keyboard (step S12).
  • Next, the mobile terminal 10 checks whether a command is indicated (step S13). If no command is indicated (step S13: NO), it generates the key code of the virtual QWERTY keyboard corresponding to that position (step S14). If a command is indicated with the hand opposite to the striking hand (step S13: YES) and the indicated command is the shift command, the mobile terminal 10 determines that the shift key was held while the key was struck and generates the corresponding key code (step S15).
  • If the indicated command is the numeric key command, the mobile terminal 10 determines that the virtual keyboard has been switched to the numeric keyboard and generates the numeric keypad key code corresponding to the struck position (step S16). After the key input, the mobile terminal 10 waits for the striking finger to leave the desk (step S17); when it leaves (step S17: YES), the mobile terminal 10 returns control to step S5.
  • Note that FIG. 21 does not describe error handling for keystrokes at positions where no key exists.
  • The shift command and the numeric key command have been described as examples of key input commands,
  • but key input commands are not limited to these.
  • For example, a control command indicating that a character or numeric key is pressed together with a control key, or function key commands operable with one hand, can be added.
  • The function key commands can be configured, for example, as a keyboard of the same shape as the numeric keyboard whose keys are assigned the F1 to F12 keys, the Back Space key, the Delete key, and so on of the QWERTY keyboard, so that keys far from the home position can be input instantly.
  • FIG. 22A and FIG. 22B are examples of a virtual keyboard in which the positions of keys assigned to each finger overlap each other.
  • The keyboard 112 in FIG. 22A is a virtual keyboard operated by the index finger of the right hand 111.
  • The index finger rests on the key "C", which is its home position.
  • By rotating the hand about the wrist, the index finger can strike any key from "A" to "E".
  • FIG. 22B shows a virtual keyboard operated by the middle finger of the same right hand 111.
  • The middle finger rests on the key "H", which is its home position.
  • Likewise, the middle finger can press any key from "F" to "J".
  • By having the eight fingers other than the thumbs take charge of five keys each, a virtual keyboard with 40 or more keys can be realized.
  • In this arrangement each finger moves only left and right and does not need to move back and forth. Therefore, the finger position detection unit 71 only needs to detect changes in finger position in the camera image, not changes in the size of fingers or nails (a sketch of this X-only key lookup appears below).
  • Each finger is responsible for only one key in each column, but since the fingers differ in length, here too the keys assigned to each finger differ in distance from the camera, making this a virtual keyboard with a shape unique to the user.
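  • The following sketch (illustrative; the pixel pitch is an assumed calibration value) shows the X-only key lookup for the overlapped per-finger keyboards of FIGS. 22A and 22B.

```python
FINGER_KEYS = {
    "right_index":  ["A", "B", "C", "D", "E"],   # home key: "C" (FIG. 22A)
    "right_middle": ["F", "G", "H", "I", "J"],   # home key: "H" (FIG. 22B)
}

def key_for(finger: str, dx_px: float, pitch_px: float = 25.0) -> str:
    """Key struck by a finger displaced dx_px pixels from its home position."""
    keys = FINGER_KEYS[finger]
    home = len(keys) // 2                        # the center entry is the home key
    idx = home + round(dx_px / pitch_px)         # displacement in key positions
    return keys[max(0, min(len(keys) - 1, idx))]

print(key_for("right_index", -27.0))  # one key left of home -> "B"
```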
  • The arrangement of FIGS. 22A and 22B can be extended so that the virtual keyboard assigned to each finger has multiple rows. With multiple rows, each finger must move back and forth, and the finger position detection unit 71 must detect changes in the size of fingers and nails.
  • The functional configurations of FIGS. 7 and 8 and the flowchart of FIG. 21 also apply to a virtual keyboard configured by overlapping the key positions assigned to the fingers.
  • By configuring the virtual keyboard according to the present embodiment so that the key positions assigned to the fingers overlap one another, the finger movement required for key operation can be minimized. This realizes a virtual keyboard that tires the hands and arms less and minimizes the desk area required for operation.
  • It is desirable that the user be able to create and use such a virtual keyboard, composed of overlapping keys assigned to the individual fingers. FIGS. 23A and 23B are other examples of system perspective views of the present embodiment.
  • FIG. 23A shows an example in which the key input device 118 is configured as an external device of a personal computer.
  • the key input device 118 is disposed on the desk, and the camera 116 captures a user's finger image from a position close to the desk.
  • the input key information is sent to the personal computer 114 through the cable 117 and displayed on the monitor 115.
  • FIG. 23B shows an example in which the function of the key input device is realized by a program on the personal computer 114.
  • the camera 116 is attached to the monitor 115, captures the image on the desk, and sends it to the personal computer 114 through the cable 117.
  • a CPU (not shown) of the personal computer 114 executes the processes performed by the mobile terminal 10 (FIG. 21) and thereby functions as a virtual keyboard.
  • the description so far has assumed that nothing is attached to the fingers operating the virtual keyboard, but the embodiment is not limited to this.
  • the user may attach to the fingertips an instrument, such as a finger cot, with characteristics the camera can distinguish.
  • in that case the mobile terminal 10 or other key input device may be configured to detect the position and size of the instrument instead of the position and size of the finger itself.
  • a user unfamiliar with the key input device can also operate it while checking the key positions by placing on the desk a sheet on which a keyboard matched to the length of his or her fingers is printed.
  • the mobile terminal 10 is realized, in one aspect, by a processor executing a computer program stored on a data recording medium.
  • the computer program can also be executed by a computer system of known configuration. The essential part of the present invention can therefore be said to be the software stored in a RAM, on a hard disk, CD-ROM, or other data recording medium, or downloadable via a network. Since the operation of each piece of hardware in such a computer system is well known, its detailed description will not be repeated.
  • the data recording medium is not limited to a CD-ROM, FD (Flexible Disk), or hard disk; it may be any medium that carries the program in a fixed manner, such as magnetic tape, cassette tape, an optical disc (MO (Magneto-Optical disc) / MD (Mini Disc) / DVD (Digital Versatile Disc)), an IC (Integrated Circuit) card (including memory cards), an optical card, or a semiconductor memory such as a mask ROM, EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), or flash ROM.
  • the program may include not only code directly executable by the CPU, but also a program in source form, a compressed program, an encrypted program, and the like.
  • as described above, the user can reduce finger and arm fatigue without sacrificing the wearability of the mobile terminal, and can perform fast two-handed key input using a virtual keyboard whose position and shape are optimal for that user.
  • on a normal hard keyboard the keys of each row are aligned horizontally, so fingers of different lengths must be bent so that the fingertips stand nearly perpendicular to the keyboard. With the present virtual keyboard the user can instead, for example, hold all fingers straight. In that case the keys of each row are no longer aligned horizontally; since for most users the middle finger is longest, the keys in the middle finger's column lie closer to the camera than those in the other columns.
  • the result is a virtual keyboard whose shape best fits the user's fingers, including the key at the middle finger's home position. Because the fingers are held extended, keys are struck with the pad of the finger (the part bearing the fingerprint). On a normal hard keyboard this would easily press two keys at once, but since this is a virtual keyboard, only the key at the fingertip position is processed as input. Striking keys with the finger pad also greatly reduces the impact on the finger joints. This characteristic is effective not only for mobile terminals but also when applied to information devices requiring key input, such as desktop computers.
  • the key input device 12 can use, in addition to the two-dimensional position of a finger in the image obtained by the camera 20, information about the size of the finger image, making the input information three-dimensional.
  • the input device can calculate the relative movement distance from the change in finger size, taking as its reference the size of the finger image when the finger rests at the home position; a sketch of this calculation follows.
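As an illustration, the sketch below estimates the forward/backward movement from the width change under a pinhole-camera assumption, then quantizes it to keyboard rows. The calibrated home-position distance r_home_mm and the 19 mm row pitch are illustrative assumptions, not values given in this description.

```python
def depth_change_mm(width_home_px, width_now_px, r_home_mm):
    """Apparent width is inversely proportional to distance from the
    camera, so the current distance is r_home * (w_home / w_now);
    return the change (positive = finger moved away from the camera)."""
    r_now_mm = r_home_mm * (width_home_px / width_now_px)
    return r_now_mm - r_home_mm

def rows_moved(width_home_px, width_now_px, r_home_mm, row_pitch_mm=19.0):
    """Quantize the depth change to whole virtual-keyboard rows
    (19 mm is a typical key pitch, used here as a default)."""
    return round(depth_change_mm(width_home_px, width_now_px, r_home_mm)
                 / row_pitch_mm)

# Example: a finger 20 px wide at the home position now appears 22 px wide.
print(rows_moved(20.0, 22.0, r_home_mm=220.0))  # -> -1 (one row closer)
```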
  • since the camera 20 is placed close to the desk and captures a front image of the fingers on the desk, the key input device 12 can easily calculate both the height of the fingers above the desk and the relative change in their two-dimensional position on the desk. When judging a keystroke, the key input device 12 can likewise easily determine, from image analysis, the difference in height above the desk between the striking finger and the other fingers.
  • the user can confirm the key arrangement, and can confirm the current position of the finger detected by the key input device 12.
  • the user can check whether the camera 20 can capture the entire virtual keyboard that the user is trying to input.
  • the user can correct the imaging range by adjusting the interval between fingers or the distance between the fingers and the camera.
  • the user can switch the keyboard type instantly without operating the key input device 12 in any way. For example, if the user closes the left hand into a fist, the key input device 12 can easily detect that the left hand is not in the key input posture.
  • since the key input device 12 can detect the contours of the fingers and nails photographed by the camera 20 more clearly, the accuracy of measuring finger size can be improved, and the user can use the key input device 12 even in a dark environment.


Abstract

Provided is an input device which can mitigate finger and arm fatigue for the purpose of achieving high-speed key entry. The input device has: a camera configured to consecutively photograph fingers of a user of the input device; a finger position detection unit configured to detect the position of each finger of the user on the basis of an image photographed by the camera; a home position detection unit configured so that when fingers of the user make tactile contact with a desktop and come to rest, the positions of the fingers of the user at the point in time are stored as home positions upon a virtual keyboard such that the distances from the camera to the keys for which each finger is responsible differ and which has a shape unique to the user; a typing behavior detection unit configured so as to detect that a finger of the user which has been detected by the finger position detection unit has typed a key upon the virtual keyboard; and a key code generation unit configured so as to generate the code of the key upon the virtual keyboard corresponding to the position where there has been a typing operation.

Description

INPUT DEVICE, INFORMATION APPARATUS PROVIDED WITH THE INPUT DEVICE, PROGRAM FOR CAUSING A COMPUTER TO FUNCTION AS THE INPUT DEVICE, AND METHOD FOR INPUTTING CHARACTERS USING THE INPUT DEVICE
This disclosure relates to information equipment and, more specifically, to techniques for inputting information by optically detecting the movement of fingers.
Many mobile phones realize the input of one character by having the user strike a small set of keys several times with one hand. Many smartphones and tablet terminals display a small soft keyboard on the screen and realize key input through a touch panel. Information devices such as personal computers usually have a hard keyboard whose key tops are aligned in horizontal rows.
Devices already in practical use as virtual keyboards include those that project a soft keyboard onto a desk with laser light and realize key input when the fingers touch the desk. Patent Document 1 (Japanese Patent Laid-Open No. 6-83512) discloses a technique for detecting finger movement with a camera without projecting a soft keyboard onto the desk. Patent Document 2 (Japanese Patent Laid-Open No. 2003-288156) discloses a technique for detecting the three-dimensional position of fingers by providing two imaging means. Patent Document 3 (Japanese translation of PCT publication No. 2004-500657) discloses a technique for detecting the three-dimensional position of a finger using a special three-dimensional sensor. Furthermore, Non-Patent Document 1 (Hafiz Adnan Habib and Muid Mafti, "Real Time Mono Vision Gesture Based Virtual Keyboard System", IEEE Transactions on Consumer Electronics, Vol. 52, Issue 4, Nov. 2006) discloses a technique for detecting finger positions and keystroke motions by detecting the positions of the finger joints.
Patent Document 1: JP-A-6-83512
Patent Document 2: JP 2003-288156 A
Patent Document 3: Japanese translation of PCT publication No. 2004-500657
With the spread of personal computers, it has become common to enter alphabetic and other characters with both hands on a standard hard keyboard such as a QWERTY keyboard. Skilled users can touch type, entering keys at high speed without looking at the keyboard. For mobile terminals, however, there has been no means of combining a small body with high-speed key input.
Many mobile phones have only a few keys, and on many smartphones and tablet terminals the soft keyboard shown on the display is smaller than a normal hard keyboard and its key spacing is narrow, so high-speed key input using all fingers is difficult. Some devices integrate a small hard keyboard, but since the key spacing is again narrow, input is limited to two fingers and high-speed key input is difficult. If a large hard keyboard is used as an external device, high-speed key input is possible, but carrying such an external device around defeats the handheld nature of a mobile terminal. Moreover, on the hard keyboards of information devices such as personal computers, the keys are aligned in horizontal rows and the fingertips must be held upright to avoid double keystrokes, so many people whose work involves entering large numbers of characters every day suffer from tendonitis, stiff shoulders, and the like.
There are devices that project a virtual keyboard onto a desk with laser light and detect finger movement with a camera to realize key input, but the projection equipment must be carried around, so these can hardly be called wearable.
Patent Document 1 discloses a technique for detecting finger movement with a camera without projecting a soft keyboard onto the desk, but says nothing about techniques for making it wearable.
Patent Document 2 discloses a technique for detecting the three-dimensional position of fingers with two cameras. It states that a single camera would also suffice, but does not mention concrete means of realizing this.
Patent Document 3 discloses a technique for detecting the three-dimensional position of a finger using a special three-dimensional sensor. However, this sensor is not easily obtainable, and the resolution of the sensor-to-finger distance is reportedly about 1 cm, so its practicality is doubtful.
Non-Patent Document 1 discloses a technique for detecting finger positions and keystroke motions by detecting the positions of the finger joints. With this method, as on a normal hard keyboard, the fingertips must be held nearly vertical and strike the desk. On a normal hard keyboard the springiness of the key tops cushions the impact of keystrokes on the fingers, but when the desk itself is struck the impact is transmitted directly to the joints, increasing pain and fatigue in the fingers and arms and making the method burdensome to use.
Ergonomic hard keyboards have been developed so that the fingers and arms tire less even during long key-input sessions, but no such key input device has been developed for virtual keyboards.
What is needed, therefore, is a technique that reduces finger and arm fatigue without impairing the wearability of a mobile terminal and enables high-speed key input using an inexpensive camera, and a technique that reduces the finger and arm fatigue caused by using the hard keyboards of information devices such as personal computers.
An input device according to one embodiment includes: a camera configured to continuously capture front images of the fingers of the user of the input device when the input device is placed close to a desk; a finger position detection unit configured to detect the positions of the user's fingers from the captured front images; a home position detection unit configured to detect, when the user's fingers touch the desk and come to rest, the positions of the fingers at that time as the home position on a virtual keyboard in which the distance from the camera to the keys assigned to each finger differs and which has a shape unique to the user; a keystroke detection unit configured to detect that a finger of the user has struck a key on the virtual keyboard; and a key code generation unit configured to generate the code of the key on the virtual keyboard corresponding to the position where the keystroke occurred.
Many people whose work involves keying in large numbers of characters every day suffer from tendonitis, stiff shoulders, and the like. Most such workers can touch type without looking at the keyboard, and in one aspect an improvement in these symptoms can be expected.
The above and other objects, features, aspects, and advantages of this invention will become apparent from the following detailed description of the invention, understood in conjunction with the accompanying drawings.
FIG. 1A is an explanatory diagram of the three-dimensional space notation used in the following description.
FIG. 1B is an explanatory diagram of the row and column directions of the virtual keyboard used in the following description.
FIG. 2A is a system perspective view of the present embodiment.
FIG. 2B is a system perspective view in which the fingers of FIG. 2A are held in a relaxed posture.
FIG. 3A is a diagram of the virtual keyboard when the fingers are held in a relaxed posture as in FIG. 2B.
FIG. 3B is a diagram of the virtual keyboard when the fingers are held in a relaxed posture as in FIG. 2B, with each row of key tops arranged diagonally as on a normal hard keyboard.
FIG. 4 is an explanatory diagram of the present embodiment.
FIG. 5A is an image of the front view of fingers captured by the camera of the present embodiment, with the fingers positioned as on a normal hard keyboard.
FIG. 5B is an image in which only the right hand of FIG. 5A has approached the camera by one row of the virtual keyboard.
FIG. 6A is an image of the front view of fingers captured by the camera of the present embodiment, with the fingers in a relaxed posture.
FIG. 6B is an image in which only the right hand of FIG. 6A has moved one row of the virtual keyboard away from the camera and its middle finger has struck a key.
FIG. 7 is a functional block diagram showing an example of the functional configuration of the present embodiment.
FIG. 8 is a functional block diagram showing another example of the functional configuration of the present embodiment.
FIG. 9 is an explanatory diagram of a method of inputting a command with fingers stationary at the home position.
FIG. 10 is an explanatory diagram of fingers placed at the home position on a virtual numeric keyboard.
FIG. 11 is an explanatory diagram of an example of a method of displaying the currently detected positions of the fingers on the desk superimposed on the virtual keyboard shown on the display.
FIG. 12 is an explanatory diagram of an example of a method of displaying the virtual keyboard, the finger image captured by the camera, and the entered character string on the display.
FIG. 13 is an explanatory diagram of an example of a method of measuring the position and size of a finger.
FIG. 14 is an explanatory diagram of an example of a method of calculating the horizontal movement distance of a finger on the virtual keyboard from the home position.
FIG. 15 is an explanatory diagram of equation (1).
FIGS. 16A and 16B are explanatory diagrams of examples of a keystroke detection method.
FIG. 17 is an example of a hardware block diagram of the present embodiment.
FIG. 18 shows an example of a method of detecting the desk edge and a method of measuring the area of the fingers.
FIG. 19 shows an example of a method of obtaining the area of a nail.
FIG. 20 shows, as states A through G, the procedure of a method of inputting commands with the fingers.
FIG. 21 is an example of a flowchart according to the present embodiment.
FIGS. 22A and 22B are examples of a virtual keyboard in which the positions of the keys assigned to each finger overlap one another.
FIGS. 23A and 23B are examples of system perspective views according to the present embodiment.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the following description, the same parts are denoted by the same reference numerals; their names and functions are also the same. Detailed description of them will therefore not be repeated.
FIG. 1A is an explanatory diagram of the three-dimensional space notation used in the following description. The position of the camera 20 is taken as the origin, the plane of the desk as the XY plane, and the height direction above the desk as the Z axis. Note that this defines only the directions of the coordinate axes. In descriptions of images captured by the camera 20, the X-axis coordinate is expressed as a pixel number, and the size of a finger in the image is expressed as the number of pixels it occupies. The view of the fingers captured by the camera 20 in the arrangement of FIG. 1A is referred to as the front image of the fingers. FIG. 1B is an explanatory diagram of the definitions of the row and column directions of the virtual keyboard used in the following description. The direction toward and away from the camera is the column direction.
FIG. 2A is an example of a system perspective view according to the present embodiment, showing a mobile terminal placed on a desk and the user's fingers. Part of the desk image acquired by the camera 20, which is mounted on a side face of the mobile terminal 10 and placed close to the desk, is displayed on the finger image display unit 33 on the display 30. By looking at the front image of the fingers, the user can confirm whether the left hand 50A and the right hand 51A are within the imaging range of the camera 20, and can keep the fingers of both hands between the straight lines 21 and 22, which represent the horizontal angle of view of the camera 20.
The light irradiation unit 40 illuminates the range between the straight lines 41 and 42 so that the contours of fingers and nails can be detected more clearly. The keyboard display unit 32 shows the finger positions detected by the key input device superimposed on the virtual keyboard; by watching this, the user can strike keys more accurately. The input character string display unit 31 shows the character string the user has entered, obtained from the key codes the key input device has generated so far.
FIG. 2B is an example of a perspective view of the system with the user's fingers held in a relaxed posture. The camera 20 is mounted on the same face of the mobile terminal 10 as the display 30 and can also be used as a self-portrait camera. On a normal hard keyboard the key tops are aligned in straight horizontal rows, so if the user takes such a relaxed posture, the differing lengths of the fingers prevent correct keying, and a keystroke also presses the surrounding keys. In the present embodiment the user can freely choose, according to skill level, anything from holding the fingers as on a normal hard keyboard (FIG. 2A) to a fully relaxed posture (FIG. 2B).
FIG. 3A illustrates a virtual keyboard 60A for fingers held in a relaxed posture as in FIG. 2B (note that in this and subsequent figures the virtual keyboard is only drawn for illustration and does not actually exist). The distance from the camera to the keys of each column varies with the length of the corresponding finger. Since the shape of the virtual keyboard can be set freely through the posture of the fingers, each user can configure an ergonomic keyboard that suits him or her. FIG. 3A shows a unified virtual keyboard, but by separating the left and right hands more widely than on a normal hard keyboard, the virtual keyboard can be split in the middle, further reducing the load on the arms and fingers. Since the thumbs basically press only the space key, the device can also be configured to register the space key wherever on the virtual keyboard a thumb strikes.
FIG. 3B illustrates a virtual keyboard 60B in which each row of key tops is arranged diagonally, as on a normal hard keyboard. The key top of the space key is not drawn because a thumb keystroke at any position is treated as the space key. Whether the user uses the virtual keyboard of FIG. 3A or that of FIG. 3B is determined by settings the user enters into the key input device in advance, or by the learning with an initial character string described later.
FIG. 4 is an explanatory diagram of the present embodiment. It shows the principle by which the virtual keyboard can be set freely, as described above, and the principle by which the key input device detects changes in the distance between the camera and the fingers (the concrete calculation is explained with FIG. 14). The left hand 50B and the right hand 51B, placed on the home position of the virtual keyboard 60C, lie between the straight lines 21 and 22 that form the horizontal angle of view (viewing angle) of the camera 20, which is mounted on a side face of the mobile terminal 10 close to the desk.
FIG. 4 assumes a camera 20 with a horizontal angle of view of 90 degrees and VGA resolution, i.e., 640 pixels horizontally; the horizontal pixel numbers run from 1 to 640. FIG. 4 shows the right hand 51B moving from the home position to the key position 51E one row up. For the right hand 51B at the home position, the pixel capturing the left edge of the four non-thumb fingers is pixel number 277, and the pixel capturing the right edge is pixel number 147. After the move to key position 51E one row up, the left-edge pixel is number 282 and the right-edge pixel is number 137.
This movement therefore widens the image of the four non-thumb fingers of the right hand by 15 pixels; although not shown, the width of a single finger grows by about 3 pixels. Using a camera with more pixels than VGA yields an even larger change in pixel count. FIG. 4 shows the fingers arranged in a straight line as on a normal hard keyboard, but by the same detection principle, even when the fingers rest at the home position in a relaxed posture, the system according to the present embodiment can detect the movement distance by measuring the change in finger width relative to the width at that moment.
One way to decide that the fingers are at the home position is, for example, to take the moment when the fingers in the image have been essentially stationary for 10 frames, with the vertical and horizontal movement and the change in finger width each averaging less than one pixel. It is not necessary to distinguish at that moment whether the fingers are aligned as on a normal hard keyboard or held in a relaxed posture. The essential point of the system according to the present embodiment is to take the finger width at that moment as the reference and determine the key on which a finger rests by measuring the forward/backward movement distance from the subsequent change in finger width.
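A minimal sketch of this stillness test follows; it assumes each video frame yields one (x, z, width) measurement per finger, which is an assumption about the data layout rather than something specified here.

```python
from collections import deque

class HomePositionDetector:
    """Report 'at rest' when, over a 10-frame window, the per-frame
    change of position and width stays below one pixel on average."""

    def __init__(self, frames=10, tol_px=1.0):
        self.history = deque(maxlen=frames)
        self.frames = frames
        self.tol_px = tol_px

    def update(self, x_px, z_px, width_px):
        """Feed one frame's measurement; return True once the finger
        has been effectively stationary for the whole window."""
        self.history.append((x_px, z_px, width_px))
        if len(self.history) < self.frames:
            return False
        samples = list(self.history)
        # Largest per-frame change among x, z and width, averaged
        # over the window.
        deltas = [max(abs(a - b) for a, b in zip(cur, prev))
                  for prev, cur in zip(samples, samples[1:])]
        return sum(deltas) / len(deltas) < self.tol_px
```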
The key-top pitch of the virtual keyboard that a user imagines differs from user to user. When the user's fingers are placed at the home position, the average pitch of the fingers corresponds to the key-top pitch, so the key input device can determine the horizontal pitch of the key tops. Since each key top is roughly square, the device can also determine how far a finger must move forward or backward to advance one row on the virtual keyboard the user imagines.
The angle θ in FIG. 4 is the angle from pixel number 1 at the edge of the horizontal angle of view to the pixel number at which the finger lies. As FIG. 4 shows, the angle θ is known once the horizontal pixel number is known; this relation is used in the description of FIG. 14. The horizontal pixel numbers in FIG. 4 are assigned at equal intervals, ignoring distortion due to camera lens aberration; the countermeasure for this is also described with FIG. 14.
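A minimal sketch of this pixel-to-angle relation, using the same simplification of a 90-degree field of view spread uniformly over 640 pixels:

```python
def pixel_to_theta_deg(pixel, n_pixels=640, fov_deg=90.0):
    """Angle theta from the left edge of the horizontal field of view
    (pixel number 1) to the given pixel, assuming the uniform mapping
    of FIG. 4 that ignores lens distortion."""
    return (pixel - 1) * fov_deg / (n_pixels - 1)

print(pixel_to_theta_deg(277))  # bearing of pixel 277, in degrees
```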
FIG. 5A is an image of the front view of fingers captured by the camera according to the present embodiment, showing the left hand 50C and the right hand 51C at the home position. The fingers are positioned as they would be on a normal hard keyboard. The four non-thumb fingers of the right hand are enclosed by the contour line 80A.
FIG. 5B is an image in which only the right hand of FIG. 5A has approached the camera by one row of the virtual keyboard; it shows the right hand 51D, now larger in outline, and its contour line 80B. In FIG. 4 the change in apparent size caused by this forward/backward movement was explained only in terms of horizontal pixel count, but as FIGS. 5A and 5B show, the change is actually a change in area. Using this information makes it still easier to identify the forward/backward movement distance.
FIG. 6A is likewise an image of the front view of fingers captured by the camera according to the present embodiment, here with the fingers in a relaxed posture. It shows finger images of the left hand 52C and the right hand 53C at the home position. The widths 81A, 81B, and 81C are widths of the middle finger extracted from the contour of the finger image.
FIG. 6B is an image in which only the right hand of FIG. 6A has moved one row of the virtual keyboard away from the camera and its middle finger has struck a key. It shows the right hand 53D, whose outline is smaller than that of the right hand 53C in FIG. 6A, and the widths 81D, 81E, and 81F of the middle finger performing the keystroke. The change in finger width can thus be captured at several points rather than one, and averaging them improves measurement accuracy.
In FIG. 6B, the middle finger of the right hand performing the keystroke touches the desk while the other fingers of the right hand are off the desk; the left hand, which is not keying, remains touching the desk. Fingers on the non-keying side may thus stay in a relaxed posture touching the desk without interfering with keystroke detection. As FIG. 6B also shows, by placing the camera close to the desk and capturing a front image of the fingers, the difference in height above the desk between the keying finger and the other fingers can easily be determined by image analysis (details are described with FIGS. 16A and 16B).
Comparing fingers aligned at the home position as on a normal hard keyboard (FIG. 5A) with fingers held at the home position in a relaxed posture (FIG. 6A), the camera image scarcely reveals whether the fingertips lie in a straight horizontal line. Turning this around, both cases can be treated identically: taking the image size of each finger and nail at the home position as the reference, the relative forward/backward movement distance is calculated from the change in size, and the position of the finger on the virtual keyboard is thereby detected. This absorbs all individual differences in how users hold their fingers and realizes the function of an ergonomic keyboard that tires the fingers and arms less.
By widening the camera's angle of view or by using multiple cameras, the keyboard according to the present embodiment also allows the two hands to be placed farther apart than on a typical ergonomic keyboard. With multiple cameras, depth information about the fingers from the 3D image can also be exploited.
FIG. 7 is a functional block diagram showing an example of the functional configuration of the present embodiment. The key input device 12 is configured as an external device or a built-in module of the mobile terminal 10. A finger image captured by the camera 20, placed close to the desk within the camera module 90, is preprocessed by a DSP (Digital Signal Processor) 29 in the camera module, after which feature data such as finger and nail contour candidates are extracted by the feature extraction unit 70.
The key input processor 13 receives feature data such as finger contour candidates from the DSP 29; all functions from the finger position detection unit 71 to the command determination unit 77 are executed by a software program. The finger position detection unit 71 identifies each finger from the contour candidates, determines its position on the XZ plane, and calculates the size of the fingers and nails. The home position detection unit 72, for example when all fingers have been essentially stationary on the desk for a predetermined time, stores the finger positions and sizes at that time as the home position on the virtual keyboard. At the home position, when the virtual keyboard is a QWERTY keyboard, the fingers from the left little finger to the right little finger (excluding the thumbs) rest on the keys "A", "S", "D", "F", "J", "K", "L", and ";" (semicolon). Since the thumbs basically strike only the space key, their position on the virtual keyboard is not important. If the key assigned to a finger must be changed, for example because of a finger injury, the change can be accommodated by registering it before the key input device is used.
Taking the position of each finger on the XZ plane and the size of the fingers and nails in the home-position image as the reference, the finger position detection unit 71 calculates each finger's position on the virtual keyboard from how much the position and size have changed since then.
The keystroke detection unit 73 examines the height of each fingertip above the desk and determines whether a keystroke has occurred. It can also handle cases such as striking a key with the right hand while the SHIFT key is held down with the left hand, by calculating the keying positions of both hands.
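A minimal sketch of such a height test for one hand follows. The pixel thresholds and the dictionary input are illustrative assumptions; the actual unit works on fingertip heights obtained from the contour analysis described with FIG. 13.

```python
def detect_keystroke(fingertip_heights_px, touch_tol_px=2.0, lift_min_px=5.0):
    """fingertip_heights_px maps finger name -> height above the desk in
    pixels for one hand. Return the striking finger, i.e. the only finger
    at the desk surface while the rest of the hand is clearly lifted,
    or None if no keystroke is recognized."""
    touching = [f for f, h in fingertip_heights_px.items() if h <= touch_tol_px]
    lifted = [f for f, h in fingertip_heights_px.items() if h >= lift_min_px]
    if len(touching) == 1 and len(lifted) == len(fingertip_heights_px) - 1:
        return touching[0]
    return None

# Example (FIG. 6B): the right middle finger touches, the others are lifted.
print(detect_keystroke({"index": 8.0, "middle": 0.5, "ring": 7.0, "little": 9.0}))
```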
The key code generation unit 74 generates the key code corresponding to the position where the keystroke occurred and sends it to the host processor 11 of the mobile terminal 10.
The detection results of the finger position detection unit 71 and the home position detection unit 72, together with the key codes generated by the key code generation unit 74, are sent to the learning unit 75 (learning by an initial character string), the learning unit 76 (learning during character input), and the command determination unit 77, where they serve as input for learning and determination. The results of this learning and determination are fed back to the finger position detection unit 71, the home position detection unit 72, and the keystroke detection unit 73, and the detection parameters are adjusted, improving detection accuracy.
Before the user uses the key input device 12, the learning unit 75 has the user enter a predetermined or arbitrary character string or perform fingertip operations, and adjusts the detection parameters of the finger position detection unit 71, the home position detection unit 72, and the keystroke detection unit 73 from the keying position information and the finger and nail size information. Even when the little finger is half hidden behind the ring finger in the front image, its size can be measured by, for example, having the user touch only the little finger to the desk before and after placing the fingers at the home position and letting the learning unit 75 learn from this. The learning unit 76, during character input, collects statistics on how far the coordinates of keystrokes deviate from the centers of the key tops calculated by the finger position detection unit 71, and successively corrects the key coordinates of the virtual keyboard. The command determination unit 77 is described with FIG. 8.
The host processor 11 receives the output of the key code generation unit 74 at the input key reception unit 78 and, if the output is a valid character code, displays the corresponding character on the input character string display unit 31 of the display 30. The host processor 11 also receives the detection results of the finger position detection unit 71 and shows the finger detection position display 62 superimposed on the keyboard display unit 32 of the display 30.
FIG. 8 is a functional block diagram showing another example of the functional configuration of the present embodiment, in which the processor 14 of the mobile terminal 10 performs all processing. Images captured by the camera 20, mounted on the mobile terminal 10 close to the desk, are taken into the processor 14, which executes all processing from the feature extraction unit 70 to the command determination unit 77 (for example, extraction of feature data such as nail contour candidates and determination of the validity of input commands). The detection results are shown on the display 30.
Compared with the functional configuration of FIG. 7, that of FIG. 8 adds a mouse pointer control unit 79 and a mouse pointer display 34. The command determination unit 77 determines that the fingers of the user's left hand placed in contact with the desk are fewer than usual and therefore indicate a command to move the mouse pointer (hereinafter also called a "mouse command", described in detail with FIG. 20), and feeds back to the finger position detection unit 71 and the home position detection unit 72 that a mouse command has been issued.
While a mouse command is in effect, the finger position detection unit 71 and the home position detection unit 72 perform the following operations in addition to those described for FIG. 7. When a single finger of the right hand touches the desk and comes to rest, the home position detection unit 72 stores its position and size as the current position (home position) of the mouse pointer. When the position or size of that resting finger changes, the finger position detection unit 71 calculates the direction and distance of movement from the home position and sends them to the mouse pointer control unit 79.
The mouse pointer control unit 79 holds the current position of the mouse pointer display 34, shown superimposed on the input character string display unit 31, and moves the display position of the mouse pointer display 34 based on the movement information sent from the finger position detection unit 71.
Using this mouse pointer function, for example, when an error is found in the entered text, the user can issue a mouse command, move the mouse pointer to the erroneous position, and click (strike a key) there to move the cursor and correct the error. The same function can also be used when no keys are being entered, for example to press buttons on the display screen or to scroll the screen.
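A minimal sketch of this pointer control follows. Mapping left/right finger motion to horizontal pointer motion and forward/backward motion (seen as a change in apparent finger width) to vertical motion, as well as the gain values, are assumptions made for illustration.

```python
class MousePointer:
    """Track the on-screen pointer driven by one resting finger."""

    def __init__(self, x=0, y=0):
        self.x, self.y = x, y  # current pointer position in screen pixels

    def move_from_finger(self, dx_px, d_width_px, gain_x=4.0, gain_y=12.0):
        """dx_px: horizontal finger displacement from its stored position;
        d_width_px: change in apparent finger width (forward/backward)."""
        self.x += int(gain_x * dx_px)
        self.y += int(gain_y * d_width_px)
```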
The light irradiation unit 40 illuminates the fingers on the desk, sharpening the contours of the fingers and nails captured by the camera 20 and enabling key input even in a dark environment. The microphone 45 can be used, for example, for the user to tell the mobile terminal by voice that the fingers have been placed at the home position. The speaker 46 can output a key-click sound to inform the user that the mobile terminal 10 has accepted the key input.
FIG. 9 is an explanatory diagram of an example of a method of inputting a command with fingers stationary at the home position. Whereas FIG. 5A shows fingers arranged for two-handed key input on a QWERTY keyboard, in FIG. 9 the left hand 50D is clenched, giving the key input device an image with characteristics different from the key input posture and indicating that a command is being entered. With the gesture shown in FIG. 9 the user can, for example, tell the key input device that the keys about to be entered with the right hand 51C are capital letters, or that the virtual keyboard on which the right hand 51C rests is a numeric keyboard.
FIG. 10 is an explanatory diagram of fingers placed at the home position on a virtual numeric keyboard, positioned as on a normal hard keyboard. The right hand 51F rests at the home position on the virtual numeric keyboard 60D. The configuration of the virtual keyboard can be switched instantly from the virtual QWERTY keyboard of FIG. 3A to such a virtual numeric keyboard by clenching the left hand as in FIG. 9.
FIG. 11 is an explanatory diagram of a method of displaying the currently detected positions of the fingers on the desk superimposed on the virtual keyboard shown on the display. On the keyboard display 61 of the keyboard display unit 32, the finger detection position display 62A and the finger center position display 63A marking its center are shown superimposed. If the left index finger straddled two or more key tops, as in the finger detection position display 62B, a keystroke on a normal hard keyboard would press two keys; with the key input device according to the present embodiment, however, the F key, on which the finger center position display 63B rests, is input. A virtual keyboard thus tolerates finger misalignment better than a normal hard keyboard, and errors in the detected finger position are less likely to cause wrong input.
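A minimal sketch of this center-based key selection; representing each key top as an axis-aligned rectangle on the virtual keyboard plane is an illustrative assumption.

```python
def key_at_center(center_x, center_y, key_rects):
    """key_rects maps key name -> (left, top, right, bottom) on the
    virtual keyboard plane. Return the key whose rectangle contains the
    finger's center point, even if the finger itself straddles two keys."""
    for key, (left, top, right, bottom) in key_rects.items():
        if left <= center_x < right and top <= center_y < bottom:
            return key
    return None

# Example: the finger center lies on the F key although the finger
# overlaps both F and G.
print(key_at_center(4.5, 1.2, {"F": (4, 1, 5, 2), "G": (5, 1, 6, 2)}))
```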
The range the camera 20 is capturing is indicated by the imaging area displays 64A and 64B, from which the user can confirm whether the area used by the virtual keyboard fits within the imaging area. The farther the user's fingers are from the camera 20 and the narrower the spacing between the fingers, the more easily the virtual keyboard fits within the camera's horizontal angle of view, and the wider the capturable range of the virtual keyboard becomes.
FIG. 12 is an explanatory diagram of an example of a method of displaying, on the display 30, the virtual keyboard, the finger image captured by the camera 20, and the entered character string. It shows the input character string display unit 31, the keyboard display unit 32, and the finger image display unit 33 on the display 30. In the present embodiment the finger image is displayed mirror-reversed, making it easier for the user to picture the positions of the fingers on the virtual keyboard.
FIG. 13 is an explanatory diagram of an example of a method of measuring the position and size of a finger; it shows only the right middle finger performing the keystroke in FIG. 6B. First, the finger position detection unit 71 extracts the contour 82 of the middle finger. For example, when the light irradiation unit 40 illuminates the user's fingers from the front, the camera 20 yields an image in which the fingers are bright and everything else is dark, so the finger contour can be extracted by taking the points where the intensity changes sharply, e.g., by differentiation. The finger position detection unit 71 takes ZP, the Z coordinate of the lowest point of this contour, as the Z coordinate of the right middle finger (which, being mid-keystroke, touches the desk).
Starting from the coordinate ZP, the finger position detection unit 71 obtains the horizontal width 83 between the left and right contours at n predetermined heights Z1, Z2, ..., Zn above it, and finds each midpoint 84. From the n midpoints obtained in this way it computes the regression line 85 by the least-squares method, and takes the point 86 where this line crosses the finger contour as the X coordinate XP of the middle finger in the image. The X coordinate is not simply taken as the lowest point of the contour because the fingertip flattens during a keystroke, so there may be many lowest points.
 Furthermore, through each of the n points where the regression line 85 crosses the n predetermined heights on the Z axis, a straight line perpendicular to the regression line 85 is drawn. The segment 87 that this line cuts off within the middle-finger contour 82 is the diameter of the middle finger at height Zn. The finger position detection unit 71 obtains n diameters in this way and takes their average as the average diameter (in pixels) of the middle finger.
 With the above procedure, the finger position detection unit 71 obtains the position and size (here, the average diameter) of the middle finger. The other fingers can be processed in the same way, but what matters is the coordinate value of the finger that performed the keystroke. Therefore, for example, if the finger position detection unit 71 first finds the lowest contour point of every finger and, from the Z-axis relationship among all the fingers, identifies a finger judged to be striking a key, it can compute the coordinates and size by the above procedure for that finger only, thereby shortening the computation time.
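 The contour-based measurement above can be condensed into a short sketch. The following Python fragment is illustrative only: it assumes the contour has already been extracted as an array of (x, z) pixel coordinates with z increasing downward in the image, samples horizontal widths at fixed heights above the fingertip, and approximates the average diameter by the mean of those horizontal widths rather than by cuts strictly perpendicular to the regression line.

    # A minimal sketch of the position/diameter measurement described above.
    # Assumes `contour` is an Nx2 array of (x, z) pixel coordinates of one
    # finger's outline, with z increasing downward; names are illustrative.
    import numpy as np

    def measure_finger(contour, n_heights=8, step=2):
        zs = contour[:, 1]
        zp = zs.max()                      # lowest contour point = fingertip on desk
        midpoints, widths = [], []
        for i in range(1, n_heights + 1):
            z = zp - i * step              # predetermined heights Z1..Zn above ZP
            row = contour[np.abs(zs - z) < 0.5]
            if len(row) < 2:
                continue                   # no contour samples at this height
            left, right = row[:, 0].min(), row[:, 0].max()
            widths.append(right - left)    # horizontal width 83 at this height
            midpoints.append(((left + right) / 2.0, z))
        mids = np.array(midpoints)         # assumes enough heights were hit
        # least-squares regression line 85 through the midpoints: x = a*z + b
        a, b = np.polyfit(mids[:, 1], mids[:, 0], 1)
        xp = a * zp + b                    # intersection 86 at the fingertip row
        return xp, zp, float(np.mean(widths))   # XP, ZP, average diameter (px)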
 The X coordinate value XP of the middle finger obtained here is merely a horizontal pixel number on the image. Likewise, the average diameter of the finger is the number of pixels the finger occupies in the image. It is therefore necessary to derive, from these pixel counts, the horizontal distance the finger has moved from its home position on the virtual keyboard.
 FIG. 14 is an explanatory diagram of an example of a method for calculating the horizontal movement distance of a finger from its home position on the virtual keyboard. FIG. 14 illustrates a state in which the left hand 52D at the home position has moved over the virtual keyboard and approached the camera diagonally to the position of the left hand 52E (only the middle finger striking the key is shown).
 Let the horizontal angle of view of the camera be α and the number of horizontal pixels be N (not shown). Let the actual average diameter of the left middle finger be D mm (not shown) and the average diameter of its image be d pixels (not shown). At the home position P1, let the average diameter of the left-middle-finger image be d1 pixels and the angle from the left edge of the horizontal field of view be θ1. Suppose this finger then moves across the desk to the keystroke position P2, where its average image diameter becomes d2 pixels and the angle from the left edge of the horizontal field of view becomes θ2.
 As shown in FIG. 4, the angle from the left edge of the horizontal field of view corresponds to the horizontal pixel number. Therefore, if a conversion table between angle and horizontal pixel number, including corrections for distortion caused by aberrations of the lens of the camera 20, is stored in advance in a memory (not shown) in the key input device, the angle is obtained immediately from the pixel number. In this conversion table, the horizontal pixel numbers are integers, one entry per pixel, and the angle corresponding to each pixel number carries fractional precision. Since the finger position is obtained from the regression line 85 as explained for FIG. 13, the pixel number is computed to fractional precision. Accordingly, if the angle for the pixel number rounded down and the angle for the pixel number rounded up are both read from the table, and a proportional (linear) interpolation is performed using the fractional part of the pixel number, a good approximation of the angle is obtained. In this way, an accurate angle can be computed in little time from a small table with one entry per horizontal pixel.
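 A minimal sketch of this table lookup with proportional interpolation might look as follows; the names are assumptions, with angle_table assumed to hold one precomputed, distortion-corrected angle per integer horizontal pixel number.

    # Hedged sketch of the pixel-number-to-angle lookup with linear
    # interpolation between adjacent integer table entries.
    import math

    def pixel_to_angle(pixel_pos, angle_table):
        lo = int(math.floor(pixel_pos))          # rounded-down pixel number
        hi = min(lo + 1, len(angle_table) - 1)   # rounded-up pixel number
        frac = pixel_pos - lo                    # fractional part from line 85
        # proportional interpolation between the two table entries
        return angle_table[lo] * (1.0 - frac) + angle_table[hi] * frac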
 In this manner, the angles θ1 and θ2 are obtained from the horizontal pixel numbers of the fingers on the image (found as in FIG. 13). Consequently, if the distances L1 and L2 from the camera are known, the movement vector V of the left middle finger from its home position can be determined. The distances L1 and L2 can be obtained approximately from the distance L in the following equation (1) (the details of equation (1) are explained with FIG. 15).
 L = D / sin(α·d/N)    (1)
 Applying further the small-angle approximation sin x ≈ x, the distance L is obtained from the following equation (2).
 L = D·N / (α·d)    (2)
 Substituting the average image diameters d1 and d2 of the middle finger into equation (2) gives the distances L1 and L2 to the finger, and hence the movement vector V. The forward/backward and left/right movement distances of the finger from its home position on the virtual keyboard can be calculated from V. It causes no difficulty to take the key spacing of the virtual keyboard as equal to the finger spacing at the home position, and to take each key as a square (or a parallelogram whose height equals its base). From the above, the key on the virtual keyboard at the position where the keystroke occurred can be determined.
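 Equations (1) and (2), together with the angles θ1 and θ2, suffice to place the fingertip on the desk plane and to form the movement vector V. The sketch below is a hedged illustration under the assumption that the camera sits at the origin of camera-centered desk coordinates and that angles are measured from the left edge of the horizontal field of view.

    # Sketch of equation (2) and of the movement vector V = P2 - P1.
    import math

    def distance_from_diameter(D_mm, d_px, alpha_rad, n_px):
        return D_mm * n_px / (alpha_rad * d_px)          # equation (2)

    def movement_vector(D_mm, alpha, n_px, d1, theta1, d2, theta2):
        L1 = distance_from_diameter(D_mm, d1, alpha, n_px)
        L2 = distance_from_diameter(D_mm, d2, alpha, n_px)
        # positions P1, P2 on the desk plane in camera-centered coordinates
        p1 = (L1 * math.cos(theta1), L1 * math.sin(theta1))
        p2 = (L2 * math.cos(theta2), L2 * math.sin(theta2))
        return (p2[0] - p1[0], p2[1] - p1[1])            # movement vector V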
 A difficulty here is that, although the actual average diameter of the finger has been written as D, the value of D cannot be measured by the camera. There are several means of determining D. One method is to have the user set in advance, in the key input device, the key pitch (in mm) of the virtual keyboard the user has in mind. Since the average spacing of the fingers at the home position (in pixels) corresponds to this key pitch (in mm), the actual finger diameter (in mm) can be determined from the average image diameter (in pixels) of each finger.
 Next, using equation (2), let us back-calculate how much the average image diameter (in pixels) of a finger changes when the finger moves by one row on the virtual keyboard. Take the actual average finger diameter D to be 15 mm, the camera's horizontal angle of view α to be π/2 (i.e., 90 degrees), and the number of horizontal pixels N to be 640. Let the distance L between finger and camera at the home position be 250 mm. Substituting these into equation (2) shows that the average image diameter d of the finger is about 24.5 pixels. Now let the key pitch of the virtual keyboard be 19 mm and suppose the finger moves one row closer to the camera 20 from the home position, so that L becomes 231 mm. Substituting into equation (2) gives an average image diameter d of about 26.5 pixels. Moving one row on the virtual keyboard therefore increases the average image diameter of the finger by about two pixels.
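 This back-calculation is easy to verify numerically; the snippet below simply evaluates equation (2) rearranged as d = D·N/(α·L) for the two distances given in the text.

    # Reproducing the worked example with equation (2): d = D*N/(alpha*L).
    import math

    D, alpha, N = 15.0, math.pi / 2, 640       # finger 15 mm, 90 deg, 640 px
    for L in (250.0, 231.0):                   # home row vs. one row closer
        d = D * N / (alpha * L)
        print(f"L = {L} mm -> d = {d:.2f} px") # approx. 24.45 and 26.46 px,
                                               # the ~2-pixel change in the text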
 Because the average image diameter is taken as the mean of diameters measured at various heights along the Z axis, a difference of two pixels can be detected reliably even allowing for noise. Moreover, although the explanation used only two images, at the home position and at the keystroke position, accuracy can be improved further by raising the frame rate and also using the average image diameters of the finger in frames just before and after the keystroke, while the finger is in motion.
 FIG. 15 is an explanatory diagram of equation (1). FIG. 15 shows an isosceles triangle whose far side, at distance L from the camera, is the actual average finger diameter D. The angle β can be expressed approximately by the following equation (3), using the horizontal angle of view α, the number of horizontal pixels N, and the average image diameter d (in pixels) of the finger.
 β = α·d/N    (3)
 This expression is used because the fraction of the horizontal pixel count N occupied by the average image diameter d of the finger approximates the fraction of the horizontal angle of view α occupied by the angle β in the figure. Further, the length of the line segment A perpendicular to the base of FIG. 15 is given by the following equation (4).
 A = L·sin(α·d/N)    (4)
 If the segment A is substituted, as an approximation, for the actual average finger diameter D, equation (1) is derived. In FIG. 15 the average finger diameter D is drawn large; in reality it is much smaller than the distance L, so the error introduced by substituting A is small.
 As described above, the relative change of a finger's position on the horizontal desk plane is calculated from two parameters obtained from the camera image: the X coordinate of the finger on the XZ plane and the size of the finger. Even when the mobile terminal 10 is placed so that the camera 20 is close to the desk, the camera 20 is in fact slightly above the desk surface. Consequently, when a finger moves from the home position, for example toward the camera 20, the Z coordinate of the fingertip in the image drops slightly. This change in Z coordinate could also be exploited in calculating the relative change of the finger's position on the horizontal plane. However, to raise the detection accuracy of the keystroke motion described next, it is preferable to place the mobile terminal 10 so that the camera 20 is close enough to the desk that the change in image Z coordinate caused by horizontal finger movement is as small as possible.
 FIGS. 16A and 16B are explanatory diagrams of an example of a keystroke detection method. In FIG. 16A, all fingertips of the right hand 53C at the home position are in contact with the desk. The keystroke detection unit 73 stores the Z-axis height of each fingertip at that moment and uses it in the decision. The home-position height of the middle finger is denoted ZH, and two thresholds, (ZH+a) and (ZH+b), are defined from ZH. These thresholds are used when the middle finger strikes a key one row farther from the camera on the virtual keyboard. The conditions under which the keystroke detection unit 73 judges a keystroke are that the lowest point of the middle finger is below the threshold (ZH+a) and, at the same time, that all fingers other than the middle finger are above the threshold (ZH+b). The right hand 53D in FIG. 16B, which has moved from the home position and performed a keystroke, satisfies these conditions.
 The key top of a hard keyboard sinks several millimeters when struck, and users accustomed to this habitually lift the fingers other than the striking one. The keystroke detection unit 73 exploits this habit by using the two thresholds, which raises the detection accuracy of the keystroke motion. These two thresholds are set for each finger and for each row of the virtual keyboard.
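 As a hedged illustration, the two-threshold test can be expressed as follows, assuming Z pixel values increase downward in the image so that "below a threshold" corresponds to a larger Z value; the offsets a and b and the dictionary layout are illustrative, and in a full implementation they would be looked up per finger and per keyboard row as described above.

    # Minimal sketch of the two-threshold keystroke test. `lowest` maps each
    # finger to its current lowest-point Z value; `zh` holds the stored
    # home-position heights; `a` and `b` are illustrative offsets.
    def is_keystroke(finger, lowest, zh, a=4, b=8):
        struck = lowest[finger] > zh[finger] + a       # below threshold ZH+a
        others_up = all(lowest[f] < zh[f] + b          # above threshold ZH+b
                        for f in lowest if f != finger)
        return struck and others_up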
 Thus, in the present embodiment, the mobile terminal 10 is placed so that the camera 20 is close to the desk. Even when the fingers move back and forth over the virtual keyboard, the change along the Z axis in the image is therefore minute, and keystroke detection based on each finger's home-position height on the Z axis is easy.
 FIG. 17 is an example of a hardware block diagram realizing the functions of the functional block diagram of FIG. 8. The functions from the feature extraction unit 70 through the mouse pointer control unit 79 in FIG. 8 are implemented by a software program 18 written into the ROM (Read Only Memory) 17 of the processor 14, and are loaded into and executed by the DSP 15 and the CPU (Central Processing Unit) 16. Information obtained during execution is stored in the RAM (Random Access Memory) 19. The ROM 17 may also be external.
 FIG. 18 is a diagram illustrating an example of a method for detecting the desk edge 102 and a method for measuring the area of the fingers. The image 100 is an image taken by the camera 20 placed close to the desk. As in state (A), after activating the virtual keyboard the user 101 briefly lowers the hands from the desk. The desk edge 102 then appears in the image without being blocked by the hands. The desk edge 102 can be detected, for example, by differentiating the image in the Z direction and extracting a line segment where the absolute value of the derivative is large. Next, when the user 101 places the fingers 103 on the desk as in state (B), the desk edge 102 is partially hidden by the fingers, and sections appear where the differential operation can no longer detect it. Conversely, the user's fingers must be located at exactly those pixel positions where the desk edge 102 has disappeared, so the finger position detection unit 71 can detect the fingers easily.
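 A sketch of this edge detection, under the assumption that the frame is available as a two-dimensional grayscale array with rows running along the Z axis, might be:

    # Desk-edge detection by vertical differentiation. Columns where no
    # strong intensity change is found are likely occluded by a finger.
    import numpy as np

    def detect_desk_edge(img, thresh=30):
        dz = np.abs(np.diff(img.astype(np.int32), axis=0))  # |d(intensity)/dZ|
        edge_rows = dz.argmax(axis=0)        # strongest change in each column
        visible = dz.max(axis=0) > thresh    # True where the edge is visible
        return edge_rows, visible            # occluded columns: visible == False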
 State (C) is an enlargement of the dashed rectangle in state (A), showing the desk edge 102 and the desk surface 104. State (D) is an enlargement of the dashed rectangle in state (B), with the user's finger 106 placed on the desk. Taking the frame difference between the images of state (A) and state (B) and extracting only the desk-surface region delimited by the desk edge 102 yields the finger area 107 of state (E). In practice, because the shadow of the finger falls on the desk, the area 107 cannot be obtained accurately by a simple frame difference alone, but it can be derived from the finger contour and the desk edge 102. FIG. 13 explained how to measure the finger diameter; the area can be measured as in FIG. 18. That area can also be used in the calculation, explained with FIG. 14, of the horizontal movement distance of a finger from its home position on the virtual keyboard.
 The desk-surface image delimited by the desk edge 102 shows little variation in intensity. The rest of the image, by contrast, varies widely because of clothing patterns and objects in the background, so if a segmentation threshold for finger detection is chosen over the whole image, accuracy does not improve. In one aspect, the mobile terminal 10 according to the present embodiment detects the desk edge 102 and computes parameters such as the finger segmentation threshold only over the desk-surface image, a partial region obtained from the detected desk edge 102. This raises the detection accuracy. The finger image, too, becomes a comparatively easy target for pattern recognition when, as in state (E) of FIG. 18, only the fingertips located below the desk edge 102 in the image are detected.
 FIG. 19 shows an example of a method of obtaining the nail area. The mobile terminal 10 according to the present embodiment obtains the maximum width 109 and the maximum height 110 of the nail 108 from the contour of the nail image of the finger 106. The maximum width 109 and the maximum height 110 can each be used as a feature quantity determining the size of the finger or nail image, and the product of the two can be used as a substitute for the nail area.
 FIG. 20 illustrates the procedure of a method of entering commands with the fingers. FIG. 9 explained an example of entering a command by clenching a hand; FIG. 20 shows a different method, depicting only the front image of the fingers 103 and the desk edge 102. The desk edge 102 is drawn as a straight line regardless of whether it is hidden by the fingers 103. For ordinary key input without a command, the right hand lifts briefly from the home position of state (A), as in state (B), and the index finger strikes a key, as in state (C). Meanwhile the user may lift the left hand, but may equally leave it at rest at the home position, as in state (C). In state (D), the user moves the right hand while keeping the left thumb lifted, and strikes a key as in state (E); this delivers an image different from state (C) to the key input device, and this sequence of operations can serve as a single command input to the command determination unit 77. When the user's left thumb is lifted, the mobile terminal 10 can, for example, realize the same function as pressing a shift key. Compared with operating an ordinary hard keyboard, the left hand moves less, so fast key input with few errors is achieved. Depending on the combination of lifted fingers, command input for modifier keys other than shift (Ctrl, Alt, etc.) is also possible.
 In state (F), the user of the mobile terminal 10 gives a command by lifting the thumb and index finger of the left hand. This example shows a command to move the mouse pointer (hereinafter also called a "mouse command"): on the right hand only the index finger touches the desk, and its position is associated with the position of the mouse pointer currently shown on the display screen. When the right hand moves as in state (G), the movement distance and direction of the index finger are calculated by the method explained with FIG. 13, and the mouse pointer is moved correspondingly on the display screen. Although not shown in the figure, a function equivalent to a mouse click can also be realized by lifting the right index finger from the desk at the position of state (G) and placing it back on the desk. Using this function, even without key input, operations such as pressing buttons on the display screen or scrolling the screen can be performed.
 FIGS. 21A and 21B are an example of a flowchart of the present embodiment. Referring to FIG. 21A, when the user activates the virtual keyboard mode, for example by pressing the virtual keyboard start button displayed as a software key on the screen of the mobile terminal 10, the camera 20 starts continuous shooting of the desk (step S0). Following the prescribed method of use, the user briefly lowers the hands from the desk, during which the mobile terminal 10 detects the desk edge 102 (step S1). When detection finishes, the mobile terminal 10 notifies the user and starts detecting the positions of the fingers on the desk (step S2). Position detection covers not only the positions of the fingers in the image but also their sizes, and runs continuously until the virtual keyboard program ends. Next, the mobile terminal 10 waits for the user's fingers to come to rest (step S3). When it detects that the fingers are at rest (step S3: YES), it measures the position and size of each finger at the home position and stores them in the RAM 19 (step S4). Once the home position has been stored in step S4, the mobile terminal 10 begins detecting the user's various operations. First it checks whether all fingers have again been placed on the desk and come to rest (step S5). If so (step S5: YES), the mobile terminal 10 re-detects the home position (step S6). If not (step S5: NO), the mobile terminal 10 checks whether an end command has been given (step S7). For example, when it detects that the user has clenched both hands, the mobile terminal 10 judges that an end command was given. If an end command was given (step S7: YES), the mobile terminal 10 ends the detection of finger positions on the desk, stops the continuous shooting of the desk by the camera 20 (step S8), and terminates the virtual keyboard (step S9).
 Referring to FIG. 21B, the mobile terminal 10 next checks whether the mouse command explained with FIG. 20 is being given (step S10). If it is (step S10: YES), the mobile terminal 10 checks whether the single finger corresponding to the mouse pointer is at rest on the desk (step S18). If not (step S18: NO), control returns to step S5. If it is at rest (step S18: YES), the mobile terminal 10 takes this finger's position as the current position (home position) of the mouse pointer shown on the display screen of the display 30, and stores the finger's image position and size in the RAM 19 (step S19). If the mouse command has been released (step S20: NO), control returns to step S5. If the mouse command is still held (step S20: YES), the mobile terminal 10 checks whether the user's finger has moved (step S21). If it has (step S21: YES), the mobile terminal 10 compares the position and size of the finger after the movement with the home-position values stored in the RAM 19, calculates the finger's movement direction and distance, and moves the mouse pointer on the display screen of the display 30 accordingly (step S22).
 If the finger operation detected by the mobile terminal 10 is not a mouse command (step S10: NO), the mobile terminal 10 checks whether a keystroke has occurred (step S11). If not (step S11: NO), control returns to step S5. If a keystroke has occurred (step S11: YES), the mobile terminal 10 compares the image position and size of the striking finger with that finger's image position and size at the home position stored in step S4 or step S6, calculates the finger's movement direction and distance on the desk, and determines the position of the struck key on the virtual keyboard (step S12). Next, the mobile terminal 10 checks whether a command is being given (step S13). If not (step S13: NO), the mobile terminal 10 generates the key code of the virtual QWERTY keyboard corresponding to that position (step S14). If a command is given with the hand opposite the striking hand (step S13: YES) and that command is a shift command, the mobile terminal 10 judges that the shift key was held during the keystroke and generates the corresponding key code (step S15). If the command is a numeric-keypad command, the mobile terminal 10 judges that the virtual keyboard has switched to a numeric keyboard and generates the key code of the numeric key at the struck position (step S16). After the key input, the mobile terminal 10 waits for the striking finger to leave the desk (step S17). When it does (step S17: YES), control returns to step S5.
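 The command-dependent branching of steps S13 through S16 amounts to a small dispatch. The following sketch is illustrative only: the layout tables and the command encoding are assumptions, since the text does not fix a concrete representation, and shift is modeled here simply as upper-casing a letter code.

    # Hedged sketch of the key-code dispatch in steps S13-S16.
    def generate_key_code(key_pos, command, qwerty_map, tenkey_map):
        if command is None:                     # S14: plain QWERTY input
            return qwerty_map[key_pos]
        if command == "shift":                  # S15: shift held via lifted thumb
            return qwerty_map[key_pos].upper()
        if command == "tenkey":                 # S16: numeric keyboard layout
            return tenkey_map[key_pos]
        raise ValueError(f"unknown command: {command}")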
 FIG. 21 does not show error handling, such as for a keystroke at a position where there is no key. Also, in FIG. 21 the shift command and the numeric-keypad command were described as examples of key input commands, but the key input commands are not limited to these two. Beyond them, a control command indicating that a character or numeral key is being pressed together with the control key, or a function key command operable with one hand, can be added. The function key command can, for example, be configured by assigning the F1 through F12 keys of a QWERTY keyboard, the Back Space key, the Delete key, and so on to the keys of a keyboard shaped like a numeric keypad, making keys far from the home position available instantly.
 FIGS. 22A and 22B are an example of a virtual keyboard in which the key positions assigned to the individual fingers overlap one another. The keyboard 112 of FIG. 22A is the virtual keyboard assigned to the index finger of the right hand 111. The index finger rests on the key "C", its home position. The user rotates the hand about the wrist, and the index finger strikes the keys "A" through "E". FIG. 22B is the virtual keyboard assigned to the middle finger of the same right hand 111. The middle finger rests on the key "H", its home position. When the user rotates the hand in the same way, the middle finger can strike the keys "F" through "J". If the right hand 111 of FIG. 22A and the right hand 111 of FIG. 22B are superimposed, the two virtual keyboards partly overlap, so multiple keys exist at the same position on the desk.
 According to the embodiment of FIGS. 22A and 22B, with each of the eight fingers other than the thumbs assigned five keys, a virtual keyboard with more than 40 kinds of keys can be realized. In this case the fingers move only left and right; no forward and backward movement is needed. Therefore, in this embodiment the finger position detection unit 71 need only detect changes in finger position in the camera image, and need not detect changes in finger or nail size. Each finger is responsible for only one key per column, but because the lengths of the fingers differ, this is again a virtual keyboard in which the keys assigned to each finger lie at different distances from the camera and whose shape is unique to the user.
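 As an illustration of this layout, the per-finger key assignment of FIGS. 22A and 22B could be tabulated as follows; only the two fingers described in the text are filled in, and the remaining six entries would be hypothetical extensions of the same pattern.

    # Illustrative per-finger key table for the overlapping layout of
    # FIGS. 22A/22B: each finger owns five keys selected by rotating the
    # hand about the wrist (8 fingers x 5 keys > 40 keys in total).
    FINGER_KEYS = {
        "right_index":  ["A", "B", "C", "D", "E"],   # home key: "C"
        "right_middle": ["F", "G", "H", "I", "J"],   # home key: "H"
    }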
 The embodiment of FIGS. 22A and 22B can be extended so that the virtual keyboard assigned to each finger has multiple rows. With multiple rows, forward and backward movement of each finger also becomes necessary, and the finger position detection unit 71 must then also detect changes in the size of fingers and nails. The functional configurations of FIGS. 7 and 8 and the flowchart of FIG. 21 also serve as explanatory diagrams of a virtual keyboard in which the key positions assigned to the individual fingers overlap.
 Since the virtual keyboard according to the present embodiment is thus configured with the key positions assigned to the individual fingers overlapping one another, the finger movement required for key operation can be minimized. This yields a virtual keyboard that tires the hands and arms little and minimizes the desk area required for operation. It is desirable that, after mastering a virtual keyboard with a familiar layout such as QWERTY, the user proceed, as a next step, to this newly laid-out virtual keyboard in which the key positions assigned to the individual fingers overlap.
 FIGS. 23A and 23B are other examples of system perspective views of the present embodiment. The description so far has centered on the mobile terminal 10, but the virtual keyboard can be realized in other forms as well. The virtual keyboard of the present embodiment has the characteristic that the fingers can be held in a relaxed posture, which can alleviate tendonitis, stiff shoulders, and the like caused by long hours of key operation. The virtual keyboard according to the present embodiment is therefore also useful in information apparatus such as personal computers. FIG. 23A is an example in which the key input device 118 is configured as an external device of a personal computer. The key input device 118 is placed on the desk, and the camera 116 captures images of the user's fingers from a position close to the desk. The entered key information is sent to the personal computer 114 through the cable 117 and displayed on the monitor 115. FIG. 23B is an example in which the key input device function is realized by a program on the personal computer 114. Specifically, the camera 116 is attached to the monitor 115, captures images of the desk, and sends them to the personal computer 114 through the cable 117. The CPU (not shown) of the personal computer 114 executes the processes performed on the mobile terminal 10 (FIG. 21) and functions as a virtual keyboard.
 The description so far has assumed that nothing is attached to the fingers operating the virtual keyboard, but the embodiment is not limited to this. For example, to make it easier to identify which finger is at the home position or which finger struck a key, the user may attach to each finger a device such as a finger cot, each bearing mutually distinct features identifiable by the camera. Alternatively, with such devices attached to the fingertips, the mobile terminal 10 or another key input device may be configured to detect the position and size of the device instead of the position and size of the finger itself.
 In yet another aspect, a user not yet accustomed to this key input device can lay on the desk a sheet printed with a keyboard matched to the lengths of his or her own fingers, and operate while confirming the key positions.
 In one aspect, the mobile terminal 10 of the present embodiment is realized by a processor executing a computer program stored on a data recording medium. The computer program can also be executed by a computer system of well-known configuration. The essential part of the present invention can therefore be said to be software stored in a RAM, a hard disk, a CD-ROM, or another data recording medium, or software downloadable over a network. Since the operation of each piece of computer system hardware is well known, a detailed description will not be repeated.
 The data recording medium is not limited to a CD-ROM, an FD (Flexible Disk), or a hard disk; it may also be a medium that carries the program in a fixed manner, such as magnetic tape, cassette tape, an optical disc (MO (Magneto-Optical Disc) / MD (Mini Disc) / DVD (Digital Versatile Disc)), an IC (Integrated Circuit) card (including memory cards), an optical card, or semiconductor memory such as mask ROM, EPROM (Electronically Programmable Read-Only Memory), EEPROM (Electronically Erasable Programmable Read-Only Memory), or flash ROM.
 The program referred to here may include not only a program directly executable by the CPU, but also a program in source form, a compressed program, an encrypted program, and the like.
 In one aspect, the key input device 12 according to the present embodiment enables fast two-handed key input on a virtual keyboard whose position and shape are optimal for the user and which reduces fatigue of the fingers and arms, without impairing the wearability of the mobile terminal. Because the keys in each row of an ordinary hard keyboard are aligned in a straight horizontal line, the user must hold fingers of different lengths so that the fingertips are nearly perpendicular to the keyboard; with this configuration, by contrast, it is possible, for example, to hold all fingers stretched straight out. In that case, the keys in each row of the virtual keyboard are not aligned in a straight horizontal line; since for most users the middle finger is the longest, the result is a virtual keyboard shaped optimally for the user's fingers, in which the column of keys at the middle finger's home position (for a QWERTY keyboard, the D and K keys) lies closer to the camera than the keys in the other columns. And because the fingers are stretched out comfortably, the user strikes keys with the pad of the finger (the part bearing the fingerprint). On an ordinary hard keyboard, striking with the finger pad easily causes a double keystroke in which the key in front is pressed at the same time, but since this is a virtual keyboard, the input can be processed as if only the key at the fingertip position had been struck. Striking with the finger pad also greatly reduces the impact on the finger joints and the like. This property is also effective when applied to information apparatus other than mobile terminals that require key input, such as desktop computers.
 In another aspect, the key input device 12 can introduce, in addition to the two-dimensional position information of the fingers in the two-dimensional image obtained by the camera 20, the size of the finger image as a further piece of information, making the input information three-dimensional. Taking the size of the finger image when the finger is at the home position as the reference, the input device can calculate the relative movement distance from the change in finger size.
 In one aspect, since the camera 20 is placed close to the desk and configured to capture a front image of the fingers on the desk, the key input device 12 can easily measure the height by which a finger has risen from the desk, and can easily calculate the relative change in the two-dimensional position of the fingers on the desk. Also, in judging a keystroke, the key input device 12 can easily discriminate, from analysis of the image, the difference in height above the desk between the striking finger and the other fingers.
 Further, with the key input device 12 according to the present embodiment, the user can confirm the key layout and the finger positions currently detected by the key input device 12. When entering keys slowly, the user can strike a key after confirming that the detected finger position is displayed over the key to be entered next, enabling error-free key input. The user can also confirm whether the camera 20 is capturing the whole of the virtual keyboard being used. If part of the virtual keyboard lies outside the camera's imaging range, the user can correct the imaging range by adjusting the spacing of the fingers or the distance between the fingers and the camera.
 With the key input device 12 according to another embodiment, the user can switch the keyboard type instantly without performing any operation on the key input device 12. For example, once the user clenches the left hand into a fist, the key input device 12 can easily detect that the left hand is not active.
 The key input device 12 according to another embodiment can detect the contours of the fingers and nails captured by the camera 20 more clearly, and can therefore improve the measurement accuracy of finger size. The user can also use the key input device 12 in a dark environment.
 Although the present invention has been described and illustrated in detail, it is clearly understood that this is by way of illustration and example only and is not to be taken by way of limitation, the scope of the invention being interpreted by the terms of the appended claims.
 10 mobile terminal; 11 host processor; 12 key input device; 13 key input processor; 14 processor; 15 DSP; 16 CPU; 17 ROM; 18 program; 19 RAM; 20 camera placed close to the desk; 21, 22 range of the camera's horizontal angle of view; 23, 24, 25, 26, 27, 28 designated horizontal pixel positions of the VGA image; 29 DSP; 30 display; 31 display unit for the input character string; 32 keyboard display unit; 33, 34 mouse pointer display and finger image display unit; 40 light irradiation unit; 41, 42 horizontal irradiation range of the light; 45 microphone; 46 speaker; 50A, 50B, 50C left-hand images placed on a virtual keyboard aligned in a straight line; 50D clenched left-hand image; 51A, 51B, 51C, 51D, 51E, 51F right-hand images placed on a virtual keyboard aligned in a straight line; 52A, 52B, 52C, 52D, 52E left hand placed on the virtual keyboard in a relaxed posture; 53A, 53B, 53C, 53D right hand placed on the virtual keyboard in a relaxed posture; 60A, 60B, 60C virtual QWERTY keyboard; 60D virtual numeric keyboard; 61 keyboard display; 62, 62A, 62B finger detection position display; 63A, 63B finger center position display; 64A, 64B shooting area display; 70 finger and nail contour feature extraction unit; 71 finger position detection unit; 72 home position detection unit; 73 keystroke detection unit; 74 key code generation unit; 75 learning unit using an initial character string; 76 learning unit during character input; 77 one-hand/two-hand determination unit; 78 input key reception unit; 79 mouse pointer control unit; 80A, 80B contour lines of four fingers; 81A, 81B, 81C, 81D, 81E, 81F widths of finger images; 82 contour line of the middle finger; 83 horizontal width of the contour; 84 midpoint of the horizontal width of the contour; 85 regression line through the n midpoints; 86 position of the middle finger on the image; 87 diameter of the middle finger; 90 camera module; 100 image taken by the camera; 101 user; 102 desk edge; 103 user's fingers placed on the desk; 104 desk surface; 106 user's finger; 107 area of the fingers delimited by the desk edge; 108 nail image; 109 horizontal diameter of the nail; 110 vertical diameter of the nail; 111 right hand placed on the virtual keyboard; 112 virtual keyboard assigned to the right index finger; 113 virtual keyboard assigned to the right middle finger; 114 personal computer; 115 monitor; 116 camera; 117 cable; 118 key input device; D actual diameter of the middle finger (mm); d1, d2 average diameters of the middle-finger image (pixels); L1, L2 distances of the middle finger from the camera; P1, P2 positions of the middle finger on the virtual keyboard; V movement vector of the middle finger; XP, ZP, ZH coordinates of the middle finger; α horizontal angle of view of the camera; θ, θ1, θ2 angles from the left edge of the horizontal field of view to the finger position.

Claims (16)

  1.  An input device, comprising:
     a camera configured to continuously capture front images of the fingers of a user of the input device when the input device is placed close to a desk;
     a finger position detection unit configured to detect the positions of the user's fingers from the captured front images;
     a home position detection unit configured to detect, when the user's fingers touch the desk and come to rest, the positions of the user's fingers at that moment as the home position on a virtual keyboard in which the keys assigned to each finger lie at different distances from the camera and whose shape is unique to the user;
     a keystroke detection unit configured to detect that a finger of the user has struck a key on the virtual keyboard; and
     a key code generation unit configured to generate the code of the key on the virtual keyboard corresponding to the position where the keystroke occurred.
  2.  The input device according to claim 1, wherein the finger position detection unit is configured to calculate the position of a finger on the virtual keyboard after movement by comparing the position of the finger on the image and the size of the finger or nail at the home position with the position of the finger on the image and the size of the finger or nail after the finger has moved from the home position.
  3.  The input device according to claim 1 or 2, wherein, when the number of fingertips of only one hand touching the desk and at rest is smaller than a predetermined number, the virtual keyboard has the function of a modifier key according to the combination of fingertips of that hand touching the desk.
  4.  The input device according to claim 1 or 2, wherein, when the number of fingertips of only one hand touching the desk and at rest is smaller than a predetermined number, the input device is configured to switch automatically, according to the combination of fingertips of that hand touching the desk, whether the virtual keyboard on the desk operated by the other hand is an alphanumeric keyboard or another keyboard adapted to accept one-handed operation.
  5.  An input device, comprising:
     a camera configured to continuously capture front images of the fingers of a user of the input device when the input device is placed close to a desk;
     a finger position detection unit configured to detect the positions of the user's fingers from the captured front images;
     a monitor configured to display at least a mouse pointer in a display area;
     a home position detection unit configured to associate, when a finger of the user touches the desk and comes to rest, the position of the user's finger at that moment with the current position of the mouse pointer in the display area; and
     a mouse pointer control unit configured to move the mouse pointer in the display area according to the movement direction and distance when the user's finger moves on the desk,
     wherein the finger position detection unit is configured to calculate the movement direction and distance of the finger on the desk by comparing the position of the finger on the image and the size of the finger or nail at the home position with the position of the finger on the image and the size of the finger or nail after the finger has moved from the home position.
  6.  The input device according to any one of claims 1 to 5, wherein the finger position detection unit is configured to use desk edge information to measure the position and size of the fingers.
  7.  An information apparatus comprising the input device according to any one of claims 1 to 6.
  8.  A program for causing a computer to function as an input device, the program causing the computer to execute:
     a step of continuously capturing front images of the fingers of a user of the input device with a camera placed close to a desk;
     a step of detecting the positions of the user's fingers from the captured front images;
     a step of, when the user's fingers come to rest in contact with the desk, detecting the positions of the user's fingers at that time as the home position on a virtual keyboard in which the keys assigned to the respective fingers are at different distances from the camera and which has a shape unique to the user;
     a step of detecting that one of the user's fingers has struck a key on the virtual keyboard; and
     a step of generating the code of the key on the virtual keyboard corresponding to the position where the keystroke occurred.
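Read as a control flow, claim 8 is an event loop: frames arrive, finger positions are extracted, coming to rest fixes the user-specific home position, and a detected strike is translated into the key code assigned to that spot. The runnable sketch below shows only that sequencing, with toy stand-ins for the camera and the detectors the claim presupposes; the two-key layout is invented.

```python
from dataclasses import dataclass

@dataclass
class Strike:
    finger: str
    dz_mm: float   # positive = reached away from the camera (upper row)

class ToyDetector:
    def resting(self, frame) -> bool:
        return frame["event"] == "rest"
    def strike(self, frame) -> Strike | None:
        if frame["event"] == "strike":
            return Strike(frame["finger"], frame["dz_mm"])
        return None

def key_code(strike: Strike) -> str:
    # Per claim 1's geometry, each finger covers keys at different camera
    # distances; here the left index simply has a home key and an upper key.
    if strike.finger == "index_L":
        return "KEY_R" if strike.dz_mm > 15 else "KEY_F"
    return "KEY_UNKNOWN"

frames = [
    {"event": "rest"},                                        # home position set
    {"event": "strike", "finger": "index_L", "dz_mm": 2.0},   # home row -> F
    {"event": "strike", "finger": "index_L", "dz_mm": 20.0},  # upper row -> R
]

detector, home_set = ToyDetector(), False
for f in frames:
    if not home_set:
        home_set = detector.resting(f)   # fix home to this user's hand shape
        continue
    s = detector.strike(f)
    if s:
        print(key_code(s))               # -> KEY_F, then KEY_R
```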
  9.  The program according to claim 8, wherein the step of detecting the positions of the fingers includes a step of calculating the position of a finger on the virtual keyboard after movement by comparing the position of the finger in the image and the size of the finger or nail at the home position with the position of the finger in the image and the size of the finger or nail after the finger has moved from the home position.
  10.  A program for causing a computer to function as an input device, the program causing the computer to execute:
     a step of continuously capturing front images of the fingers of a user of the input device with a camera placed close to a desk;
     a step of detecting the positions of the user's fingers from the captured front images;
     a step of displaying a mouse pointer in a display area of a monitor;
     a step of, when the user's fingers come to rest in contact with the desk, associating the positions of the user's fingers at that time with the home position, which is the current position of the mouse pointer in the display area; and
     a step of, when the user's fingers move on the desk, moving the mouse pointer in the display area according to the direction and distance of the movement,
     wherein the step of detecting the positions of the fingers includes a step of calculating the direction and distance of finger movement on the desk by comparing the position of a finger in the image and the size of the finger or nail at the home position with the position of the finger in the image and the size of the finger or nail after the finger has moved from the home position.
  11.  The program according to any one of claims 8 to 10, wherein the program causes the computer to execute, as the step of detecting the positions of the fingers, a step of using information on the desk edge to measure the positions and sizes of the fingers.
  12.  A computer-readable non-volatile data recording medium storing the program according to any one of claims 8 to 11.
  13.  A method for inputting characters using an input device, the method comprising:
     a step of continuously capturing front images of the fingers of a user of the input device with a camera placed close to a desk;
     a step of detecting the positions of the user's fingers from the captured front images;
     a step of, when the user's fingers come to rest in contact with the desk, detecting the positions of the user's fingers at that time as the home position on a virtual keyboard in which the keys assigned to the respective fingers are at different distances from the camera and which has a shape unique to the user;
     a step of detecting that one of the user's fingers has struck a key on the virtual keyboard; and
     a step of generating the code of the key on the virtual keyboard corresponding to the position where the keystroke occurred.
  14.  The method according to claim 13, wherein the step of detecting the positions of the fingers includes a step of calculating the position of a finger on the virtual keyboard after movement by comparing the position of the finger in the image and the size of the finger or nail at the home position with the position of the finger in the image and the size of the finger or nail after the finger has moved from the home position.
  15.  A method for inputting characters using an input device, the method comprising:
     a step of continuously capturing front images of the fingers of a user of the input device with a camera placed close to a desk;
     a step of detecting the positions of the user's fingers from the captured front images;
     a step of displaying a mouse pointer in a display area;
     a step of, when the user's fingers come to rest in contact with the desk, associating the positions of the user's fingers at that time with the home position, which is the current position of the mouse pointer in the display area; and
     a step of, when the user's fingers move on the desk, moving the mouse pointer in the display area according to the direction and distance of the movement,
     wherein the step of detecting the positions of the fingers includes a step of calculating the direction and distance of finger movement on the desk by comparing the position of a finger in the image and the size of the finger or nail at the home position with the position of the finger in the image and the size of the finger or nail after the finger has moved from the home position.
  16.  The method according to any one of claims 13 to 15, wherein the step of detecting the positions of the fingers includes a step of using information on the desk edge to measure the positions and sizes of the fingers.
PCT/JP2011/073203 2010-10-28 2011-10-07 Input device, information apparatus provided with the input device, program for causing computer to function as input device, and method for using the input device to input characters WO2012056864A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2010-256288 2010-10-28
JP2010256288 2010-10-28
JP2010-288701 2010-12-07
JP2010288701A JP4846871B1 (en) 2010-10-28 2010-12-07 KEY INPUT DEVICE, PORTABLE TERMINAL PROVIDED WITH THE SAME, AND PROGRAM FOR MAKING PORTABLE TERMINAL FUNCTION AS INPUT DEVICE

Publications (1)

Publication Number Publication Date
WO2012056864A1

Family

ID=45475296

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/073203 WO2012056864A1 (en) 2010-10-28 2011-10-07 Input device, information apparatus provided with the input device, program for causing computer to function as input device, and method for using the input device to input characters

Country Status (2)

Country Link
JP (1) JP4846871B1 (en)
WO (1) WO2012056864A1 (en)


Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5621927B2 (en) * 2011-06-23 2014-11-12 富士通株式会社 Information processing apparatus, input control method, and input control program
JP5799817B2 (en) * 2012-01-12 2015-10-28 富士通株式会社 Finger position detection device, finger position detection method, and computer program for finger position detection
JP6232694B2 (en) * 2012-10-15 2017-11-22 キヤノンマーケティングジャパン株式会社 Information processing apparatus, control method thereof, and program
KR101411569B1 (en) * 2013-06-05 2014-06-27 고려대학교 산학협력단 Device and method for information processing using virtual keyboard
JP6524589B2 (en) * 2013-08-30 2019-06-05 国立大学法人山梨大学 Click operation detection device, method and program
KR101534282B1 (en) 2014-05-07 2015-07-03 삼성전자주식회사 User input method of portable device and the portable device enabling the method
KR101873842B1 (en) * 2015-03-11 2018-07-04 한양대학교 산학협력단 Apparatus for providing virtual input using depth sensor and method for using the apparatus
KR101577359B1 (en) * 2015-03-16 2015-12-14 박준호 Wearable device
JP2017037583A (en) * 2015-08-14 2017-02-16 レノボ・シンガポール・プライベート・リミテッド Computer input system
JP6570376B2 (en) * 2015-08-27 2019-09-04 キヤノン株式会社 Information processing apparatus, control method therefor, program, and storage medium
KR101976605B1 (en) * 2016-05-20 2019-05-09 이탁건 A electronic device and a operation method
KR101853339B1 (en) * 2016-12-02 2018-04-30 광운대학교 산학협력단 A keyboard apparatus based motion recognition for the smart mini display
KR101883866B1 (en) * 2016-12-23 2018-08-01 단국대학교 산학협력단 Ground contact type finger input device and method
KR101998786B1 (en) * 2017-08-31 2019-07-10 단국대학교 산학협력단 Non-contact Finger Input Device and Method in Virtual Space
WO2020039703A1 (en) * 2018-08-21 2020-02-27 株式会社Nttドコモ Input device


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE69204045T2 (en) * 1992-02-07 1996-04-18 Ibm Method and device for optical input of commands or data.
JP2007328445A (en) * 2006-06-06 2007-12-20 Toyota Motor Corp Input device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004500657A (en) * 2000-02-11 2004-01-08 カネスタ インコーポレイテッド Data input method and apparatus using virtual input device
JP2003288156A (en) * 2002-03-28 2003-10-10 Minolta Co Ltd Input device
JP2007133835A (en) * 2005-11-14 2007-05-31 Sharp Corp Virtual key input device, information terminal device, charger for information terminal device, and program
JP2008123316A (en) * 2006-11-14 2008-05-29 Konica Minolta Holdings Inc Data input method and data input device
JP2008234594A (en) * 2007-03-23 2008-10-02 Denso Corp Operation input device

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014006594A (en) * 2012-06-21 2014-01-16 Fujitsu Ltd Character input program, information processor, and character input method
JP2014119660A (en) * 2012-12-18 2014-06-30 Konica Minolta Inc Image forming apparatus
JP2016054852A (en) * 2014-09-08 2016-04-21 嘉泰 小笠原 Electronic apparatus
WO2016079774A1 (en) * 2014-11-21 2016-05-26 Johri Abhishek System and method for data and command input
US10705619B2 (en) 2014-11-21 2020-07-07 Abhishek Johri System and method for gesture based data and command input via a wearable device
JP2020080049A (en) * 2018-11-13 2020-05-28 クリスタルメソッド株式会社 Estimation system and estimation apparatus
JP2020177363A (en) * 2019-04-16 2020-10-29 クリスタルメソッド株式会社 Estimation system and estimation device
WO2021054589A1 (en) 2019-09-18 2021-03-25 Samsung Electronics Co., Ltd. Electronic apparatus and controlling method thereof
EP4004695A4 (en) * 2019-09-18 2022-09-28 Samsung Electronics Co., Ltd. Electronic apparatus and controlling method thereof
US11709593B2 (en) 2019-09-18 2023-07-25 Samsung Electronics Co., Ltd. Electronic apparatus for providing a virtual keyboard and controlling method thereof
CN112183447A (en) * 2020-10-15 2021-01-05 尚腾 Information input system based on image recognition
CN112540679A (en) * 2020-12-11 2021-03-23 深圳市创智成科技股份有限公司 Keyboard pattern projection method
US11853509B1 (en) 2022-05-09 2023-12-26 Microsoft Technology Licensing, Llc Using a camera to supplement touch sensing

Also Published As

Publication number Publication date
JP4846871B1 (en) 2011-12-28
JP2012108857A (en) 2012-06-07

Similar Documents

Publication Publication Date Title
WO2012056864A1 (en) Input device, information apparatus provided with the input device, program for causing computer to function as input device, and method for using the input device to input characters
US11093086B2 (en) Method and apparatus for data entry input
US20210271340A1 (en) Gesture recognition devices and methods
KR100811015B1 (en) Method and apparatus for entering data using a virtual input device
EP2717120B1 (en) Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications
US9529523B2 (en) Method using a finger above a touchpad for controlling a computerized system
US9317130B2 (en) Visual feedback by identifying anatomical features of a hand
US9430147B2 (en) Method for user input from alternative touchpads of a computerized system
US9477874B2 (en) Method using a touchpad for controlling a computerized system with epidermal print information
TWI303773B (en)
US20060028457A1 (en) Stylus-Based Computer Input System
US20150363038A1 (en) Method for orienting a hand on a touchpad of a computerized system
GB2470654A (en) Data input on a virtual device using a set of objects.
US9557825B2 (en) Finger position sensing and display
JP2013171529A (en) Operation input device, operation determination method, and program
JP5928628B2 (en) Virtual keyboard input method
CN110291495B (en) Information processing system, information processing method, and program
JP2013077180A (en) Recognition device and method for controlling the same
JP6481360B2 (en) Input method, input program, and input device
JP2016122475A (en) Information device having virtual keyboard
JP2022143788A (en) Display device, method for display, and program
Khare et al. QWERTY Keyboard in Virtual Domain Using Image Processing
WO2019169644A1 (en) Method and device for inputting signal
Pullan et al. High Resolution Touch Screen Module
JP5029472B2 (en) Character input device, character input method, program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11836004

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11836004

Country of ref document: EP

Kind code of ref document: A1