WO2007097548A1 - Method and apparatus for user-interface using the hand trace - Google Patents

Method and apparatus for user-interface using the hand trace Download PDF

Info

Publication number
WO2007097548A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand shape
hand
user interface
mouse
character
Prior art date
Application number
PCT/KR2007/000846
Other languages
French (fr)
Inventor
Cheol Woo Kim
Original Assignee
Cheol Woo Kim
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020070015439A external-priority patent/KR100858358B1/en
Application filed by Cheol Woo Kim filed Critical Cheol Woo Kim
Publication of WO2007097548A1 publication Critical patent/WO2007097548A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • the present invention relates to a user interface method and device using hand gesture recognition, and more particularly to a user interface method and device using hand gesture recognition which recognizes the movement and shape of a hand of a user to provide a user interface to an electronic device such as a computer.
  • the conventional input device such as the mouse also has spatial limitations in that it must be used on a flat surface and cannot correctly input data when the surface of the desk is covered with glass or carries a large amount of dust or other substances.
  • the present invention has been made in view of the above problems, and it is an object of the present invention to provide a user interface device and method using hand gesture recognition which allows users to input desired data only with their hand action and movement without using any separate input device.
  • the above object can be accomplished by the provision of a user interface device using hand gesture recognition, the user interface device comprising image capture means for capturing a hand of a user, the hand moving in the air; and user command identification means for generating a data input signal corresponding to mouse or keyboard input based on the movement of the hand captured by the image capture means and outputting the generated data input signal to user command processing means that processes mouse or key input.
  • the user command identification means includes a hand shape extractor for extracting a hand shape from an image received from the image capture means; and a hand shape recognizer for recognizing a change and movement of the hand shape extracted from the hand shape extractor.
  • the user command identification means further includes a noise remover for removing noise from a hand shape image extracted at the hand shape extractor and then outputting the hand shape image to the hand shape recognizer.
  • the hand shape extractor extracts a hand shape from the image received from the image capture means using a color difference between the hand and a background.
  • the hand shape extractor extracts a hand contour from the image received from the image capture means.
  • the hand shape recognizer turns on or off a mouse function according to an action of sequentially bending and straightening an index finger.
  • the hand shape recognizer turns off a mouse function when an index finger is bent and turns on the mouse function when the index finger is straightened.
  • the hand shape recognizer turns on a mouse function when middle, ring, and little fingers are bent and turns off the mouse function when the middle, ring, and little fingers are straightened.
  • the hand shape recognizer turns off a mouse function when five fingers are all bent.
  • the hand shape recognizer measures an area of the hand shape extracted at the hand shape extractor and recognizes a state of the hand shape based on a change in the measured area.
  • the user command identification means includes at least one of a position signal generator for tracing a movement path of the hand shape recognized at the hand shape recognizer and generating and outputting a mouse position signal corresponding to the traced movement path to the user command processing means; a click signal generator for generating a mouse click signal corresponding to a change in the hand shape recognized at the hand shape recognizer and outputting the mouse click signal to the user command processing means; a character signal generator for tracing a movement path of the hand shape recognized at the hand shape recognizer and generating and outputting a character signal corresponding to the traced movement path to the user command processing means; and a scroll signal generator for generating a mouse scroll signal corresponding to the hand shape recognized at the hand shape recognizer and outputting the mouse scroll signal to the user command processing means.
  • the position signal generator traces a movement path of an index finger.
  • the click signal generator generates one of a left click signal corresponding to a left mouse button and a right click signal corresponding to a right mouse button according to an action of sequentially bending and straightening a thumb with a palm facing the image capture means.
  • the click signal generator generates one of a left click signal corresponding to a left mouse button and a right click signal corresponding to a right mouse button according to an action of sequentially bending and straightening a thumb with a back of the hand facing the image capture means.
  • the click signal generator generates one of a left click signal corresponding to a left mouse button and a right click signal corresponding to a right mouse button according to an action of sequentially bending and straightening an index finger.
  • the click signal generator generates one of a left click signal corresponding to a left mouse button and a right click signal corresponding to a right mouse button according to an action of sequentially bending and straightening middle, ring, and little fingers.
  • the character signal generator traces a movement path of an index finger.
  • the scroll signal generator generates a scroll signal according to a direction of a predetermined finger in a state of the hand shape recognized at the hand shape recognizer.
  • the user command identification means further includes a mode setter for controlling one of the position signal, the character signal, the click signal, and the scroll signal to be output to the user command processing means, based on a change in the hand shape extracted at the hand shape extractor or based on a mode setting key signal input by operating a predetermined key button.
  • the character signal generator includes a stroke generation module for tracing a movement path of a predetermined feature point in the hand shape recognized at the hand shape recognizer and generating a stroke corresponding to the traced movement path; a stroke combination module for combining strokes generated at the stroke generation module to generate a character pattern; a pattern DB in which patterns of various characters are stored; and a pattern comparison module for generating a character signal corresponding to the character pattern generated at the stroke combination module when the character pattern generated at the stroke combination module is present in the pattern DB.
  • the character signal generator includes a pattern generation module for tracing a movement path of a predetermined feature point in the hand shape recognized at the hand shape recognizer and generating a movement path pattern corresponding to the traced movement path; a pattern DB in which various character patterns are stored; and a pattern comparison module for generating a character signal corresponding to a character pattern, matching the movement path pattern, included in the character patterns stored in the pattern DB.
  • a user interface method using hand gesture recognition comprising the steps of a) extracting a hand shape from an image captured by image capture means; b) recognizing a change and movement of the hand shape extracted at the step a), tracing a movement path of the recognized hand, generating a mouse position signal corresponding to the traced movement path, and generating a mouse click signal corresponding to the change in the extracted hand shape; and c) outputting the click signal and the position signal generated at the step b) to user command processing means that processes mouse input.
  • the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
  • the step a) includes extracting a hand contour from the image received from the image capture means.
  • the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
  • a user interface method using hand gesture recognition comprising the steps of a) extracting a hand shape from an image captured by image capture means; b) recognizing a change and movement of the hand shape extracted at the step a), tracing a movement path of the recognized hand, and generating a stroke corresponding to the traced movement path; c) combining strokes generated at the step b) to generate a character pattern and generating a character signal corresponding to the generated character pattern; and d) outputting the character signal generated at the step c) to user command processing means that processes keyboard input.
  • the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
  • the step a) includes extracting a hand contour from the image received from the image capture means.
  • the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
  • a user interface method using hand gesture recognition comprising the steps of a) extracting a hand shape from an image captured by image capture means; b) recognizing a change and movement of the hand shape extracted at the step a), tracing a movement path of the recognized hand, generating a movement path pattern corresponding to the traced movement path, and comparing the generated movement path pattern with previously stored character patterns; c) generating a character signal corresponding to a character pattern, matching the movement path pattern, included in the previously stored character patterns; and d) outputting the character signal generated at the step c) to user command processing means that processes keyboard input.
  • the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
  • the step a) includes extracting a hand contour from the image received from the image capture means.
  • the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
  • a user interface method using hand gesture recognition comprising the steps of a) extracting a hand shape from an image captured by image capture means; b) recognizing a change and movement of the hand shape extracted at the step a) and generating a scroll signal, instructing scrolling to a predetermined direction, based on the recognized hand shape if the recognized hand shape is a predetermined specific shape; and c) outputting the scroll signal generated at the step b) to user command processing means that processes mouse input.
  • the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
  • the step a) includes extracting a hand contour from the image received from the image capture means.
  • the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
  • a user interface method using hand gesture recognition comprising the steps of a) extracting a hand shape from an image captured by image capture means; b) recognizing a change and movement of the hand shape extracted at the step a); c) setting a data input mode based on a change in the hand shape extracted at the step b) or based on a mode setting key signal input by operating a predetermined key button; d) generating one of a mouse position signal, a mouse click signal, a mouse scroll signal, and a keyboard character signal corresponding to the change and movement of the hand shape recognized at the step b), based on the data input mode set at the step c); and e) outputting the signal generated at the step d) to user command processing means that processes mouse or keyboard input.
  • the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
  • the step a) includes extracting a hand contour from the image received from the image capture means.
  • the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
  • the present invention captures the shape and movement of the hand of the user using the image capture means such as a CCD camera installed on an electronic device and extracts the conventional mouse input data, character input data, and scroll input signal from the captured hand shape and movement, thereby making it easy to input data without a separate data input device.
  • the present invention may be applied to a television whose functions are controlled using the conventional remote controller. This allows users to easily perform, only with their hand gestures, the functions of the television such as channel switching and volume control without managing the separate remote controller.
  • the present invention may also be applied to other electronic devices with various functions such as electronic blackboards.
  • FIG. 1 is a block diagram of a user interface device that uses hand gesture recognition according to an embodiment of the present invention
  • FIG. 2 is a block diagram of user command identification means in FIG. 1 for inputting mouse data according to a first embodiment of the present invention
  • FIG. 3 is a flow chart illustrating a method for inputting mouse data according to the first embodiment of the present invention
  • FIGS. 4 and 5 are block diagrams of user command identification means in FIG. 1 for inputting character data according to a second embodiment of the present invention and according to a modification of the second embodiment, respectively
  • FIGS. 6 and 7 are flow charts illustrating a method for inputting character data according to the second embodiment of the present invention and according to the modification of the second embodiment, respectively
  • FIG. 8 is a block diagram illustrating user command identification means in FIG. 1 for inputting screen scroll data according to a third embodiment of the present invention
  • FIG. 9 is a flow chart illustrating a method for inputting screen scroll data according to the third embodiment of the present invention
  • FIG. 10 is a block diagram of the user command identification means in FIG. 1 for integrating and implementing mouse and character input modes according to a fourth embodiment of the present invention
  • FIG. 11 is a flow chart illustrating a method for integrating and implementing mouse and character input modes according to the fourth embodiment of the present invention
  • FIGS. 12 to 14 illustrate one example of extracting and recognizing a hand shape
  • FIGS. 15 to 17 illustrate another example of extracting and recognizing a hand shape
  • FIG. 18 illustrates an example of various character patterns stored in a pattern database
  • FIGS. 19 to 23 illustrate another example of extracting and recognizing a hand shape
  • FIGS. 24 to 26 illustrate example terminals provided with a user interface device using hand gesture recognition according to the present invention.
  • 100: image capture means, 200: user command identification means
  • FIG. 1 is a block diagram of a user interface device that uses hand gesture recognition according to an embodiment of the present invention.
  • the user interface device according to the present invention includes image capture means 100, user command identification means 200, user command processing means 300, and display means 400.
  • the image capture means 100 is installed on an electronic device such as a portable terminal, a general computer, and a television.
  • the image capture means 100 captures, in real time, an image of a hand of a user which moves in the air and outputs the captured image to the user command identification means 200.
  • the image capture means 100 is preferably implemented using a CCD camera, although it may be implemented using any of a variety of cameras, such as a CMOS camera.
  • the user command identification means 200 extracts a hand shape of the user from the image input from the image capture means 100 in real time.
  • the identification means 200 extracts a predetermined feature point (for example, the tip of an index finger) from the extracted overall hand shape.
  • Based on the extracted hand shape, the identification means 200 recognizes a movement of the hand and a change in the hand shape. Based on this recognition, the identification means 200 generates a data input signal corresponding to mouse or keyboard input and outputs the data input signal to the user command processing means 300.
  • the user command processing means 300 (also referred to as a "processing means” for short) performs original functions of the electronic device and controls the display means 400 and other peripheral devices.
  • the display means 400 receives a control signal and a position signal from the processing means 300 and displays a position pointer, which is the same concept as a mouse cursor, at coordinates corresponding to the position signal.
  • when a click signal has been input from the processing means 300, the display means 400 indicates the click to the user, for example by reversing the state of the item pointed to by the position pointer.
  • the display means 400 displays a character represented by the character signal at coordinates represented by the position signal.
  • when a scroll signal has been input from the processing means 300, the display means 400 scrolls a displayed document, webpage, or the like vertically or horizontally according to the scroll signal.
  • FIG. 2 is a block diagram of the user command identification means in FIG. 1 for inputting mouse data according to the first embodiment of the present invention. The following is a description of an example in which the identification means 200 is implemented to input only mouse input data to the processing means 300.
  • a mouse function is implemented by associating the tip of an index finger of a person, which is part of the hand shape of the person, with the cursor (i.e., the position pointer), associating the state of a thumb of the person with a mouse click signal, and associating states of the index finger or the overall hand shape with on/off states of the mouse function.
  • the identification means 200 traces the position of the tip of the index finger and generates and outputs a position signal corresponding to the movement of the index finger.
  • the display means 400 displays the position pointer at a position corresponding to the position of the moving index finger.
  • the identification means 200 regards this action as pressing the left button of the mouse and generates and outputs a left click signal.
  • the identification means 200 regards this action as pressing the right button of the mouse and generates and outputs a right click signal.
  • the identification means 200 turns off the mouse function.
  • the identification means 200 turns on the mouse function.
  • the user repeats the following actions.
  • the user moves the tip of the index finger in a desired direction while keeping the index finger straightened, then temporarily bends the index finger to deactivate the cursor function, repositions the hand, and straightens the finger again to reactivate the cursor.
  • the user repeats the action of bending and straightening the index finger and then moves the cursor in the desired direction.
  • the user can move the position pointer to a distant location by repeating these actions.
  • the distance between the hand and the camera is a new variable that may affect the movement of the position pointer. If the hand moves at a short distance from the camera, the identification means 200 responds more sensitively to the movement, so the cursor moves a greater distance than when the hand moves at a long distance from the camera.
  • a mouse function is implemented by associating the tip of the index finger with the movement of the cursor as described above, associating an index finger action of pressing and releasing in the air with a left mouse button click, associating the state of the thumb with a right mouse button, and associating states of the middle, ring, and little fingers with on/off states of the mouse function.
  • the following is a more detailed description. A description of the cursor movement is omitted here since it is similar to that of the above embodiment.
  • the identification means 200 regards this action as pressing the left mouse button and generates and outputs a left click signal.
  • the identification means 200 regards this action as pressing the right mouse button and generates and outputs a right click signal. If the user straightens the middle, ring, and little fingers, then the identification means 200 turns off the mouse function (i.e., deactivates the cursor). Conversely, if the user bends the middle, ring, and little fingers,
  • the identification means 200 turns on the mouse function.
  • the mouse function implementation method according to the present invention is not limited to the above two embodiments and may be provided in various other forms.
  • a mouse function may be implemented by associating an action of temporarily bending the thumb and then straightening it again while keeping the palm facing the CCD camera with the left mouse button and associating an action of temporarily bending the middle, ring, and little fingers and then straightening them again while keeping the palm facing the CCD camera with the right mouse button.
  • a mouse function may also be implemented by associating the straightened state of all the five fingers with the on state of the mouse function and associating the bent state of all the five fingers with the off state of the mouse function.
  • the identification means 200 includes a hand shape extractor 210, a noise remover 220, a hand shape recognizer 230, a position signal generator 240, and a click signal generator 250.
  • the hand shape extractor 210 extracts a hand shape from an image input from the image capture means 100 and then outputs the hand shape to the noise remover 220.
  • the noise remover 220 removes a noise image from the hand shape image input from the hand shape extractor 210 and then outputs the hand shape image to the hand shape recognizer 230.
  • FIG. 15 illustrates such a method of extracting a hand shape.
  • One example is a method of extracting a hand shape based on the color difference between hand and background images, which will be referred to as a color difference based method.
  • the hand shape extractor 210 divides the hand image including a hand shape into sections at predetermined intervals in each of the x and y directions and then samples values of pixels at locations, where the divided sections cross each other, to form a matrix image.
  • the hand shape extractor 210 then performs the step of separating a part, over which a hand color according to the skin color such as white, yellow, or black is distributed, from the matrix image and extracting the hand part based on the image color; the step of obtaining a binary image from the extracted hand part; and the step of removing noise from the binary image and then labeling a region in the binary image, thereby determining the region of the hand.
  • the sampling is preferably implemented according to a coarse sampling method which samples pixels of the image while skipping every few pixels.
  • the step of extracting the hand part may be performed using a so-called 'RGB' method which converts the RGB color of a hand part where the hand is located into a binary image.
  • the step of extracting the hand part is performed using the following YUV method.
  • a color recognized as that of the hand part is converted into a YUV (YCbCr) color and the converted color is modeled as a color difference (CbCr) value.
  • a specific region around which the color occurrence frequency is high is identified based on this modeling.
  • This method is advantageous over the empirical method or the above-described RGB method in that it is possible to extract the desired flesh color without being affected by the Y value (i.e., the luminance). An illustrative code sketch of this color-difference extraction is given after this list of paragraphs.
  • the hand shape may also be extracted in a different manner from the above methods.
  • the hand shape extractor 210 may perform the step of extracting the hand part from an image input from the image capture means 100 according to a contour extraction method in the following manner.
  • the hand shape extractor 210 converts the RGB color of the hand part where the hand is located into a grey level and differentiates the converted grey level to extract a contour of the user's hand, and preferably extracts a predetermined feature point (for example, the tip of the thumb or index finger) from the extracted overall contour.
  • the hand shape recognizer 230 recognizes a change in the hand shape based on the hand shape extracted through the hand shape extractor 210 and outputs a corresponding control signal to the position signal generator 240 or the click signal generator 250.
  • FIGS. 12 to 14 illustrate one example of the recognition of the hand shape extracted according to the color difference based method described above.
  • FIGS. 16 and 17 illustrate another example of the recognition of the hand shape extracted according to the color difference based method and the contour extraction method. Preferred methods for recognizing the hand shape according to the present invention will now be described with reference to these figures. First, a hand shape recognition method according to an embodiment of the color difference based method is described with reference to FIGS. 12 and 13.
  • the hand shape recognizer 230 checks brightness levels of the extracted hand shape image while scanning it row by row from the top to the bottom. A region included in the hand exhibits a high brightness level and a region not included in the hand exhibits a low brightness level. Accordingly, the hand shape recognizer 230 recognizes the hand as being fully straightened if the length of a line of pixels exhibiting a high brightness level in a scanned row gradually increases as pixels are scanned row by row, and recognizes only one finger (i.e., the index finger) of the hand as being straightened if the length of the line of pixels sharply increases. More specifically, in the case of FIG. 12,
  • a line of pixels with high brightness representing the hand (hereinafter referred to as a pixel line) is detected first at a region 501.
  • the length of the pixel line is increased at a region 502 compared to that of the region 501 by about 100%.
  • the length of the pixel line is increased at a region 503 compared to that of the region 502 by about 50%.
  • the length of the pixel line is increased at a region 504 compared to that of the region 503 by about 25%.
  • the hand shape recognizer 230 recognizes the hand shape as being straightened if the length of the line of pixels with high brightness is gradually increased in this manner. In the case of FIG. 13,
  • the length of the pixel line at a region 511 is 0 and pixels with high brightness are scanned first at a region 512. Thereafter, the length of the pixel line is not increased by more than 10% at each of regions 513 to 515, and the length of the pixel line is sharply increased by about 200% or more at a region 516.
  • the hand shape recognizer 230 recognizes the hand shape as having only one straightened finger if the length of the line of pixels with high brightness is sharply increased in this manner. In the case of FIG. 14 where all the fingers are bent, the length of the pixel line is zero until a region 521 is reached and the length of the pixel line is sharply increased at a region 522.
  • the hand shape recognizer 230 recognizes the hand shape as being closed into a fist if the length of the line of pixels with high brightness is almost not changed after it is sharply increased from zero in this manner. A minimal row-scan sketch of this recognition appears after this list of paragraphs.
  • the hand shape recognizer 230 measures the area of a hand shape that has been converted into a matrix through the coarse sampling method described above and recognizes the state of the hand shape based on a change in the measured area. More specifically, the hand shape recognizer 230 assigns the value "1" to each pixel representing the hand in the matrix hand shape to create a graph with respect to the x and y axes and then determines that the coordinates assigned a "1" in this graph correspond to positions on the hand.
  • After determining these on-hand positions in this manner, the hand shape recognizer 230 performs more detailed sampling of the hand part where the hand is located and recreates a graph to determine the area of the hand. After measuring the area of the hand, the hand shape recognizer 230 defines a reference region for determining whether the middle, ring, and little fingers are straightened or bent, based on the measured hand area. The reference region corresponds to an upper portion in the determined on-hand positions as shown in FIG. 16. If the area of the hand measured within the reference region is close to a preset hand area obtained when the four fingers are straightened,
  • the hand shape recognizer 230 determines that the middle, ring, and little fingers are straightened. If the hand area measured within the reference region is less than the preset hand area, the hand shape recognizer 230 determines that the middle, ring, and little fingers are bent. When it is determined that the middle, ring, and little fingers are bent, it is necessary to measure the area of the index finger. Before this measurement, a reference region used to determine whether or not the thumb has been bent is determined in the same manner as the above reference region setting method.
  • the thumb is straightened if the measured area of the thumb is larger than the measured area of the index finger while it is determined that the thumb is bent if the measured area of the thumb is less than that of the index finger.
  • the hand shape recognizer 230 determines positions on the hand in the same manner as described above and separately stores left, right, tip, and wrist portions of the hand.
  • the hand shape recognizer 230 recognizes the state of the hand shape by comparing the brightness levels of the separately stored hand shape with those of a previously stored hand shape (template).
  • the hand shape recognizer 230 determines positions on the hand in the same manner as described above and separately stores left, right, tip, and wrist portions of the hand. Since the hand is represented by "1"s, the hand shape recognizer 230 counts the "1"s while scanning the upper portion of the positions on the hand, which corresponds to its finger region, row by row from the top to the bottom. Based on the count of "1"s, the hand shape recognizer 230 determines the state of the hand, i.e., whether the middle, ring, and little fingers are bent or straightened. In the same manner, the hand shape recognizer 230 determines whether the thumb is bent or straightened.
  • the position signal generator 240 determines coordinates of a predetermined feature point (preferably, the tip of the index finger) in the hand shape recognized at the hand shape recognizer 230 and traces a change in the predetermined feature point to generate and output a position signal corresponding to the movement of the index finger to the user command processing means 300.
  • the click signal generator 250 generates a mouse click signal corresponding to a change in a predetermined feature point (preferably, corresponding to an action of sequentially bending and straightening the thumb) in the hand shape recognized at the hand shape recognizer 230 and outputs the mouse click signal to the user command processing means 300.
  • FIG. 3 is a flow chart illustrating a method for inputting mouse data according to the first embodiment of the present invention, which is performed at the user interface device described above with reference to FIG. 2.
  • the image capture means 100 captures an image of a hand of the user in real time and outputs the generated hand image to the identification means 200 (step S11).
  • the hand shape extractor 210 extracts a hand shape from the image input from the image capture means 100 and outputs the extracted hand shape to the hand shape recognizer 230 through the noise remover 220 (step S13).
  • the hand shape recognizer 230 determines whether or not an index finger is straightened in the received hand shape. If it is determined that the index finger is bent, the hand shape recognizer 230 waits until it receives an image with the index finger straightened.
  • the hand shape recognizer 230 proceeds to step S17 to output coordinates of the tip of the index finger to the position signal generator 240.
  • the position signal generator 240 generates a position signal using the coordinates of the tip of the index finger received from the hand shape recognizer 230 and outputs the position signal to the processing means 300.
  • the processing means 300 controls, at step S19, the display means 400 to move the position pointer according to the position signal.
  • the hand shape recognizer 230 constantly checks whether or not a thumb has been sequentially bent and straightened in the received hand shape. If the thumb has been sequentially bent and straightened, the hand shape recognizer 230 outputs a control signal instructing generation of a click signal to the click signal generator 250. When receiving the control signal from the hand shape recognizer 230, the click signal generator 250 generates and outputs a click signal to the processing means 300 at step S23.
  • the display means 400 displays the clicked region according to the control signal from the processing means 300 at step S25.
  • a button selection function can be implemented using the above method.
  • functions such as a button selection function, a drag function, an "open file" function, a drawing function, and a typing function using a keyboard created on the screen can be implemented.
  • the button selection function is implemented by moving the cursor to point to a desired icon in a command button icon region on the screen and then bending the thumb to select the icon.
  • the drag function is implemented by moving the cursor to point to a desired icon in a command button icon region on the screen, bending the thumb to select the icon, moving the hand to move the command button icon to a desired position, and then straightening the thumb.
  • the "open file” function is implemented by sequentially bending and straightening the thumb twice after moving the cursor to a desired position.
  • the drawing function is implemented by associating the moment when the straightened thumb is bent with the start point of a line to be drawn and associating the moment when the thumb is straightened again with the end point of the drawn line.
  • the typing function can be implemented by creating a letter button corresponding to a keyboard on the computer screen and then clicking the letter button.
  • FIG. 4 is a block diagram of the user command identification means in FIG. 1 for inputting character data according to the second embodiment of the present invention.
  • the second embodiment of the present invention is exemplified by the identification means 200 implemented to input only character input data to the processing means 300.
  • the character input method is similar to the function to write characters with a mouse in a graphic tool program or the like.
  • the user moves the index finger while keeping the thumb bent to input a stroke of a character.
  • the user moves the position of the index finger after straightening the thumb and then inputs the stroke by moving the index finger after bending the thumb again.
  • the identification means 200 for inputting character data includes the hand shape extractor 210, the noise remover 220, and the hand shape recognizer 230 described above in the first embodiment and further includes a character signal generator 260.
  • the character signal generator 260 includes a stroke generation module 261, a stroke combination module 262, a pattern comparison module 263, and a pattern DB 264.
  • the stroke generation module 261 determines coordinates of a predetermined feature point (preferably, the tip of the index finger) in the hand shape recognized at the hand shape recognizer 230 and traces a change in the predetermined feature point to generate a stroke signal corresponding to the movement of the index finger.
  • the stroke combination module 262 combines stroke signals generated at the stroke generation module 261 to generate a character pattern.
  • the pattern DB 264 stores patterns of various characters.
  • the pattern comparison module 263 compares the character pattern output from the stroke combination module 262 with the patterns stored in a pattern DB 264. If the comparison result is that a similar character pattern is present in the pattern DB 264, the pattern comparison module 263 generates and outputs a corresponding character signal to the processing means 300.
  • FIG. 6 is a flow chart illustrating a method for inputting character data according to the second embodiment of the present invention.
  • the image capture means 100 captures an image of the hand of the user in real time and outputs the generated hand image to the identification means 200 (step S31).
  • the hand shape extractor 210 extracts a hand shape from the image received from the image capture means 100 and outputs the extracted hand shape to the hand shape recognizer 230 through the noise remover 220 (step S33).
  • a detailed description of the processes for extracting and recognizing the hand shape is omitted here since the processes are similar to those of the first embodiment described above.
  • the hand shape recognizer 230 checks whether or not the index finger is straightened in the same manner as described above in the first embodiment. If the index finger is not straightened, the hand shape recognizer 230 waits until the user straightens the index finger to activate a character input mode. On the other hand, if the index finger is straightened, the hand shape recognizer 230 checks, at step S37, whether or not the thumb is bent. A detailed description of the method for determining the shapes of the thumb and index fingers is omitted here since it is similar to that of the first embodiment.
  • the hand shape recognizer 230 outputs the coordinates of the tip of the index finger while the thumb is bent to the stroke generation module 261.
  • the stroke generation module 261 generates a stroke signal using the received coordinates of the tip of the index finger.
  • the stroke combination module 262 combines generated stroke signals to generate a character pattern. That is, since bending of the thumb represents the start point of a stroke and straightening of the thumb represents the end point of the stroke, the stroke generation module 261 generates a stroke signal by connecting the coordinates of the tip of the index finger from the start point to the end point of the stroke.
  • the pattern comparison module 263 compares the character pattern output from the stroke combination module 262 with the patterns stored in the pattern DB 264. If the comparison result is that a similar character pattern is present in the pattern DB 264, the pattern comparison module 263 generates and outputs a corresponding character signal to the processing means 300. Then, at step S45, the processing means 300 receives the character signal and outputs a predetermined control signal, which instructs output of the corresponding character, to the display means 400 to output the character input by the user. A minimal sketch of this stroke-and-pattern flow appears after this list of paragraphs.
  • the second embodiment of the present invention identifies the start and end points of a stroke according to the positions of the thumb.
  • patterns of the movement paths of the tip of the index finger when the user writes characters in the air are produced regardless of the movement of the thumb, and the produced patterns are stored in a pattern DB. A character signal is then generated by extracting a movement path of the tip of the index finger from a hand image of the user and comparing that movement path with the patterns stored in the pattern DB.
  • FIG. 5 is a block diagram of the user command identification means in FIG. 1 for inputting character data according to the modification of the second embodiment described above.
  • the user command identification means 200 includes a hand shape extractor 210, a noise remover 220, a hand shape recognizer 230, and a character signal generator 260.
  • the character signal generator 260 includes a pattern generation module 265, a pattern comparison module 266, and a pattern DB 267, which are different from the components of the character signal generator of the second embodiment.
  • a description of the functions of the hand shape extractor 210 and the noise remover 220 is omitted here since the functions are the same as described above.
  • FIG. 7 is a flow chart illustrating a method for inputting character data according to the modification of the second embodiment.
  • the hand shape recognizer 230 recognizes a hand shape using received feature points and checks whether or not the index finger is straightened. If the index finger is bent, the hand shape recognizer 230 waits until it receives a hand image with the index finger straightened. If the index finger is straightened, the hand shape recognizer 230 outputs the coordinates of the tip of the index finger to the pattern generation module 265.
  • the pattern generation module 265 generates a pattern representing a movement path of the tip of the index finger using the coordinates received from the hand shape recognizer 230 and outputs the generated pattern to the pattern comparison module 266.
  • the pattern generation module 265 can generate the pattern using various methods. The simplest is that, when the user inputs characters in the air one at a time with pauses between them, the pattern generation module 265 determines that the user has completed inputting one character, and thus generates a pattern, if the coordinates of the tip of the index finger stay within a specific range for a predetermined time.
  • In another method, sections for inputting characters are displayed on the screen for the user and the position of the index finger of the user is displayed in real time on the screen.
  • the pattern generation module 265 determines that the user has completed inputting one character and generates a pattern if the coordinates of the tip of the index finger, which have been located in one section for inputting the character, move to another section for writing the next character.
  • the pattern comparison module 266 compares a movement path pattern received from the pattern generation module 265 with character patterns previously stored in the pattern DB 267 and extracts character data corresponding to the matching pattern to generate a character signal and then outputs the generated character signal to the processing means 300.
  • the pattern DB 267 functions to store patterns corresponding to characters and numbers.
  • FIG. 18 illustrates example patterns of the characters and numbers stored in the pattern DB 267.
  • each of the characters and numbers and various types of character and number patterns ( "connected character” in FIG. 18) representing the corresponding character or number are stored in association with each other in the pattern DB 267.
  • It can be seen from FIG. 18 that the strokes of each of the character and number patterns of the present invention are connected to each other. The strokes of a character or number can be separated from each other when the character or number is written on paper with a hand. However, since the character and number patterns of the present invention are drawn in the air, the strokes of each of the character and number patterns cannot be separated, so all the strokes are connected to each other.
  • the pattern comparison module 266 receives a pattern generated at the pattern generation module 265 and queries the pattern DB 267 for a pattern matching the received pattern. The pattern comparison module 266 then outputs, as a character signal, the code value of a character or number corresponding to the pattern closest to the received pattern to the processing means 300. At step S61, the processing means 300 receives the character signal and outputs a predetermined control signal, which instructs output of the corresponding character, to the display means 400 to output the character input by the user.
  • FIG. 8 is a detailed block diagram illustrating the identification means in FIG. 1 for inputting a screen scroll signal according to the third preferred embodiment of the present invention
  • FIG. 9 is a flow chart illustrating a method for inputting a screen scroll signal according to the third preferred embodiment of the present invention.
  • the third preferred embodiment of the present invention reads a hand shape and inputs a scroll signal to the processing means 300.
  • the third embodiment of the present invention will now be described with reference to FIGS. 8 and 9.
  • the identification means 200 of the third embodiment includes a hand shape extractor 210, a noise remover 220, a hand shape recognizer 230, and a scroll signal generator 270.
  • the hand shape extractor 210 extracts, at step S103, a hand shape in the same manner as in the first and second embodiments and outputs the extracted hand shape to the hand shape recognizer 230 through the noise remover 220.
  • the hand shape recognizer 230 recognizes the hand shape using information input through the hand shape extractor 210.
  • the hand shape recognizer 230 recognizes the direction of the thumb as a scroll direction and outputs the thumb direction information to the scroll signal generator 270.
  • the finger used to determine the scroll direction is not necessarily limited to the thumb.
  • the scroll signal generator 270 checks whether the thumb direction is upward, downward, leftward, or rightward as shown in FIGS. 19 to 23 and generates and outputs a scroll signal corresponding to the direction to the processing means 300 (a minimal sketch of this mapping appears after this list of paragraphs).
  • the processing means 300 performs scrolling by outputting a predetermined control signal, which instructs scrolling of a document, a webpage, or the like according to the scroll signal, to the display means 400.
  • the user command identification means 200 has been described as performing only one of the functions of the three embodiments. However, the identification means 200 is preferably implemented to perform all the functions described above.
  • FIG. 10 is a block diagram of the user command identification means in FIG. 1 for integrating and implementing mouse and character input modes according to the fourth embodiment of the present invention
  • FIG. 11 is a flow chart illustrating a method for integrating and implementing mouse and character input modes according to the fourth embodiment.
  • the identification means 200 of the fourth embodiment includes a hand shape extractor 210, a noise remover 220, a hand shape recognizer 230, a position signal generator 240, a click signal generator 250, a scroll signal generator 270, a character signal generator 260, and a mode setter 280.
  • the functions of the hand shape extractor 210, the noise remover 220, the scroll signal generator 270, the click signal generator 250, the position signal generator 240, and the character signal generator 260 are the same as those of the first to third embodiments described above.
  • the character signal generator 260 includes the modules implemented in the second or third embodiment. Accordingly, only the differences with the first to third embodiments will be described in the description of the fourth embodiment.
  • the mode setter 280 controls the operating mode of the hand shape recognizer 230 to be set to one of the character mode, the position movement mode, the click mode, and the scroll mode based on the state of the hand shape output from the noise remover 220 or based on a mode setting key signal input by operating a predetermined key/button.
  • This mode setting function may also be performed at the hand shape recognizer 230. That is, the mode setter 280 (or the hand shape recognizer 230) controls the scroll signal generator 270 when a hand shape as shown in FIGS. 19 to 23 has been input through the hand shape extractor 210 or when a scroll setting key signal has been input through a key/button.
  • the setting mode may also be switched through any specific hand shape, although it will be preferable that the setting mode be switched through a predetermined key/button when the keyboard and mouse functions are performed through similar hand shapes as described above in the first and second embodiments.
  • the data input mode may likewise be switched from the keyboard mode to the other input modes.
  • the image capture means 100 captures an image of a hand of the user in real time and outputs the generated hand image to the identification means 200 (step S111).
  • the hand shape extractor 210 extracts a hand shape from the image input from the image capture means 100 and outputs the extracted hand shape to the hand shape recognizer 230 through the noise remover 220 (step S113).
  • the hand shape recognizer 230 determines which data input mode has been set.
  • the hand shape recognizer 230 performs steps S117 and S119 if it is determined that the set input mode is the character mode, performs steps S125 and S127 if it is the position movement mode, performs steps S131 and S133 if it is the click mode, and performs steps S137 and S139 if it is the scroll mode.
  • the position movement mode and the click mode represent the function of the first embodiment
  • the character mode represents the function of the second embodiment or its modification
  • the scroll mode represents the function of the third embodiment.
  • the hand shape recognizer 230 performs one of the functions of the hand shape recognizer of the first to third embodiments according to the input hand shape information and mode control signal and outputs corresponding information to one of the scroll signal generator 270, the click signal generator 250, the position signal generator 240, and the character signal generator 260 to generate and output a scroll signal (steps S137 and S139), to generate and output a position signal or a click signal (steps S125 and S127 or steps S131 and S133), or to generate and output a character signal (steps S117 and S119).
  • FIGS. 24 to 26 illustrate examples of implementation of the user interface device according to the preferred embodiments of the present invention.
  • FIG. 24 illustrates an example in which the user interface device of the present invention is implemented for a PDA
  • FIG. 25 illustrates an example in which the user interface device of the present invention is applied in place of a television remote controller
  • FIG. 26 illustrates an example in which the user interface device of the present invention is applied to a desktop computer.
  • Devices for which the user interface device of the present invention is implemented are not limited to the electronic devices described with reference to FIGS. 24 to 26.
  • the user interface device of the present invention may be applied to a mobile communication terminal, an electronic blackboard, and the like and may also be applied to an automatic teller machine (ATM) and a Kiosk device to replace their touch screen functions.
  • the mobile communication terminal is a wireless communication device with portability and mobility and may be not only the Personal Digital Assistant (PDA) described above but also any type of handheld wireless communication device such as a Personal Communication System (PCS), a Personal Digital Cellular (PDC), a Personal Handyphone System (PHS), an International Mobile Telecommunication (IMT)-2000 communication terminal, a Digital Multimedia Broadcasting (DMB) phone, a smart phone, and a WiBro phone.
  • the user interface method of the present invention can be embodied as computer readable code on a computer readable recording medium.
  • the computer readable recording medium includes any data storage device that stores data which can be read by a computer system. Examples of the computer readable recording medium include a read-only memory (ROM), a random-access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, a flash memory, and an optical data storage device.
  • the computer readable recording medium can also be embodied in the form of carrier waves (for example, signals transmitted over the Internet).
  • the computer readable recording medium can also be distributed over a network of coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
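The following sketches, written in Python, restate in executable form several of the techniques described in the paragraphs above; they are illustrative only, not the patented implementation. This first one corresponds to the color difference based extraction: coarse sampling, conversion to the YUV (YCbCr) space, thresholding on the CbCr color difference, noise removal, and labeling of the hand region. The Cb/Cr bounds, the sampling stride, and the use of OpenCV and NumPy are assumptions made for the example.

```python
import cv2
import numpy as np

# Illustrative skin-color bounds in the CbCr plane; the description models these
# from observed color-occurrence frequency rather than fixing numeric values.
CR_RANGE = (133, 173)   # assumed Cr bounds
CB_RANGE = (77, 127)    # assumed Cb bounds
STRIDE = 4              # coarse sampling: keep every 4th pixel in x and y

def extract_hand_mask(frame_bgr: np.ndarray) -> np.ndarray:
    """Return a binary mask of the largest skin-colored region (the hand part)."""
    # Coarse sampling: divide the image at fixed intervals and keep one pixel
    # per crossing to form a small matrix image.
    small = frame_bgr[::STRIDE, ::STRIDE]

    # Convert to YCrCb and threshold only the chrominance (Cb, Cr) channels,
    # so the result is largely unaffected by the luminance (Y) value.
    ycrcb = cv2.cvtColor(small, cv2.COLOR_BGR2YCrCb)
    _, cr, cb = cv2.split(ycrcb)
    mask = ((cr >= CR_RANGE[0]) & (cr <= CR_RANGE[1]) &
            (cb >= CB_RANGE[0]) & (cb <= CB_RANGE[1])).astype(np.uint8)

    # Remove noise from the binary image, then label connected regions and
    # keep the largest one as the region of the hand.
    mask = cv2.medianBlur(mask, 3)
    count, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if count <= 1:
        return np.zeros_like(mask)
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    return (labels == largest).astype(np.uint8)
```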
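Next, a minimal sketch of the row-by-row scan used by the hand shape recognizer 230 to distinguish a fully straightened hand, a single straightened index finger, and a fist. The growth-rate thresholds are loose readings of the approximate percentages quoted above (gradual growth of about 100%, 50%, 25% for an open hand, a jump of about 200% for a single finger, an almost unchanged width for a fist); the function name and return labels are assumptions.

```python
import numpy as np

def classify_hand_state(mask: np.ndarray) -> str:
    """Classify a binary hand mask by scanning rows from top to bottom and
    watching how the length of the bright (hand) pixel line grows."""
    lengths = [int(np.count_nonzero(row)) for row in mask]
    lengths = [n for n in lengths if n > 0]          # rows that contain the hand
    if not lengths:
        return "no hand"

    # Relative growth of the pixel-line length between consecutive hand rows.
    growth = [(b - a) / a for a, b in zip(lengths, lengths[1:])]

    # Fist: the line jumps from zero to nearly full width and then barely changes.
    if all(abs(g) < 0.10 for g in growth):
        return "fist (mouse function off)"
    # Single straightened finger: narrow for many rows, then a sharp jump (~200%).
    if any(g >= 2.0 for g in growth):
        return "index finger straightened"
    # Open hand: the line grows gradually (e.g. ~100%, ~50%, ~25%) row by row.
    return "hand fully straightened"
```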
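The stroke-and-pattern flow of the character signal generator 260 can be sketched as follows: fingertip coordinates are collected while the thumb is bent (one stroke per bend), the strokes are combined into a single connected path, and the path is matched against stored character patterns. The normalization, resampling, and mean-distance matching are assumptions for the example; the description leaves the comparison method open.

```python
import numpy as np

def _normalize(path: np.ndarray) -> np.ndarray:
    """Scale the traced path into a unit box so position and size do not matter."""
    path = path - path.min(axis=0)
    span = path.max(axis=0)
    span[span == 0] = 1.0
    return path / span

def _resample(path: np.ndarray, n: int) -> np.ndarray:
    """Resample the path to n points by linear interpolation along its index."""
    idx = np.linspace(0, len(path) - 1, n)
    x = np.interp(idx, np.arange(len(path)), path[:, 0])
    y = np.interp(idx, np.arange(len(path)), path[:, 1])
    return np.stack([x, y], axis=1)

class CharacterSignalGenerator:
    """Sketch of the stroke generation / combination / pattern comparison flow."""

    def __init__(self, pattern_db):
        self.pattern_db = pattern_db     # character -> normalized (N, 2) template path
        self.current_stroke = []         # fingertip points of the stroke being drawn
        self.strokes = []                # completed strokes

    def update(self, fingertip_xy, thumb_bent):
        """Feed one frame: a bent thumb marks the start or continuation of a stroke."""
        if thumb_bent:
            self.current_stroke.append(fingertip_xy)
        elif self.current_stroke:        # thumb straightened: the stroke ends
            self.strokes.append(self.current_stroke)
            self.current_stroke = []

    def recognize(self):
        """Combine all strokes into one pattern and return the closest stored character."""
        if not self.strokes:
            return None
        path = np.array([p for stroke in self.strokes for p in stroke], dtype=float)
        path = _normalize(path)
        best, best_dist = None, float("inf")
        for char, template in self.pattern_db.items():
            d = np.mean(np.linalg.norm(_resample(path, len(template)) - template, axis=1))
            if d < best_dist:
                best, best_dist = char, float(d)
        self.strokes = []
        return best
```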
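Finally for the list above, a minimal sketch of how the scroll signal generator 270 could map the recognized thumb direction to a scroll direction, assuming the palm center and thumb tip coordinates have already been recovered by the hand shape recognizer; the coordinate convention (x grows rightward, y grows downward) is an assumption. For example, a thumb tip well above the palm center would be reported as an upward scroll.

```python
def scroll_direction(palm_xy, thumb_tip_xy):
    """Map the thumb direction relative to the palm center to a scroll signal.
    Image coordinates are assumed: x grows to the right, y grows downward."""
    dx = thumb_tip_xy[0] - palm_xy[0]
    dy = thumb_tip_xy[1] - palm_xy[1]
    if abs(dx) >= abs(dy):
        return "scroll right" if dx > 0 else "scroll left"
    return "scroll down" if dy > 0 else "scroll up"
```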

Abstract

The present invention relates to a user interface method and device using hand gesture recognition, and more particularly to a user interface method and device using hand gesture recognition, which recognizes the movement and shape of a hand of a user to provide a user interface to an electronic device such as a computer. The user interface device includes image capture means for capturing a hand of the user moving in the air and user command identification means for generating a data input signal corresponding to mouse or keyboard input based on the movement of the hand captured by the image capture means and outputting the generated data input signal to user command processing means that processes mouse or key input.

Description

[DESCRIPTION]
[Invention Title]
METHOD AND APPARATUS FOR USER-INTERFACE USING THE HAND TRACE
[Technical Field]
<1> The present invention relates to a user interface method and device using hand gesture recognition, and more particularly to a user interface method and device using hand gesture recognition which recognizes the movement and shape of a hand of a user to provide a user interface to an electronic device such as a computer.
[Background Art]
<2> Along with the development of electronics, a computer has become a necessity of life which carries out important functions indispensable in the office and at home. A mouse and a keyboard are generally used as devices to input data to the computer.
<3> Due to the increase in semiconductor integration density and data processing speed, conventional desktop computer functions can now be embodied in small portable terminals such as PDAs, PSPs, and PMPs, and such portable terminals are increasingly used. However, there are limitations in decreasing the size of a mouse or keyboard, the input devices used for the conventional desktop computer, since they are composed of physical components. Even if a small mouse or keyboard is provided, always carrying it with a portable terminal causes great inconvenience in use.
<4> The conventional input device such as the mouse also has spatial limitations in that it must be used on a flat surface and cannot correctly input data when the surface of the desk is covered with glass or carries a large amount of dust or other substances.
[Disclosure]
[Technical Problem]
<5> Therefore, the present invention has been made in view of the above problems, and it is an object of the present invention to provide a user interface device and method using hand gesture recognition which allows users to input desired data only with their hand action and movement without using any separate input device.
[Technical Solution]
<6> In accordance with an embodiment of the present invention, the above object can be accomplished by the provision of a user interface device using hand gesture recognition, the user interface device comprising image capture means for capturing a hand of a user, the hand moving in the air; and user command identification means for generating a data input signal corresponding to mouse or keyboard input based on the movement of the hand captured by the image capture means and outputting the generated data input signal to user command processing means that processes mouse or key input.
<7> Preferably, the user command identification means includes a hand shape extractor for extracting a hand shape from an image received from the image capture means; and a hand shape recognizer for recognizing a change and movement of the hand shape extracted from the hand shape extractor.
<8> Preferably, the user command identification means further includes a noise remover for removing noise from a hand shape image extracted at the hand shape extractor and then outputting the hand shape image to the hand shape recognizer.
<9> Preferably, the hand shape extractor extracts a hand shape from the image received from the image capture means using a color difference between the hand and a background.
<10> Preferably, the hand shape extractor extracts a hand contour from the image received from the image capture means.
<11> Preferably, the hand shape recognizer turns on or off a mouse function according to an action of sequentially bending and straightening an index finger.
<12> Preferably, the hand shape recognizer turns off a mouse function when an index finger is bent and turns on the mouse function when the index finger is straightened.
<13> Preferably, the hand shape recognizer turns on or off a mouse function according to an action of sequentially bending and straightening middle, ring, and little fingers.
<14> Preferably, the hand shape recognizer turns on a mouse function when middle, ring, and little fingers are bent and turns off the mouse function when the middle, ring, and little fingers are straightened.
<15> Preferably, the hand shape recognizer turns off a mouse function when five fingers are all bent.
<16> Preferably, the hand shape recognizer measures an area of the hand shape extracted at the hand shape extractor and recognizes a state of the hand shape based on a change in the measured area.
<17> Preferably, the user command identification means includes at least one of a position signal generator for tracing a movement path of the hand shape recognized at the hand shape recognizer and generating and outputting a mouse position signal corresponding to the traced movement path to the user command processing means; a click signal generator for generating a mouse click signal corresponding to a change in the hand shape recognized at the hand shape recognizer and outputting the mouse click signal to the user command processing means; a character signal generator for tracing a movement path of the hand shape recognized at the hand shape recognizer and generating and outputting a character signal corresponding to the traced movement path to the user command processing means; and a scroll signal generator for generating a mouse scroll signal corresponding to the hand shape recognized at the hand shape recognizer and outputting the mouse scroll signal to the user command processing means.
<18> Preferably, the position signal generator traces a movement path of an index finger.
<19> Preferably, the click signal generator generates one of a left click signal corresponding to a left mouse button and a right click signal corresponding to a right mouse button according to an action of sequentially bending and straightening a thumb with a palm facing the image capture means.
<20> Preferably, the click signal generator generates one of a left click signal corresponding to a left mouse button and a right click signal corresponding to a right mouse button according to an action of sequentially bending and straightening a thumb with a back of the hand facing the image capture means.
<21> Preferably, the click signal generator generates one of a left click signal corresponding to a left mouse button and a right click signal corresponding to a right mouse button according to an action of sequentially bending and straightening an index finger.
<22> Preferably, the click signal generator generates one of a left click signal corresponding to a left mouse button and a right click signal corresponding to a right mouse button according to an action of sequentially bending and straightening middle, ring, and little fingers.
<23> Preferably, the character signal generator traces a movement path of an index finger.
<24> Preferably, the scroll signal generator generates a scroll signal according to a direction of a predetermined finger in a state of the hand shape recognized at the hand shape recognizer.
<25> Preferably, the user command identification means further includes a mode setter for controlling one of the position signal, the character signal, the click signal, and the scroll signal to be output to the user command processing means, based on a change in the hand shape extracted at the hand shape extractor or based on a mode setting key signal input by operating a predetermined key button.
<26> Preferably, the character signal generator includes a stroke generation module for tracing a movement path of a predetermined feature point in the hand shape recognized at the hand shape recognizer and generating a stroke corresponding to the traced movement path; a stroke combination module for combining strokes generated at the stroke generation module to generate a character pattern; a pattern DB in which patterns of various characters are stored; and a pattern comparison module for generating a character signal corresponding to the character pattern generated at the stroke combination module when the character pattern generated at the stroke combination module is present in the pattern DB.
<27> Preferably, the character signal generator includes a pattern generation module for tracing a movement path of a predetermined feature point in the hand shape recognized at the hand shape recognizer and generating a movement path pattern corresponding to the traced movement path; a pattern DB in which various character patterns are stored; and a pattern comparison module for generating a character signal corresponding to a character pattern, matching the movement path pattern, included in the character patterns stored in the pattern DB.
<28> In accordance with an embodiment of the present invention, there is provided a user interface method using hand gesture recognition, the method comprising the steps of a) extracting a hand shape from an image captured by image capture means; b) recognizing a change and movement of the hand shape extracted at the step a), tracing a movement path of the recognized hand, generating a mouse position signal corresponding to the traced movement path, and generating a mouse click signal corresponding to the change in the extracted hand shape; and c) outputting the click signal and the position signal generated at the step b) to user command processing means that processes mouse input.
<29> Preferably, the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
<30> Preferably, the step a) includes extracting a hand contour from the image received from the image capture means.
<31> Preferably, the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
<32> In accordance with another embodiment of the present invention, there is provided a user interface method using hand gesture recognition, the method comprising the steps of a) extracting a hand shape from an image captured by image capture means; b) recognizing a change and movement of the hand shape extracted at the step a), tracing a movement path of the recognized hand, and generating a stroke corresponding to the traced movement path; c) combining strokes generated at the step b) to generate a character pattern and generating a character signal corresponding to the generated character pattern; and d) outputting the character signal generated at the step c) to user command processing means that processes keyboard input.
<33> Preferably, the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
<34> Preferably, the step a) includes extracting a hand contour from the image received from the image capture means.
<35> Preferably, the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
<36> In accordance with another embodiment of the present invention, there is provided a user interface method using hand gesture recognition, the method comprising the steps of a) extracting a hand shape from an image captured by image capture means; b) recognizing a change and movement of the hand shape extracted at the step a), tracing a movement path of the recognized hand, generating a movement path pattern corresponding to the traced movement path, and comparing the generated movement path pattern with previously stored character patterns; c) generating a character signal corresponding to a character pattern, matching the movement path pattern, included in the previously stored character patterns; and d) outputting the character signal generated at the step c) to user command processing means that processes keyboard input.
<37> Preferably, the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
<38> Preferably, the step a) includes extracting a hand contour from the image received from the image capture means.
<39> Preferably, the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
<40> In accordance with another embodiment of the present invention, there is provided a user interface method using hand gesture recognition, the method comprising the steps of a) extracting a hand shape from an image captured by image capture means; b) recognizing a change and movement of the hand shape extracted at the step a) and generating a scroll signal, instructing scrolling to a predetermined direction, based on the recognized hand shape if the recognized hand shape is a predetermined specific shape; and c) outputting the scroll signal generated at the step b) to user command processing means that processes mouse input.
<41> Preferably, the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
<42> Preferably, the step a) includes extracting a hand contour from the image received from the image capture means.
<43> Preferably, the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
<44> In accordance with another embodiment of the present invention, there is provided a user interface method using hand gesture recognition, the method comprising the steps of a) extracting a hand shape from an image captured by image capture means; b) recognizing a change and movement of the hand shape extracted at the step a); c) setting a data input mode based on a change in the hand shape extracted at the step b) or based on a mode setting key signal input by operating a predetermined key button; d) generating one of a mouse position signal, a mouse click signal, a mouse scroll signal, and a keyboard character signal corresponding to the change and movement of the hand shape recognized at the step b), based on the data input mode set at the step c); and e) outputting the signal generated at the step d) to user command processing means that processes mouse or keyboard input.
<45> Preferably, the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
<46> Preferably, the step a) includes extracting a hand contour from the image received from the image capture means.
<47> Preferably, the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
[Advantageous Effects]
<48> As described above, the present invention captures the shape and movement of the hand of the user using the image capture means such as a CCD camera installed on an electronic device and extracts the conventional mouse input data, character input data, and scroll input signal from the captured hand shape and movement, thereby making it easy to input data without a separate data input device.
<49> The present invention may be applied to a television whose functions are controlled using the conventional remote controller. This allows users to easily perform, only with their hand gestures, the functions of the television such as channel switching and volume control without managing the separate remote controller. The present invention may also be applied to other electronic devices with various functions such as electronic blackboards.
[Description of Drawings]
<50> FIG. 1 is a block diagram of a user interface device that uses hand gesture recognition according to an embodiment of the present invention;
<51> FIG. 2 is a block diagram of user command identification means in FIG. 1 for inputting mouse data according to a first embodiment of the present invention;
<52> FIG. 3 is a flow chart illustrating a method for inputting mouse data according to the first embodiment of the present invention;
<53> FIGS. 4 and 5 are block diagrams of user command identification means in FIG. 1 for inputting character data according to a second embodiment of the present invention and according to a modification of the second embodiment, respectively;
<54> FIGS. 6 and 7 are flow charts illustrating a method for inputting character data according to the second embodiment of the present invention and according to the modification of the second embodiment, respectively;
<55> FIG. 8 is a block diagram illustrating user command identification means in FIG. 1 for inputting screen scroll data according to a third embodiment of the present invention;
<56> FIG. 9 is a flow chart illustrating a method for inputting screen scroll data according to the third embodiment of the present invention;
<57> FIG. 10 is a block diagram of the user command identification means in FIG. 1 for integrating and implementing mouse and character input modes according to a fourth embodiment of the present invention;
<58> FIG. 11 is a flow chart illustrating a method for integrating and implementing mouse and character input modes according to the fourth embodiment of the present invention;
<59> FIGS. 12 to 14 illustrate one example of extracting and recognizing a hand shape;
<60> FIGS. 15 to 17 illustrate another example of extracting and recognizing a hand shape;
<61> FIG. 18 illustrates an example of various character patterns stored in a pattern database;
<62> FIGS. 19 to 23 illustrate another example of extracting and recognizing a hand shape; and
<63> FIGS. 24 to 26 illustrate example terminals provided with a user interface device using hand gesture recognition according to the present invention.
<64> -Description of Reference Numerals of Main Elements in the Drawings-
<65> 100: image capture means 200: user command identification means
<66> 210: hand shape extractor 220: noise remover
<67> 230: hand shape recognizer 240: position signal generator
<68> 250: click signal generator 260: character signal generator
<69> 261: stroke generation module 262: stroke combination module
<70> 263: pattern comparison module 264, 267: pattern DB
<71> 265: pattern generation module 266: pattern comparison module
<72> 270: scroll signal generator 280: mode setter
[Mode for Invention]
<73> A user interface device and method using hand gesture recognition according to preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings. The first preferred embodiment of the present invention describes a user interface device and method for implementing a mouse, the second embodiment describes a user interface device and method for implementing a keyboard, the third embodiment describes a user interface device and method for implementing a screen scroll function in the mouse function, and the fourth embodiment describes a user interface device and method for integrating and implementing the mouse and keyboard functions of the first to third embodiments.
<74> FIG. 1 is a block diagram of a user interface device that uses hand gesture recognition according to an embodiment of the present invention.
<75> As shown in FIG. 1, the user interface device according to the present invention includes image capture means 100, user command identification means 200, user command processing means 300, and display means 400.
<76> The image capture means 100 is installed on an electronic device such as a portable terminal, a general computer, or a television. The image capture means 100 captures, in real time, an image of a hand of a user which moves in the air and outputs the captured image to the user command identification means 200. The image capture means 100 is preferably implemented using a CCD camera, although it may be implemented using any of a variety of cameras such as a CMOS camera.
<77> The user command identification means 200 (also referred to as an "identification means" for short) extracts a hand shape of the user from the image input from the image capture means 100 in real time. Preferably, the identification means 200 extracts a predetermined feature point (for example, the tip of an index finger) from the extracted overall hand shape. Based on the extracted hand shape, the identification means 200 recognizes a movement of the hand and a change in the hand shape. Based on this recognition, the identification means 200 generates a data input signal corresponding to mouse or keyboard input and outputs the data input signal to the user command processing means 300.
<78> Using the data input signal received from the identification means 200, the user command processing means 300 (also referred to as a "processing means" for short) performs original functions of the electronic device and controls the display means 400 and other peripheral devices.
<79> The display means 400 receives a control signal and a position signal from the processing means 300 and displays a position pointer, which is the same concept as a mouse cursor, at coordinates corresponding to the position signal. When a click signal has been input from the processing means 300, the display means 400 displays the input of the click signal to the user using a method such as reversing the state of an item pointed to by the position pointer. In addition, when a character signal has been input from the processing means 300, the display means 400 displays a character represented by the character signal at coordinates represented by the position signal. When a scroll signal has been input from the processing means 300, the display means 400 scrolls a displayed document, webpage, or the like vertically or horizontally according to the scroll signal.
<80> FIG. 2 is a block diagram of the user command identification means in FIG. 1 for inputting mouse data according to the first embodiment of the present invention. The following is a description of an example in which the identification means 200 is implemented to input only mouse input data to the processing means 300.
<81> In a method of implementing a mouse data input function according to an embodiment of the present invention, a mouse function is implemented by associating the tip of an index finger of a person, which is part of the hand shape of the person, with the cursor (i.e., the position pointer), associating the state of a thumb of the person with a mouse click signal, and associating states of the index finger or the overall hand shape with on/off states of the mouse function. The following is a more detailed description. When the user moves his/her index finger while keeping it straightened, the identification means 200 traces the position of the tip of the index finger and generates and outputs a position signal corresponding to the movement of the index finger. The display means 400 displays the position pointer at a position corresponding to the position of the moving index finger. When the user initially straightens the thumb and then sequentially bends and straightens it while keeping the palm facing the CCD camera, the identification means 200 regards this action as pressing the left button of the mouse and generates and outputs a left click signal. When the user initially straightens the thumb and then sequentially bends and straightens it while keeping the back of the hand facing the CCD camera, the identification means 200 regards this action as pressing the right button of the mouse and generates and outputs a right click signal. If the user performs an action of sequentially pressing and releasing with the index finger, as if sequentially pressing and releasing a button, without moving the hand or if the user straightens all the fingers or bends the index finger without moving the hand, then the identification means 200 turns off the mouse function. On the other hand, if the user repeats the action of pressing and releasing with the index finger or if the user bends the middle, ring, and little fingers and straightens the index finger, then the identification means 200 turns on the mouse function.
<82> According to this method, if it is necessary to move the cursor far away from the current position on the computer screen, the user repeats the following actions. The user moves the tip of the index finger in a desired direction while keeping the index finger straightened and temporarily bends the index finger and then straightens it again to deactivate the cursor function. After moving the hand in the direction opposite to the desired direction, the user repeats the action of bending and straightening the index finger and then moves the cursor in the desired direction. The user can move the position pointer to a distant location by repeating these actions. The distance between the hand and the camera is a new variable that may affect the movement of the position pointer. If the hand moves at a short distance from the camera, the identification means 200 responds sensitively to the movement to allow the cursor to move to a greater distance than when the hand moves at a long distance from the camera.
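As an illustration only, and not as part of the original disclosure, the following Python sketch shows one way the mapping described above could be organised in software: the fingertip drives the pointer, a thumb bend-and-straighten is treated as a click, and the index finger toggles the mouse function. The HandState fields, the area-based distance proxy, and all thresholds are assumptions introduced for this sketch.

```python
# Minimal sketch of the gesture-to-mouse mapping of the first embodiment.
# The hand states are assumed to be supplied by a hand shape recognizer;
# field names and thresholds are illustrative, not taken from the patent.

from dataclasses import dataclass

@dataclass
class HandState:
    index_straight: bool       # index finger straightened
    thumb_bent: bool           # thumb bent toward the palm
    palm_facing_camera: bool   # palm (True) or back of hand (False) toward the camera
    fingertip_xy: tuple        # (x, y) of the index fingertip in image coordinates
    hand_area: float           # apparent hand area, used here as a distance proxy

class MouseGestureMapper:
    def __init__(self):
        self.prev_tip = None
        self.prev_thumb_bent = False
        self.mouse_on = True

    def update(self, s: HandState):
        events = []
        # Bending the index finger deactivates the cursor; straightening it re-activates it.
        self.mouse_on = s.index_straight
        if self.mouse_on and self.prev_tip is not None:
            dx = s.fingertip_xy[0] - self.prev_tip[0]
            dy = s.fingertip_xy[1] - self.prev_tip[1]
            # A hand close to the camera (large apparent area) moves the pointer farther.
            scale = max(0.5, min(3.0, s.hand_area / 5000.0))
            events.append(("MOVE", dx * scale, dy * scale))
        # A thumb bend is treated as a click; the palm orientation selects the button.
        if self.mouse_on and s.thumb_bent and not self.prev_thumb_bent:
            events.append(("LEFT_CLICK",) if s.palm_facing_camera else ("RIGHT_CLICK",))
        self.prev_tip = s.fingertip_xy
        self.prev_thumb_bent = s.thumb_bent
        return events
```

Calling update once per captured frame would yield a stream of MOVE and CLICK events that the user command processing means could consume in place of conventional mouse input.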
<83> In a method of implementing a mouse function according to another embodiment of the present invention, a mouse function is implemented by associating the tip of the index finger with the movement of the cursor as described above, associating an index finger action of pressing and releasing in the air with a left mouse button click, associating the state of the thumb with a right mouse button, and associating states of the middle, ring, and little fingers with on/off states of the mouse function. The following is a more detailed description. A description of the cursor movement is omitted here since it is similar to that of the above embodiment. When the user performs an action of sequentially pressing and releasing with the index finger, as if sequentially pressing and releasing a button, without moving the hand, the identification means 200 regards this action as pressing the left mouse button and generates and outputs a left click signal. When the user initially straightens the thumb and then sequentially bends and straightens it while keeping the palm facing the CCD camera, the identification means 200 regards this action as pressing the right mouse button and generates and outputs a right click signal. If the user straightens the middle, ring, and little fingers, then the identification means 200 turns off the mouse function (i.e., deactivates the cursor). On
the other hand, if the user bends the middle, ring, and little fingers, then the identification means 200 turns on the mouse function.
<84> The mouse function implementation method according to the present invention is not limited to the above two embodiments and may be provided in various other forms. For example, a mouse function may be implemented by associating an action of temporarily bending the thumb and then straightening it again while keeping the palm facing the CCD camera with the left mouse button and associating an action of temporarily bending the middle, ring, and little fingers and then straightening them again while keeping the palm facing the CCD camera with the right mouse button. A mouse function may also be implemented by associating the straightened state of all the five fingers with the on state of the mouse function and associating the bent state of all the five fingers with the off state of the mouse function.
<85> As shown in FIG. 2, the identification means 200 includes a hand shape extractor 210, a noise remover 220, a hand shape recognizer 230, a position signal generator 240, and a click signal generator 250.
<86> The hand shape extractor 210 extracts a hand shape from an image input from the image capture means 100 and then outputs the hand shape to the noise remover 220.
<87> The noise remover 220 removes a noise image from the hand shape image input from the hand shape extractor 210 and then outputs the hand shape image to the hand shape recognizer 230.
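Purely as an illustration of the kind of filtering the noise remover 220 could perform, the sketch below applies a simple majority filter to a binary hand mask. The window size and the vote threshold are assumptions made for this example; the patent does not specify a particular noise removal algorithm.

```python
import numpy as np

def remove_noise(mask: np.ndarray, win: int = 3, thresh: int = 5) -> np.ndarray:
    """Majority filter over a binary hand mask: a pixel is kept only if enough
    of the pixels in its win x win neighbourhood (itself included) are hand
    pixels. Window size and threshold are illustrative choices."""
    h, w = mask.shape
    padded = np.pad(mask.astype(np.uint8), win // 2)
    out = np.zeros_like(mask, dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            out[y, x] = 1 if padded[y:y + win, x:x + win].sum() >= thresh else 0
    return out
```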
<88> The method of extracting a hand shape may be implemented in various ways. FIG. 15 illustrates such a method of extracting a hand shape. One example is a method of extracting a hand shape based on the color difference between hand and background images, which will be referred to as a color difference based method. As shown in FIG. 15, the hand shape extractor 210 divides the hand image including a hand shape into sections at predetermined intervals in each of the x and y directions and then samples values of pixels at locations, where the divided sections cross each other, to form a matrix image. The hand shape extractor 210 then performs the step of separating a part, over which a hand color according to the skin color such as white, yellow, or black is distributed, from the matrix image and extracting the hand part based on the image color; the step of obtaining a binary image from the extracted hand part; and the step of removing noise from the binary image and then labeling a region in the binary image, thereby determining the region of the hand. Here, the sampling is preferably implemented according to a coarse sampling method which samples pixels of the image while skipping every few pixels. The step of extracting the hand part may be performed using a so-called 'RGB' method which converts the RGB color of a hand part where the hand is located into a binary image. More preferably, the step of extracting the hand part is performed using the following YUV method. In this YUV method, a color recognized as that of the hand part is converted into a YUV(YCbCr) color and the converted color is modeled as a color difference (CbCr) value. A specific region around which the color occurrence frequency is high is identified based on this modeling. Then, the identified region (i.e., the hand part) is extracted through mathematical calculation. This method is advantageous over the empirical method or the above-described RGB method in that it is possible to extract the desired flesh color without being affected by the Y value (i.e., the luminance).
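To make the color difference based method concrete, the following sketch coarse-samples a frame and keeps samples whose chrominance falls inside a skin-colour box in CbCr space. The BT.601 conversion is standard; the particular Cb and Cr ranges are a commonly used approximation, not values taken from the patent, and the sampling step is likewise an assumption.

```python
import numpy as np

def rgb_to_ycbcr(img: np.ndarray) -> np.ndarray:
    """Convert an RGB image of shape (H, W, 3) to YCbCr (ITU-R BT.601)."""
    img = img.astype(np.float32)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  =       0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def extract_hand_mask(img: np.ndarray, step: int = 4,
                      cb_range=(77, 127), cr_range=(133, 173)) -> np.ndarray:
    """Coarse-sample the frame every `step` pixels and keep samples whose CbCr
    value lies inside a skin-colour box, yielding a binary matrix image.
    The box is a widely used heuristic, independent of the luminance Y."""
    sampled = img[::step, ::step]                 # coarse sampling into a matrix image
    ycbcr = rgb_to_ycbcr(sampled)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    mask = ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))
    return mask.astype(np.uint8)                  # 1 = hand, 0 = background
```

Because the decision uses only the CbCr plane, the result is largely insensitive to the luminance value, which is the advantage the YUV method claims over RGB thresholding.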
<89> The hand shape may also be extracted in a different manner from the above methods. For example, the hand shape extractor 210 may perform the step of extracting the hand part from an image input from the image capture means 100 according to a contour extraction method in the following manner. The hand shape extractor 210 converts the RGB color of a hand part where the hand is located into a grey level and differentiates the converted grey level to extract a contour of the user hand, and preferably extracts a predetermined feature point (for example, the tip of the thumb or index finger) from the extracted overall contour.
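A minimal sketch of the contour extraction alternative, assuming a simple gradient threshold, is shown below. The grey conversion weights are standard; the gradient threshold and the choice of the top-most contour pixel as the fingertip feature point are illustrative assumptions.

```python
import numpy as np

def hand_contour(img_rgb: np.ndarray, thresh: float = 30.0) -> np.ndarray:
    """Convert to grey level, differentiate, and keep strong-gradient pixels
    as the hand contour. The threshold is an illustrative value."""
    grey = img_rgb.astype(np.float32) @ np.array([0.299, 0.587, 0.114])
    gx = np.zeros_like(grey)
    gy = np.zeros_like(grey)
    gx[:, 1:-1] = grey[:, 2:] - grey[:, :-2]      # horizontal difference
    gy[1:-1, :] = grey[2:, :] - grey[:-2, :]      # vertical difference
    return (np.hypot(gx, gy) > thresh).astype(np.uint8)

def top_feature_point(contour: np.ndarray):
    """Assume the top-most contour pixel is the fingertip feature point."""
    ys, xs = np.nonzero(contour)
    if ys.size == 0:
        return None
    i = int(np.argmin(ys))
    return int(xs[i]), int(ys[i])
```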
<90> On the other hand, the hand shape recognizer 230 recognizes a change in the hand shape based on the hand shape extracted through the hand shape extractor 210 and outputs a corresponding control signal to the position signal generator 240 or the click signal generator 250. FIGS. 12 to 14 illustrate one example of the recognition of the hand shape extracted according to the color difference based method described above. FIGS. 16 and 17 illustrate another example of the recognition of the hand shape extracted according to the color difference based method and the contour extraction method. Preferred methods for recognizing the hand shape according to the present invention will now be described with reference to these figures.
<91> First, a hand shape recognition method according to an embodiment of the color difference based method is described with reference to FIGS. 12 and 13. The hand shape recognizer 230 checks brightness levels of the extracted hand shape image while scanning it row by row from the top to the bottom. A region included in the hand exhibits a high brightness level and a region not included in the hand exhibits a low brightness level. Accordingly, the hand shape recognizer 230 recognizes the hand as being fully straightened if the length of a line of pixels exhibiting a high brightness level in a scanned row gradually increases as pixels are scanned row by row and recognizes only one finger (i.e., the index finger) of the hand as being straightened if the length of the line of pixels sharply increases. More specifically, in the case of FIG. 12 where the hand is straightened, a line of pixels with high brightness representing the hand (hereinafter referred to as a pixel line) is detected first at a region 501. The length of the pixel line is increased at a region 502 compared to that of the region 501 by about 100%. Thereafter, the length of the pixel line is increased at a region 503 compared to that of the region 502 by about 50%. The length of the pixel line is increased at a region 504 compared to that of the region 503 by about 25%. The hand shape recognizer 230 recognizes the hand shape as being straightened if the length of the line of pixels with high brightness is gradually increased in this manner. In the case of FIG. 13 where only one finger is straightened, the length of the pixel line at a region 511 is 0 and pixels with high brightness are scanned first at a region 512. Thereafter, the length of the pixel line is not increased by more than 10% at each region 513 to 515 and the length of the pixel line is sharply increased by about 200% or more at a region 516. The hand shape recognizer 230 recognizes the hand shape as having only one straightened finger if the length of the line of pixels with high brightness is sharply increased in this manner. In the case of FIG. 14 where all the fingers are bent, the length of the pixel line is zero until a region 521 is reached and the length of the pixel line is sharply increased at a region 522. Thereafter, the length of the pixel line is almost not increased at a region 523, compared to that of the region 522, and the length of the pixel line is increased at a region 524 by only about 20%. Thereafter, the length of the pixel line is almost not increased at a region 525. The hand shape recognizer 230 recognizes the hand shape as being closed into a fist if the length of the line of pixels with high brightness is almost not changed after it is sharply increased from zero in this manner.
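The row-scanning criterion above can be summarised in a few lines of code. The sketch below is only an illustration: the growth thresholds are chosen to mimic the percentages quoted in the text, and the synthetic example at the end is hypothetical.

```python
import numpy as np

def classify_hand(mask: np.ndarray) -> str:
    """Classify the hand from how the per-row 'pixel line' length grows while
    scanning the binary mask from top to bottom. Thresholds are illustrative."""
    widths = mask.sum(axis=1).astype(float)
    widths = widths[widths > 0]                    # rows that actually contain the hand
    if widths.size < 4:
        return "no hand"
    growth = widths[1:] / widths[:-1]              # row-to-row growth of the pixel line
    sharp_rows = np.nonzero(growth >= 3.0)[0]      # rows where the line roughly triples
    if sharp_rows.size == 0:
        return "open hand"                         # the line only grows gradually
    # A long thin run before the jump means only the index finger is straightened;
    # a jump right at the top followed by little change means a closed fist.
    return "one finger straightened" if sharp_rows[0] >= 3 else "fist"

if __name__ == "__main__":
    m = np.zeros((20, 20), dtype=np.uint8)
    m[2:10, 9:11] = 1                              # thin finger
    m[10:18, 4:16] = 1                             # wide palm
    print(classify_hand(m))                        # -> "one finger straightened"
```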
<92> Next, a hand shape recognition method according to another embodiment of the color difference based method is described with reference to FIGS. 16 and 17. In this method, the hand shape recognizer 230 measures the area of a hand shape that has been converted into a matrix through the coarse sampling method described above and recognizes the state of the hand shape based on a change in the measured area. More specifically, the hand shape recognizer 230 adds a number "1" to each pixel representing the hand in the matrix hand shape to create a graph with respect to the x and y axes and then determines that coordinates added with "1" in this graph correspond to positions on the hand. After determining these on-hand positions in this manner, the hand shape recognizer 230 performs more detailed sampling of the hand part where the hand is located and recreates a graph to determine the area of the hand. After measuring the area of the hand, the hand shape recognizer 230 defines a reference region for determining whether the middle, ring, and little fingers are straightened or bent, based on the measured hand area. The reference region corresponds to an upper portion in the determined on-hand positions as shown in FIG. 16. If the area of the hand measured within the reference region is close to a preset hand area when all the four
fingers are straightened, the hand shape recognizer 230 determines that the middle, ring, and little fingers are straightened. If the hand area measured within the reference region is less than the preset hand area, the hand shape recognizer 230 determines that the middle, ring, and little fingers are bent. When it is determined that the middle, ring, and little fingers are bent, it is necessary to measure the area of the index finger. Before this measurement, a reference region used to determine whether or not the thumb has been bent is determined in the same manner as the above reference region setting method. Assuming that the area of the thumb is similar to that of the index finger, it is determined that the thumb is straightened if the measured area of the thumb is larger than the measured area of the index finger while it is determined that the thumb is bent if the measured area of the thumb is less than that of the index finger.
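For illustration, the area-based criterion just described might be coded along the following lines. The use of the upper third of the hand as the reference region and the 0.6 ratio are assumptions made for this sketch; the patent only requires comparison against a preset area.

```python
import numpy as np

def finger_state_by_area(mask: np.ndarray, preset_open_area: float) -> dict:
    """Compare the hand area inside an upper reference region with a preset
    area measured once with the four fingers straightened. Region size and
    ratio are illustrative assumptions."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return {"hand": False}
    top, bottom = int(ys.min()), int(ys.max())
    ref = mask[top: top + max(1, (bottom - top) // 3), :]   # upper reference region
    upper_area = int(ref.sum())
    straight = upper_area >= 0.6 * preset_open_area
    return {"hand": True,
            "upper_area": upper_area,
            "middle_ring_little_straight": bool(straight)}
```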
<93> Next, a hand shape recognition method according to another embodiment of the color difference based method is described with reference to FIGS. 16 and 17. The hand shape recognizer 230 determines positions on the hand in the same manner as described above and separately stores left, right, tip, and wrist portions of the hand. The hand shape recognizer 230 recognizes the state of the hand shape by comparing the brightness levels of the separately stored hand shape with those of a previously stored hand shape (template).
<94> Next, a hand shape recognition method according to an embodiment of the contour extraction method is described with reference to FIGS. 16 and 17. The hand shape recognizer 230 determines positions on the hand in the same manner as described above and separately stores left, right, tip, and wrist portions of the hand. Since the hand is represented by "1s", the hand shape recognizer 230 counts "1s" while scanning the upper portion of the positions on the hand, which corresponds to its finger region, row by row from the top to the bottom. Based on the count of "1s", the hand shape recognizer 230 determines the state of the hand, i.e., whether the middle, ring, and little fingers are bent or straightened. In the same manner, the hand shape recognizer 230 determines whether the thumb is bent or straightened.
<95> On the other hand, the position signal generator 240 determines coordinates of a predetermined feature point (preferably, the tip of the index finger) in the hand shape recognized at the hand shape recognizer 230 and traces a change in the predetermined feature point to generate and output a position signal corresponding to the movement of the index finger to the user command processing means 300.
<96> The click signal generator 250 generates a mouse click signal corresponding to a change in a predetermined feature point (preferably, corresponding to an action of sequentially bending and straightening the thumb) in the hand shape recognized at the hand shape recognizer 230 and outputs the mouse click signal to the user command processing means 300.
<97> FIG. 3 is a flow chart illustrating a method for inputting mouse data according to the first embodiment of the present invention, which is performed at the user interface device described above with reference to FIG. 2.
<98> First, the image capture means 100 captures an image of a hand of the user in real time and outputs the generated hand image to the identification means 200 (step S11). The hand shape extractor 210 extracts a hand shape from the image input from the image capture means 100 and outputs the extracted hand shape to the hand shape recognizer 230 through the noise remover 220 (step S13). At step S15, the hand shape recognizer 230 determines whether or not an index finger is straightened in the received hand shape. If it is determined that the index finger is bent, the hand shape recognizer 230 waits until it receives an image with the index finger straightened. If it is determined that the index finger is straightened, the hand shape recognizer 230 proceeds to step S17 to output coordinates of the tip of the index finger to the position signal generator 240. At step S17, the position signal generator 240 generates a position signal using the coordinates of the tip of the index finger received from the hand shape recognizer 230 and outputs the position signal to the processing means 300. When receiving the position signal from the identification means 200, the processing means 300 controls, at step S19, the display means 400 to move the position pointer according to the position signal.
<99> On the other hand, at step S21, the hand shape recognizer 230 constantly checks whether or not a thumb has been sequentially bent and straightened in the received hand shape. If the thumb has been sequentially bent and straightened, the hand shape recognizer 230 outputs a control signal instructing generation of a click signal to the click signal generator 250. When receiving the control signal from the hand shape recognizer 230, the click signal generator 250 generates and outputs a click signal to the processing means 300 at step S23. The display means 400 displays the clicked region according to the control signal from the processing means 300 at step S25.
<100> Although the procedure from steps S21 to S25 has been described as being performed after the procedure from steps S15 to S19 for convenience of explanation, those skilled in the art will appreciate that the two procedures may be performed in parallel.
<101> In addition, the complete mouse functions can be implemented using the above method. For example, functions such as a button selection function, a drag function, an "open file" function, a drawing function, and a typing function using a keyboard created on the screen can be implemented. The button selection function is implemented by moving the cursor to point to a desired icon in a command button icon region on the screen and then bending the thumb to select the icon.
<102> The drag function is implemented by moving the cursor to point to a desired icon in a command button icon region on the screen, bending the thumb to select the icon, moving the hand again to move the command button icon to a desired position, and then straightening the thumb.
<i03> The "open file" function is implemented by sequentially bending and straightening the thumb twice after moving the cursor to a desired position. The drawing function is implemented by associating the moment when the straightened thumb is bent with the start point of a line to be drawn and associating the moment when the thumb is straightened again with the end point of the drawn line. The typing function can be implemented by creating a letter button corresponding to a keyboard on the computer screen and then clicking the letter button.
<104> FIG. 4 is a block diagram of the user command identification means in FIG. 1 for inputting character data according to the second embodiment of the present invention. The second embodiment of the present invention is exemplified by the identification means 200 implemented to input only character input data to the processing means 300.
<105> Here, a character input method of the second embodiment is briefly described. The character input method is similar to the function to write characters with a mouse in a graphic tool program or the like. For example, the user moves the index finger while keeping the thumb bent to input a stroke of a character. When the user needs to move the position of the index finger in order to input the next stroke, the user moves the position of the index finger after straightening the thumb and then inputs the stroke by moving the index finger after bending the thumb again.
<106> As shown in FIGS. 4 and 6, the identification means 200 for inputting character data includes the hand shape extractor 210, the noise remover 220, and the hand shape recognizer 230 described above in the first embodiment and further includes a character signal generator 260.
<107> The character signal generator 260 includes a stroke generation module 261, a stroke combination module 262, a pattern comparison module 263, and a pattern DB 264. The stroke generation module 261 determines coordinates of a predetermined feature point (preferably, the tip of the index finger) in the hand shape recognized at the hand shape recognizer 230 and traces a change in the predetermined feature point to generate a stroke signal corresponding to the movement of the index finger. The stroke combination module 262 combines stroke signals generated at the stroke generation module 261 to generate a character pattern. The pattern DB 264 stores patterns of various characters. The pattern comparison module 263 compares the character pattern output from the stroke combination module 262 with the patterns stored in the pattern DB 264. If the comparison result is that a similar character pattern is present in the pattern DB 264, the pattern comparison module 263 generates and outputs a corresponding character signal to the processing means 300.
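The interplay of the four modules can be illustrated with a small sketch. This is not the patented implementation: the two-entry pattern DB, the resampled-path distance used for matching, and the omission of scale and position normalisation are all simplifying assumptions for this example.

```python
import numpy as np

def resample(path, n=32):
    """Resample a list of (x, y) points to n points evenly spaced along the path."""
    pts = np.asarray(path, dtype=np.float32)
    d = np.r_[0, np.cumsum(np.hypot(*np.diff(pts, axis=0).T))]
    if d[-1] == 0:
        return np.repeat(pts[:1], n, axis=0)
    t = np.linspace(0, d[-1], n)
    return np.stack([np.interp(t, d, pts[:, 0]), np.interp(t, d, pts[:, 1])], axis=1)

class CharacterSignalGenerator:
    """Toy version of modules 261-264: thumb bent = stroke in progress,
    thumb straightened = stroke finished; finished strokes are combined and
    matched against a small pattern DB by mean point distance."""
    def __init__(self, pattern_db):
        self.pattern_db = {ch: resample(p) for ch, p in pattern_db.items()}
        self.strokes, self.current = [], []

    def on_sample(self, tip_xy, thumb_bent):
        if thumb_bent:
            self.current.append(tip_xy)            # pen down: extend the current stroke
        elif self.current:
            self.strokes.append(self.current)      # pen up: close the stroke
            self.current = []

    def finish_character(self):
        if not self.strokes:
            return None
        combined = [p for stroke in self.strokes for p in stroke]
        query = resample(combined)
        best = min(self.pattern_db,
                   key=lambda ch: np.mean(np.hypot(*(query - self.pattern_db[ch]).T)))
        self.strokes = []
        return best

# Hypothetical usage with a two-entry toy pattern DB:
gen = CharacterSignalGenerator({"L": [(0, 0), (0, 2), (1, 2)],
                                "7": [(0, 0), (1, 0), (0.3, 2)]})
for x, y, thumb in [(0, 0, True), (0, 1, True), (0, 2, True), (1, 2, True), (1, 2, False)]:
    gen.on_sample((x, y), thumb)
print(gen.finish_character())                      # -> "L"
```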
<108> FIG. 6 is a flow chart illustrating a method for inputting character data according to the second embodiment of the present invention.
<109> First, the image capture means 100 captures an image of the hand of the user in real time and outputs the generated hand image to the identification means 200 (step S31). The hand shape extractor 210 extracts a hand shape from the image received from the image capture means 100 and outputs the extracted hand shape to the hand shape recognizer 230 through the noise remover 220 (step S33). A detailed description of the processes for extracting and recognizing the hand shape is omitted here since the processes are similar to those of the first embodiment described above.
<110> On the other hand, at step S35, the hand shape recognizer 230 checks whether or not the index finger is straightened in the same manner as described above in the first embodiment. If the index finger is not straightened, the hand shape recognizer 230 waits until the user straightens the index finger to activate a character input mode. On the other hand, if the index finger is straightened, the hand shape recognizer 230 checks, at step S37, whether or not the thumb is bent. A detailed description of the method for determining the shapes of the thumb and index fingers is omitted here since it is similar to that of the first embodiment.
<111> If the thumb is bent, the hand shape recognizer 230 outputs the coordinates of the tip of the index finger while the thumb is bent to the stroke generation module 261. At step S39, the stroke generation module 261 generates a stroke signal using the received coordinates of the tip of the index finger. At step S41, the stroke combination module 262 combines generated stroke signals to generate a character pattern. That is, since bending of the thumb represents the start point of a stroke and straightening of the thumb represents the end point of the stroke, the stroke generation module 261 generates a stroke signal by connecting the coordinates of the tip of the index finger from the start point to the end point of the stroke.
<112> Then, at step S43, the pattern comparison module 263 compares the character pattern output from the stroke combination module 262 with patterns stored in the pattern DB 264. If the comparison result is that a similar character pattern is present in the pattern DB 264, the pattern comparison module 263 generates and outputs a corresponding character signal to the processing means 300. Then, at step S45, the processing means 300 receives the character signal and outputs a predetermined control signal, which instructs output of the corresponding character, to the display means 400 to output the character input by the user.
<113> The second embodiment of the present invention identifies the start and end points of a stroke according to the positions of the thumb. In the following modification of the embodiment, however, patterns of movement paths of the tip of the index finger when the user writes characters in the air are produced regardless of the movement of the thumb and are stored in a pattern DB. A character signal is then generated by extracting a movement path of the tip of the index finger from a hand image of the user and comparing the extracted movement path with the patterns stored in the pattern DB.
<114> FIG. 5 is a block diagram of the user command identification means in FIG. 1 for inputting character data according to the modification of the second embodiment described above.
<115> As shown in FIGS. 5 and 7, the user command identification means 200 according to the modification of the second embodiment includes a hand shape extractor 210, a noise remover 220, a hand shape recognizer 230, and a character signal generator 260. The character signal generator 260 includes a pattern generation module 265, a pattern comparison module 266, and a pattern DB 267, which are different from the components of the character signal generator of the second embodiment. A description of the functions of the hand shape extractor 210 and the noise remover 220 is omitted here since the functions are the same as described above.
<116> FIG. 7 is a flow chart illustrating a method for inputting character data according to the modification of the second embodiment.
<117> At step S55, the hand shape recognizer 230 recognizes a hand shape using received feature points and checks whether or not the index finger is straightened. If the index finger is bent, the hand shape recognizer 230 waits until it receives a hand image with the index finger straightened. If the index finger is straightened, the hand shape recognizer 230 outputs the coordinates of the tip of the index finger to the pattern generation module 265.
<118> At step S57, the pattern generation module 265 generates a pattern representing a movement path of the tip of the index finger using the coordinates received from the hand shape recognizer 230 and outputs the generated pattern to the pattern comparison module 266. The pattern generation module 265 can generate the pattern in various ways. In the simplest method, where the user inputs characters in the air one at a time at some time intervals, the pattern generation module 265 determines that the user has completed inputting one character, and thus generates a pattern, if the coordinates of the tip of the index finger remain within a specific range for a predetermined time.
<119> In another method, sections for inputting characters are displayed on the screen for the user and the position of the index finger of the user is displayed in real time on the screen. In the case where the user inputs characters in the air, the pattern generation module 265 determines that the user has completed inputting one character and generates a pattern if the coordinates of the tip of the index finger, which have been located in one section for inputting the character, move to another section for writing the next character.
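The first, dwell-based segmentation method can be illustrated with the small sketch below. The dwell radius and dwell time are assumptions chosen for this example; the patent only requires that the fingertip stay within a specific range for a predetermined time.

```python
import math
import time

class DwellSegmenter:
    """Treat a character as complete when the fingertip stays inside a small
    neighbourhood for a given time. Radius and dwell time are illustrative."""
    def __init__(self, radius_px=15, dwell_s=0.8):
        self.radius_px, self.dwell_s = radius_px, dwell_s
        self.path, self.anchor, self.anchor_t = [], None, None

    def on_sample(self, tip_xy, t=None):
        """Feed one fingertip sample; return the finished path or None."""
        t = time.monotonic() if t is None else t
        self.path.append(tip_xy)
        if self.anchor is None or math.dist(tip_xy, self.anchor) > self.radius_px:
            self.anchor, self.anchor_t = tip_xy, t      # fingertip is still moving
            return None
        if t - self.anchor_t >= self.dwell_s:           # fingertip has been still long enough
            finished, self.path = self.path[:-1], []
            self.anchor = None
            return finished
        return None
```

The returned path would then be handed to the pattern comparison module in place of the per-stroke segmentation used in the unmodified second embodiment.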
<120> At step S59, the pattern comparison module 266 compares a movement path pattern received from the pattern generation module 265 with character patterns previously stored in the pattern DB 267 and extracts character data corresponding to the matching pattern to generate a character signal and then outputs the generated character signal to the processing means 300.
<121> The pattern DB 267 functions to store patterns corresponding to characters and numbers. FIG. 18 illustrates example patterns of the characters and numbers stored in the pattern DB 267. As shown in FIG. 18, each of the characters and numbers and various types of character and number patterns ("connected character" in FIG. 18) representing the corresponding character or number are stored in association with each other in the pattern DB 267. It can be seen from FIG. 18 that the strokes of each of the character and number patterns of the present invention are connected to each other. The strokes of a character or number can be separated from each other when the character or number is written on paper with a hand. However, since the character and number patterns of the present invention are drawn in the air, the strokes of each of the character and number patterns cannot be separated so that all the strokes are connected to each other.
<122> The pattern comparison module 266 receives a pattern generated at the pattern generation module 265 and queries the pattern DB 267 for a pattern matching the received pattern. The pattern comparison module 266 then outputs, as a character signal, the code value of a character or number corresponding to the pattern closest to the received pattern to the processing means 300. At step S61, the processing means 300 receives the character signal and outputs a predetermined control signal, which instructs output of the corresponding character, to the display means 400 to output the character input by the user.
<123> The above description has been given of the first and second preferred embodiments of the present invention. FIG. 8 is a detailed block diagram illustrating the identification means in FIG. 1 for inputting a screen scroll signal according to the third preferred embodiment of the present invention and FIG. 9 is a flow chart illustrating a method for inputting a screen scroll signal according to the third preferred embodiment of the present invention.
<124> The third preferred embodiment of the present invention reads a hand shape and inputs a scroll signal to the processing means 300. The third embodiment of the present invention will now be described with reference to FIGS. 8 and 9. The identification means 200 of the third embodiment includes a hand shape extractor 210, a noise remover 220, a hand shape recognizer 230, and a scroll signal generator 270.
<125> As shown in FIGS. 19 to 23, the hand shape extractor 210 extracts, at step S103, a hand shape in the same manner as in the first and second embodiments and outputs the extracted hand shape to the hand shape recognizer 230 through the noise remover 220.
<126> At step S105, the hand shape recognizer 230 recognizes the hand shape using information input through the hand shape extractor 210. In a preferred embodiment of the present invention, the hand shape recognizer 230 recognizes the direction of the thumb as a scroll direction and outputs the thumb direction information to the scroll signal generator 270. Of course, the finger used to determine the scroll direction is not necessarily limited to the thumb.
<127> At step S107, the scroll signal generator 270 checks whether the thumb direction is upward, downward, leftward, or rightward as shown in FIGS. 19 to 23 and generates and outputs a scroll signal corresponding to the direction to the processing means 300.
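A minimal sketch of one way to derive the scroll direction from a binary hand mask is given below. It assumes, purely for illustration, that the thumb tip is the hand pixel farthest from the hand's centre of mass; the patent itself only requires that the direction of a predetermined finger be recognized.

```python
import numpy as np

def scroll_direction(mask: np.ndarray):
    """Return 'UP', 'DOWN', 'LEFT' or 'RIGHT' by comparing the assumed thumb
    tip (the hand pixel farthest from the centroid) with the hand centroid."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    cy, cx = ys.mean(), xs.mean()
    d2 = (ys - cy) ** 2 + (xs - cx) ** 2
    ty, tx = ys[d2.argmax()], xs[d2.argmax()]       # assumed thumb tip
    dy, dx = ty - cy, tx - cx
    if abs(dy) >= abs(dx):
        return "UP" if dy < 0 else "DOWN"           # image y grows downward
    return "LEFT" if dx < 0 else "RIGHT"
```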
<128> At step S109, the processing means 300 performs scrolling by outputting a predetermined control signal, which instructs scrolling of a document, a webpage, or the like according to the scroll signal, to the display means 400.
<129> In the first to third embodiments, the user command identification means 200 has been described as performing only one of the functions of the three embodiments. However, the identification means 200 is preferably implemented to perform all the functions described above.
<130> FIG. 10 is a block diagram of the user command identification means in FIG. 1 for integrating and implementing mouse and character input modes according to the fourth embodiment of the present invention and FIG. 11 is a flow chart illustrating a method for integrating and implementing mouse and character input modes according to the fourth embodiment.
<131> As shown in FIGS. 10 and 11, the identification means 200 of the fourth embodiment includes a hand shape extractor 210, a noise remover 220, a hand shape recognizer 230, a position signal generator 240, a click signal generator 250, a scroll signal generator 270, a character signal generator 260, and a mode setter 280.
<132> The functions of the hand shape extractor 210, the noise remover 220, the scroll signal generator 270, the click signal generator 250, the position signal generator 240, and the character signal generator 260 are the same as those of the first to third embodiments described above. The character signal generator 260 includes the modules implemented in the second or third embodiment. Accordingly, only the differences with the first to third embodiments will be described in the description of the fourth embodiment.
<133> Specifically, the mode setter 280 controls the operating mode of the hand shape recognizer 230 to be set to one of the character mode, the position movement mode, the click mode, and the scroll mode based on the state of the hand shape output from the noise remover 220 or based on a mode setting key signal input by operating a predetermined key/button. This mode setting function may also be performed at the hand shape recognizer 230. That is, the mode setter 280 (or the hand shape recognizer 230) controls the scroll signal generator 270 when a hand shape as shown in FIGS. 19 to 23 has been input through the hand shape extractor 210 or when a scroll setting key signal has been input through a key/button. The setting mode may also be switched through any specific hand shape, although it is preferable that the setting mode be switched through a predetermined key/button when the keyboard and mouse functions are performed through similar hand shapes as described above in the first and second embodiments. For example, the data input mode may be switched from the keyboard to the mouse or from the mouse to the keyboard through an action of curling all the fingers into a fist and straightening them in turn.
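The following minimal sketch illustrates only this fist-then-open toggle between the keyboard and mouse input modes; the per-frame shape labels ('fist', 'open') and the small state machine are assumptions, since the embodiment describes the gesture but not a concrete detection scheme.

```python
# Hypothetical sketch of the keyboard/mouse mode switch: curling all fingers
# into a fist and then straightening them toggles the data input mode.
class ModeSetter:
    def __init__(self):
        self.mode = "mouse"      # current data input mode
        self._saw_fist = False   # remembers the "all fingers bent" frame

    def update(self, hand_shape: str) -> str:
        """hand_shape is a per-frame label such as 'fist' or 'open'
        produced by the hand shape recognizer (assumed labels)."""
        if hand_shape == "fist":
            self._saw_fist = True
        elif hand_shape == "open" and self._saw_fist:
            # fist followed by an open hand: toggle keyboard <-> mouse
            self.mode = "keyboard" if self.mode == "mouse" else "mouse"
            self._saw_fist = False
        return self.mode
```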
<134> A user interface method according to the fourth embodiment of the present invention will now be described based on the configuration described above. The image capture means 100 captures an image of a hand of the user in real time and outputs the generated hand image to the identification means 200 (step S111). The hand shape extractor 210 extracts a hand shape from the image input from the image capture means 100 and outputs the extracted hand shape to the hand shape recognizer 230 through the noise remover 220 (step S113).
<135> Then, at step S115, the hand shape recognizer 230 determines which data input mode has been set. The hand shape recognizer 230 performs steps S117 and S119 if it is determined that the set input mode is the character mode, performs steps S125 and S127 if it is the position movement mode, performs steps S131 and S133 if it is the click mode, and performs steps S137 and S139 if it is the scroll mode. Here, the position movement mode and the click mode represent the function of the first embodiment, the character mode represents the function of the second embodiment or its modification, and the scroll mode represents the function of the third embodiment.
<136> That is, the hand shape recognizer 230 performs one of the functions of the hand shape recognizer of the first to third embodiments according to the input hand shape information and mode control signal and outputs corresponding information to one of the scroll signal generator 270, the click signal generator 250, the position signal generator 240, and the character signal generator 260 to generate and output a scroll signal (steps S137 and S139), a position signal or a click signal (steps S125 and S127 or steps S131 and S133), or a character signal (steps S117 and S119).
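As a sketch of the branching at step S115, the routing below sends the recognizer's per-frame output to one of the four signal generators according to the current data input mode; the generator interface (a generate() method) and the frame argument are assumptions made only for illustration.

```python
# Hypothetical dispatch for the fourth embodiment (step S115 onwards).
def dispatch(mode: str, frame, generators: dict):
    """generators holds 'character', 'position', 'click' and 'scroll'
    entries, mirroring the block diagram of FIG. 10."""
    if mode == "character":      # steps S117 and S119
        return generators["character"].generate(frame)
    if mode == "position":       # steps S125 and S127
        return generators["position"].generate(frame)
    if mode == "click":          # steps S131 and S133
        return generators["click"].generate(frame)
    if mode == "scroll":         # steps S137 and S139
        return generators["scroll"].generate(frame)
    raise ValueError(f"unknown data input mode: {mode}")
```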
<137> The above description has been given of the user interface device and method according to the preferred embodiments of the present invention. Those skilled in the art will appreciate that various modifications of the above embodiments may be made within the scope and spirit of the present invention described above.
<138> FIGS. 24 to 26 illustrate examples of implementation of the user interface device according to the preferred embodiments of the present invention. FIG. 24 illustrates an example in which the user interface device of the present invention is implemented for a PDA, FIG. 25 illustrates an example in which the user interface device of the present invention is applied in place of a television remote controller, and FIG. 26 illustrates an example in which the user interface device of the present invention is applied to a desktop computer. Devices for which the user interface device of the present invention is implemented are not limited to the electronic devices described with reference to FIGS. 24 to 26. The user interface device of the present invention may be applied to a mobile communication terminal, an electronic blackboard, and the like and may also be applied to an automatic teller machine (ATM) and a kiosk device to replace their touch screen functions. The mobile communication terminal is a wireless communication device with portability and mobility and may be not only the Personal Digital Assistant (PDA) described above but also any type of handheld wireless communication device such as a Personal Communication System (PCS), a Personal Digital Cellular (PDC), a Personal Handyphone System (PHS), an International Mobile Telecommunication (IMT)-2000 communication terminal, a Digital Multimedia Broadcasting (DMB) phone, a smart phone, and a WiBro phone.
<139> The user interface method of the present invention can be embodied as computer readable code on a computer readable recording medium. The computer readable recording medium includes any data storage device that stores data which can be read by a computer system. Examples of the computer readable recording medium include a read-only memory (ROM), a random-access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, a flash memory, and an optical data storage device. The computer readable recording medium can also be embodied in the form of carrier waves (for example, signals transmitted over the Internet). The computer readable recording medium can also be distributed over a network of coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
[Industrial Applicability]
<140> Although the present invention has been described above with reference to its preferred embodiments, those skilled in the art will appreciate that the present invention may be implemented in other modified forms without departing from the essential characteristics of the present invention. Therefore, the embodiments described above should be construed in illustrative aspects rather than in restrictive aspects. The scope of the invention is represented by the appended claims, not by the above description, and all changes coming within the equivalent range of the appended claims should be construed as being included in the invention.

Claims

[CLAIMS] [Claim 1] <142> A user interface device using hand gesture recognition, the user interface device comprising: <143> image capture means for capturing a hand of a user, the hand moving in the air; and
<144> user command identification means for generating a data input signal corresponding to mouse or keyboard input based on the movement of the hand captured by the image capture means and outputting the generated data input signal to user command processing means that processes mouse or key input.
[Claim 2] <145> The user interface device according to claim 1, wherein the user command identification means includes: <146> a hand shape extractor for extracting a hand shape from an image received from the image capture means; and
<147> a hand shape recognizer for recognizing a change and movement of the hand shape extracted from the hand shape extractor.
[Claim 3]
<148> The user interface device according to claim 2, wherein the user command identification means further includes a noise remover for removing noise from a hand shape image extracted at the hand shape extractor and then outputting the hand shape image to the hand shape recognizer.
[Claim 4]
<149> The user interface device according to claim 2, wherein the hand shape extractor extracts a hand shape from the image received from the image capture means using a color difference between the hand and a background.
[Claim 5]
<150> The user interface device according to claim 2, wherein the hand shape extractor extracts a hand contour from the image received from the image capture means.
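For illustration of claims 4 and 5 only: one conventional way to extract a hand shape by color difference and then take its contour is skin-color thresholding, sketched below. The claims name no particular library or color space; OpenCV, the YCrCb band, and the threshold values are assumptions.

```python
# Hypothetical extraction sketch (claims 4 and 5): segment by skin color,
# then take the largest contour as the hand shape. Requires OpenCV 4.x.
import cv2
import numpy as np

def extract_hand_shape(frame_bgr: np.ndarray):
    """Return (mask, contour) for the largest skin-colored region."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    # crude skin-color band in Cr/Cb, exploiting the color difference
    # between the hand and the background (threshold values are assumed)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return mask, None
    return mask, max(contours, key=cv2.contourArea)
```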
[Claim 6] <151> The user interface device according to claim 2, wherein the hand shape recognizer turns on or off a mouse function according to an action of sequentially bending and straightening an index finger.
[Claim 7]
<152> The user interface device according to claim 2, wherein the hand shape recognizer turns off a mouse function when an index finger is bent and turns on the mouse function when the index finger is straightened.
[Claim 8]
<153> The user interface device according to claim 2, wherein the hand shape recognizer turns on or off a mouse function according to an action of sequentially bending and straightening middle, ring, and little fingers.
[Claim 9]
<154> The user interface device according to claim 2, wherein the hand shape recognizer turns on a mouse function when middle, ring, and little fingers are bent and turns off the mouse function when the middle, ring, and little fingers are straightened.
[Claim 10]
<155> The user interface device according to claim 2, wherein the hand shape recognizer turns off a mouse function when five fingers are all bent.
[Claim 11]
<156> The user interface device according to claim 2, wherein the hand shape recognizer measures an area of the hand shape extracted at the hand shape extractor and recognizes a state of the hand shape based on a change in the measured area.
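A minimal sketch of the area-based state recognition in claim 11: the hand-shape area is measured each frame, and a sharp change is read as fingers bending or straightening. The ratio threshold and the state labels are assumed values, not taken from the patent.

```python
# Hypothetical area-change state recognition (claim 11).
def recognize_hand_state(prev_area: float, curr_area: float,
                         shrink_ratio: float = 0.6) -> str:
    """A closed hand covers noticeably less image area than an open hand,
    so a sharp drop in measured area is read as bent fingers."""
    if prev_area <= 0:
        return "unknown"
    ratio = curr_area / prev_area
    if ratio < shrink_ratio:
        return "fingers_bent"
    if ratio > 1.0 / shrink_ratio:
        return "fingers_straightened"
    return "unchanged"
```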
[Claim 12]
<157> The user interface device according to claim 2, wherein the user command identification means includes at least one of:
<158> a position signal generator for tracing a movement path of the hand shape recognized at the hand shape recognizer and generating and outputting a mouse position signal corresponding to the traced movement path to the user command processing means; <159> a click signal generator for generating a mouse click signal corresponding to a change in the hand shape recognized at the hand shape recognizer and outputting the mouse click signal to the user command processing means;
<160> a character signal generator for tracing a movement path of the hand shape recognized at the hand shape recognizer and generating and outputting a character signal corresponding to the traced movement path to the user command processing means; and
<161> a scroll signal generator for generating a mouse scroll signal corresponding to the hand shape recognized at the hand shape recognizer and outputting the mouse scroll signal to the user command processing means.
[Claim 13]
<162> The user interface device according to claim 12, wherein the position signal generator traces a movement path of an index finger.
[Claim 14]
<163> The user interface device according to claim 12, wherein the click signal generator generates one of a left click signal corresponding to a left mouse button and a right click signal corresponding to a right mouse button according to an action of sequentially bending and straightening a thumb with a palm facing the image capture means.
[Claim 15]
<164> The user interface device according to claim 12, wherein the click signal generator generates one of a left click signal corresponding to a left mouse button and a right click signal corresponding to a right mouse button according to an action of sequentially bending and straightening a thumb with a back of the hand facing the image capture means.
[Claim 16]
<165> The user interface device according to claim 12, wherein the click signal generator generates one of a left click signal corresponding to a left mouse button and a right click signal corresponding to a right mouse button according to an action of sequentially bending and straightening an index finger.
[Claim 17]
<166> The user interface device according to claim 12, wherein the click signal generator generates one of a left click signal corresponding to a left mouse button and a right click signal corresponding to a right mouse button according to an action of sequentially bending and straightening middle, ring, and little fingers.
[Claim 18]
<167> The user interface device according to claim 12, wherein the character signal generator traces a movement path of an index finger.
[Claim 19]
<168> The user interface device according to claim 12, wherein the scroll signal generator generates a scroll signal according to a direction of a predetermined finger in a state of the hand shape recognized at the hand shape recognizer.
[Claim 20]
<169> The user interface device according to any one of claims 12 to 19, wherein the user command identification means further includes:
<170> a mode setter for controlling one of the position signal, the character signal, the click signal, and the scroll signal to be output to the user command processing means, based on a change in the hand shape extracted at the hand shape extractor or based on a mode setting key signal input by operating a predetermined key button.
[Claim 21]
<171> The user interface device according to claim 20, wherein the character signal generator includes:
<172> a stroke generation module for tracing a movement path of a predetermined feature point in the hand shape recognized at the hand shape recognizer and generating a stroke corresponding to the traced movement path;
<173> a stroke combination module for combining strokes generated at the stroke generation module to generate a character pattern; <174> a pattern DB in which patterns of various characters are stored; and <175> a pattern comparison module for generating a character signal corresponding to the character pattern generated at the stroke combination module when the character pattern generated at the stroke combination module is present in the pattern DB.
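As a sketch of the module structure in claim 21, the class below combines traced strokes into a pattern and looks it up in a pattern DB; the stroke encoding (single direction letters) and the DB contents are assumptions for illustration only.

```python
# Hypothetical stroke-combination and pattern-comparison sketch (claim 21).
class CharacterSignalGenerator:
    def __init__(self, pattern_db: dict):
        # pattern_db maps a combined stroke pattern to a character,
        # e.g. {"RD": "7"} with R = rightward stroke, D = downward stroke
        self.pattern_db = pattern_db
        self.strokes = []

    def add_stroke(self, stroke: str) -> None:
        """Stroke generation module output: one stroke per traced path."""
        self.strokes.append(stroke)

    def finish_character(self):
        """Stroke combination + pattern comparison: returns the matching
        character, or None if the pattern is not present in the DB."""
        pattern = "".join(self.strokes)
        self.strokes = []
        return self.pattern_db.get(pattern)
```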
[Claim 22] <176> The user interface device according to claim 20, wherein the character signal generator includes: <177> a pattern generation module for tracing a movement path of a predetermined feature point in the hand shape recognized at the hand shape recognizer and generating a movement path pattern corresponding to the traced movement path;
<178> a pattern DB in which various character patterns are stored; and <179> a pattern comparison module for generating a character signal corresponding to a character pattern, matching the movement path pattern, included in the character patterns stored in the pattern DB.
[Claim 23] <180> A user interface method using hand gesture recognition, the method comprising the steps of: <181> a) extracting a hand shape from an image captured by image capture means; <182> b) recognizing a change and movement of the hand shape extracted at the step a), tracing a movement path of the recognized hand, generating a mouse position signal corresponding to the traced movement path, and generating a mouse click signal corresponding to the change in the extracted hand shape; and <183> c) outputting the click signal and the position signal generated at the step b) to user command processing means that processes mouse input.
[Claim 24] <184> The user interface method according to claim 23, wherein the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
[Claim 25] <185> The user interface method according to claim 23, wherein the step a) includes extracting a hand contour from the image received from the image capture means.
[Claim 26] <186> The user interface method according to claim 23, wherein the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
[Claim 27] <187> A user interface method using hand gesture recognition, the method comprising the steps of: <188> a) extracting a hand shape from an image captured by image capture means; <189> b) recognizing a change and movement of the hand shape extracted at the step a), tracing a movement path of the recognized hand, and generating a stroke corresponding to the traced movement path; <190> c) combining strokes generated at the step b) to generate a character pattern and generating a character signal corresponding to the generated character pattern; and <191> d) outputting the character signal generated at the step c) to user command processing means that processes keyboard input.
[Claim 28] <192> The user interface method according to claim 27, wherein the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
[Claim 29] <193> The user interface method according to claim 27, wherein the step a) includes extracting a hand contour from the image received from the image capture means.
[Claim 30] <194> The user interface method according to claim 27, wherein the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
[Claim 31]
<195> A user interface method using hand gesture recognition, the method comprising the steps of:
<196> a) extracting a hand shape from an image captured by image capture means;
<197> b) recognizing a change and movement of the hand shape extracted at the step a), tracing a movement path of the recognized hand, generating a movement path pattern corresponding to the traced movement path, and comparing the generated movement path pattern with previously stored character patterns;
<198> c) generating a character signal corresponding to a character pattern, matching the movement path pattern, included in the previously stored character patterns; and
<199> d) outputting the character signal generated at the step c) to user command processing means that processes keyboard input.
[Claim 32]
<200> The user interface method according to claim 31, wherein the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
[Claim 33]
<201> The user interface method according to claim 31, wherein the step a) includes extracting a hand contour from the image received from the image capture means.
[Claim 34]
<202> The user interface method according to claim 31, wherein the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
[Claim 35]
<203> A user interface method using hand gesture recognition, the method comprising the steps of: <204> a) extracting a hand shape from an image captured by image capture means; <205> b) recognizing a change and movement of the hand shape extracted at the step a) and generating a scroll signal, instructing scrolling in a predetermined direction, based on the recognized hand shape if the recognized hand shape is a predetermined specific shape; and <206> c) outputting the scroll signal generated at the step b) to user command processing means that processes mouse input.
[Claim 36] <207> The user interface method according to claim 35, wherein the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
[Claim 37] <208> The user interface method according to claim 35, wherein the step a) includes extracting a hand contour from the image received from the image capture means.
[Claim 38] <209> The user interface method according to claim 35, wherein the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
[Claim 39] <210> A user interface method using hand gesture recognition, the method comprising the steps of: <211> a) extracting a hand shape from an image captured by image capture means; <212> b) recognizing a change and movement of the hand shape extracted at the step a); <213> c) setting a data input mode based on a change in the hand shape extracted at the step b) or based on a mode setting key signal input by operating a predetermined key button; <214> d) generating one of a mouse position signal, a mouse click signal, a mouse scroll signal, and a keyboard character signal corresponding to the change and movement of the hand shape recognized at the step b), based on the data input mode set at the step c); and <215> e) outputting the signal generated at the step d) to user command processing means that processes mouse or keyboard input.
[Claim 40] <216> The user interface method according to claim 39, wherein the step a) includes extracting a hand shape from the image received from the image capture means using a color difference between the hand and a background.
[Claim 41] <217> The user interface method according to claim 39, wherein the step a) includes extracting a hand contour from the image received from the image capture means.
[Claim 42] <218> The user interface method according to claim 39, wherein the step b) includes measuring an area of the hand shape extracted at the step a) and recognizing a state of the hand shape based on a change in the measured area.
PCT/KR2007/000846 2006-02-20 2007-02-20 Method and apparatus for user-interface using the hand trace WO2007097548A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR20060016036 2006-02-20
KR10-2006-0016036 2006-02-20
KR20060095754 2006-09-29
KR10-2006-0095754 2006-09-29
KR1020070015439A KR100858358B1 (en) 2006-09-29 2007-02-14 Method and apparatus for user-interface using the hand trace
KR10-2007-0015439 2007-02-14

Publications (1)

Publication Number Publication Date
WO2007097548A1 true WO2007097548A1 (en) 2007-08-30

Family

ID=38437558

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2007/000846 WO2007097548A1 (en) 2006-02-20 2007-02-20 Method and apparatus for user-interface using the hand trace

Country Status (1)

Country Link
WO (1) WO2007097548A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1996034332A1 (en) * 1995-04-28 1996-10-31 Matsushita Electric Industrial Co., Ltd. Interface device
KR19990061763A (en) * 1997-12-31 1999-07-26 윤종용 Method and device for interface between computer and user using hand gesture
KR20020052217A (en) * 2000-12-25 2002-07-03 가나이 쓰토무 Electronics device applying an image sensor
JP2003346162A (en) * 2002-05-28 2003-12-05 Japan Science & Technology Corp Input system by image recognition of hand
KR20050092611A (en) * 2004-03-16 2005-09-22 김성규 Apparatus and method for control using chase information of the object

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2040156A3 (en) * 2007-09-19 2013-01-16 Sony Corporation Image processing for issuing commands
US8643598B2 (en) 2007-09-19 2014-02-04 Sony Corporation Image processing apparatus and method, and program therefor
WO2009128064A3 (en) * 2008-04-14 2010-01-14 Pointgrab Ltd. Vision based pointing device emulation
WO2009128064A2 (en) * 2008-04-14 2009-10-22 Pointgrab Ltd. Vision based pointing device emulation
WO2010064094A1 (en) * 2008-12-01 2010-06-10 Sony Ericsson Mobile Communications Ab Portable electronic device with split vision content sharing control and method
JP2012510659A (en) * 2008-12-01 2012-05-10 ソニー エリクソン モバイル コミュニケーションズ, エービー Portable electronic device and method with shared visual content sharing control function
US8588467B2 (en) 2009-06-25 2013-11-19 Samsung Electronics Co., Ltd. Apparatus and method for detecting hands of subject in real time
GB2474536A (en) * 2009-10-13 2011-04-20 Pointgrab Ltd Computer vision gesture based control by hand shape recognition and object tracking
GB2474536B (en) * 2009-10-13 2011-11-02 Pointgrab Ltd Computer vision gesture based control of a device
US8693732B2 (en) 2009-10-13 2014-04-08 Pointgrab Ltd. Computer vision gesture based control of a device
US8666115B2 (en) 2009-10-13 2014-03-04 Pointgrab Ltd. Computer vision gesture based control of a device
EP2475183A1 (en) * 2011-01-06 2012-07-11 Samsung Electronics Co., Ltd. Display apparatus controlled by motion and motion control method thereof
US9513711B2 (en) 2011-01-06 2016-12-06 Samsung Electronics Co., Ltd. Electronic device controlled by a motion and controlling method thereof using different motions to activate voice versus motion recognition
US9398243B2 (en) 2011-01-06 2016-07-19 Samsung Electronics Co., Ltd. Display apparatus controlled by motion and motion control method thereof
CN102681658A (en) * 2011-01-06 2012-09-19 三星电子株式会社 Display apparatus controlled by motion and motion control method thereof
US9182838B2 (en) 2011-04-19 2015-11-10 Microsoft Technology Licensing, Llc Depth camera-based relative gesture detection
JP2013015877A (en) * 2011-06-30 2013-01-24 Nakayo Telecommun Inc Data input method by virtual mouse
US9030487B2 (en) 2011-08-01 2015-05-12 Lg Electronics Inc. Electronic device for displaying three-dimensional image and method of using the same
CN103192398A (en) * 2012-01-04 2013-07-10 三星电子株式会社 Method for controlling robot hand
US9545717B2 (en) 2012-01-04 2017-01-17 Samsung Electronics Co., Ltd. Robot hand and humanoid robot having the same
JP2013143082A (en) * 2012-01-12 2013-07-22 Fujitsu Ltd Finger position detection device, finger position detection method and finger position detection computer program
US8938124B2 (en) 2012-05-10 2015-01-20 Pointgrab Ltd. Computer vision based tracking of a hand
CN103425244A (en) * 2012-05-16 2013-12-04 意法半导体有限公司 Gesture recognition
US8866781B2 (en) 2012-05-21 2014-10-21 Huawei Technologies Co., Ltd. Contactless gesture-based control method and apparatus
EP2631739A4 (en) * 2012-05-21 2013-09-11 Huawei Tech Co Ltd Contactless gesture-based control method and apparatus
EP2631739A2 (en) * 2012-05-21 2013-08-28 Huawei Technologies Co., Ltd. Method and device for contact-free control by hand gesture
CN103093196A (en) * 2013-01-14 2013-05-08 大连理工大学 Character interactive input and recognition method based on gestures
CN104102340A (en) * 2013-04-15 2014-10-15 欧姆龙株式会社 Gesture recognition device, gesture recognition method, and electronic apparatus
US9524425B2 (en) 2013-04-15 2016-12-20 Omron Corporation Gesture recognition device, gesture recognition method, electronic apparatus, control program, and recording medium
US10168794B2 (en) 2013-05-23 2019-01-01 Fastvdo Llc Motion-assisted visual language for human computer interfaces
WO2014189685A1 (en) * 2013-05-23 2014-11-27 Fastvdo Llc Motion-assisted visual language for human computer interfaces
US9829984B2 (en) 2013-05-23 2017-11-28 Fastvdo Llc Motion-assisted visual language for human computer interfaces
US20150147730A1 (en) * 2013-11-26 2015-05-28 Lenovo (Singapore) Pte. Ltd. Typing feedback derived from sensor information
US10928924B2 (en) * 2013-11-26 2021-02-23 Lenovo (Singapore) Pte. Ltd. Typing feedback derived from sensor information
US9436872B2 (en) 2014-02-24 2016-09-06 Hong Kong Applied Science and Technology Research Institute Company Limited System and method for detecting and tracking multiple parts of an object
CN106236336A (en) * 2016-08-15 2016-12-21 中国科学院重庆绿色智能技术研究院 A kind of myoelectric limb gesture and dynamics control method
US10043387B1 (en) 2017-01-25 2018-08-07 International Business Machines Corporation Map display with directions generating and download facility
US10614710B2 (en) 2017-01-25 2020-04-07 International Business Machines Corporation Map display with directions generating and download facility
US10679500B2 (en) 2017-01-25 2020-06-09 International Business Machines Corporation Map display with directions generating and download facility
WO2019062205A1 (en) * 2017-09-29 2019-04-04 京东方科技集团股份有限公司 Electronic tablet and control method therefor, and storage medium
CN109101099A (en) * 2018-07-17 2018-12-28 Oppo广东移动通信有限公司 Prevent method, storage medium and the electronic equipment of proximity state exception
CN109189249A (en) * 2018-09-14 2019-01-11 厦门盈趣科技股份有限公司 A kind of mouse control method and mouse

Similar Documents

Publication Publication Date Title
WO2007097548A1 (en) Method and apparatus for user-interface using the hand trace
KR100858358B1 (en) Method and apparatus for user-interface using the hand trace
US11137834B2 (en) Vehicle system and method for detection of user motions performed simultaneously
US8577100B2 (en) Remote input method using fingerprint recognition sensor
KR100943792B1 (en) A device and a method for identifying movement pattenrs
CN111273778B (en) Method and device for controlling electronic equipment based on gestures
US20140123079A1 (en) Drawing control method, apparatus, and mobile terminal
CN104793731A (en) Information input method for wearable device and wearable device
WO2017114002A1 (en) Device and method for inputting one-dimensional handwritten text
US20240077948A1 (en) Gesture-based display interface control method and apparatus, device and storage medium
CN106713811A (en) Video communication method and device
CN103106388B (en) Method and system of image recognition
US20180268585A1 (en) Character input method, character input device, and wearable device
CN109471586B (en) Keycap color matching method and device and terminal equipment
CN105242776A (en) Control method for intelligent glasses and intelligent glasses
EP4307096A1 (en) Key function execution method, apparatus and device, and storage medium
CN111857334A (en) Human body gesture letter recognition method and device, computer equipment and storage medium
Yin et al. CamK: Camera-based keystroke detection and localization for small mobile devices
CN109283999B (en) Gesture interaction method and interaction system
Hartanto et al. Real time hand gesture movements tracking and recognizing system
CN110519517B (en) Copy guiding method, electronic device and computer readable storage medium
Lee et al. Virtual keyboards with real-time and robust deep learning-based gesture recognition
CN104484078A (en) Man-machine interactive system and method based on radio frequency identification
CN110990238B (en) Non-invasive visual test script automatic recording method based on video shooting
CN113282164A (en) Processing method and device

Legal Events

Date Code Title Description
DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07708996

Country of ref document: EP

Kind code of ref document: A1