WO2009116079A2 - Character based input using pre-defined human body gestures - Google Patents

Character based input using pre-defined human body gestures Download PDF

Info

Publication number
WO2009116079A2
WO2009116079A2 PCT/IN2009/000123
Authority
WO
WIPO (PCT)
Prior art keywords
signals
receiver
devices
detecting sensor
gestures
Prior art date
Application number
PCT/IN2009/000123
Other languages
French (fr)
Other versions
WO2009116079A3 (en)
Inventor
Chattopadhyay Tanushyam
Pal Arpan
Biswas Provat
Saha Biswanath
Original Assignee
Tata Consultancy Services Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tata Consultancy Services Limited
Publication of WO2009116079A2 publication Critical patent/WO2009116079A2/en
Publication of WO2009116079A3 publication Critical patent/WO2009116079A3/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language

Definitions

  • This invention relates to the field of remote controlling.
  • this invention relates to remote controlling using gestures.
  • there was a need to replace the buttons on remote controls, keyboards and mice with a user friendly and convenient system for controlling devices as well as providing text and character based input.
  • US Patent 6747632 B2 discloses a wireless control device, having a housing worn on the body of an operator and capable of detecting and interpreting gestures performed by the user.
  • the wireless device is capable of receiving input independent of position of the user or of a part of the user's body.
  • the housing includes a plurality of optical emitters and optical detectors for emitting and detecting motion or hand gestures of users in three dimensions.
  • the invention also comprises a voice recognition sensor.
  • the invention allows gesture recognition to be used not only for cursor control and character-level data entry, but also for word and phrase level data entry. The invention also has gesture recognition capabilities whereby devices can be switched on and off as the user desires.
  • An object of the present invention is to provide a single user friendly system for controlling and operating a plurality of devices.
  • Another object of the present invention is to provide a convenient system for controlling a plurality of devices by sensing and interpreting gestures performed by any member of the human body.
  • Yet another object of the present invention is to provide an inexpensive system for controlling a plurality of devices.
  • Still another object of the present invention is to provide a system which can be used by the physically challenged to conveniently provide character as well as control inputs to a plurality of devices.
  • Yet another object of the present invention is to provide a system which is easily installable.
  • Still another object of the present invention is to provide a system which will work efficiently in any room lighting condition.
  • Yet another object of the present invention is to provide a system that can be easily operated and does not require application of any skill by users.
  • Still another object of the present invention is to provide a system whose operation will be independent of the line-of-sight with the devices thus, enabling users to operate the system from any angle.
  • the invention envisages a system for controlling at least one device by means of pre-defined gestures.
  • the system typically comprises a first element which is battery operated, adapted to co-operate with the human body and is located remotely from one or more devices, and a second element co-operating with each of said one or more devices.
  • the first element includes at least one motion detecting sensor adapted to sense the motion of gestures created by a member of the human body and to convert said sensed motions to a set of digital signals, a micro-controller adapted to receive said set of digital signals and further adapted to convert said set of digital signals into a set of RF signals representing three dimensional co-ordinates, and an RF transmitter adapted to receive said set of RF signals and transmit them.
  • a motion detecting sensor adapted to sense the motion of gestures created by a member of the human body and to convert said sensed motions to a set of digital signals
  • a micro-controller adapted to receive said set of digital signals and further adapted to convert said set of digital signals into a set of RF signals representing three dimensional co-ordinates
  • an RF transmitter adapted to receive said set of RF signals and transmit them.
  • the second element comprises an RF receiver adapted to receive said transmitted set of RF signals, a repository adapted to store discrete data corresponding to sets of stored signals representing pre-defined gestures, a second micro-controller adapted to receive said transmitted set of RF signals, said second micro-controller including a comparator for comparing said transmitted set of RF signals with the sets of stored signals and generating an output in response to matching said transmitted set of RF signals with the sets of stored signals, and at least one controller adapted to receive said output and further adapted to operate at least one functionality of said one or more devices.
  • the motion detecting sensor is a MEMS (Micro-Electro-Mechanical system) accelerometer fabricated on a chip.
  • the sensor is a three axis sensor chip.
  • the sensor is adapted to be attached to any part of the human body, to be removably placed on the fingertips, to be removably worn in the form of a wrist band, to be held in the hand, to be removably worn on the arm or to be removably worn on the leg.
  • the transmitter and the receiver are an RF transmitter and an RF receiver adapted to transmit and receive data through RF wireless transmission.
  • the transmitter and the receiver are adapted to respond to frequencies in the 900 MHz to 2.4 GHz range.
  • the transmitter and the receiver are fabricated on a chip.
  • the receiver comprises a data filter adapted to smooth the transmission between two said transmitted sets of signals.
  • the data filter is a median filter.
  • the receiver also comprises a noise filter adapted to remove unwanted noise present in said transmitted set of signals.
  • the step of comparing is executed by at least chain code based recognition and by un-directional un-weighted graph based recognition.
  • Figure 1 is a simplified schematic of the device containing an RF transmitter
  • Figure 2 is a structural diagram of a receiver apparatus containing an RF receiver to be used in conjunction with the transmitter of figure 1
  • Figure 3 is a structural diagram showing the methodology of controlling at least one device by means of pre-defined gestures.
  • a wireless device for automatic recognition of 'gesture based input' for controlling a plurality of devices has been envisaged.
  • the wireless device helps users select different menus by moving their hand in the air.
  • users can also send alphanumeric characters to the host devices by just writing the characters in air.
  • volume and television channels can be changed by some specific gesture with the help of the present invention.
  • users by moving their hand in the air can send instructions to a personal computer or other devices.
  • a method and apparatus is provided for user input in the form of alphanumeric data to an internet browser using hand gestures provided by users.
  • typically users can switch devices on or off; as an example, users can write the characters 'a1' in the air to switch on a particular device and 'aa' to switch on all devices in a room. Similarly, users can write 'b1' to switch off a particular device and 'bb' to switch off all the devices in a room.
  • a system having a wireless device that can be removably worn on the fingertips, removably worn in the form of a wrist band, placed on any part of the human body, or can be a hand held device which can be used as a mouse device and the like.
  • the invention can be employed as a 3D mouse, where users can move a hand in the air instead of on a table; it can therefore also replace a joystick, cursor pen and 3D game pad.
  • a system typically having a hand held device which can interact with host devices, said host devices including:
  • Personal Computers (PC)
  • Personal Digital Assistant (PDA)
  • the hand held device in accordance with this invention will be adapted to communicate with the host devices using at least one of the following protocols: USB, RS232, FireWire, or any other custom protocol.
  • the invention typically consists of the following components:
  • the hand held device as seen in Figure 1 is a battery operated device and comprises at least one sensor as represented by block 100 of Figure 1, at least one micro-controller as represented by block 102 of Figure 1, and a transmitter RF-SoC (Radio Frequency System-on-Chip) as represented by block 104 of Figure 1.
  • the HHD in accordance with this invention is adapted to detect left-right and up-down movements by using the sensor's movement in the Y and Z directions.
  • the sensor 100 in accordance with an embodiment of the present invention, can be removably attached to any part of the human body, can be removably placed on the fingertips, can be removably worn in the form of wrist band, can be held in the hand, can be removably worn on the arm or can be removably worn on the leg and the like. According to one aspect of the invention, users typically can write characters and numerals used in any language by just using the sensor 100 attached typically to the hand.
  • the sensor 100 is a micro-electro-mechanical system (MEMS) sensor which is an accelerometer fabricated in a chip. Typically, the MEMS sensor is a 3 axis sensor chip. Every time a movement is detected, sensor 100 is adapted to sense direction of the displacement and value of acceleration and convert them into digital signals for being processed by the micro-controller 102.
  • MEMS micro-electro-mechanical system
  • the micro-controller 102 is adapted to read the digital signal received from the MEMS sensor 100 and further process the signal and send the three dimensional co-ordinates (x,y and z) of the gesture to a wireless transmitter 104.
  • the RF transmitter 104 is adapted to receive the three dimensional co-ordinates from the micro-controller 102 and convert it into radio frequency signals and transmit the RF data through RF wireless transmission.
  • the receiver apparatus is fitted in the host devices and is adapted to receive signals from the transmitter 104 which is embedded in the HHD; it then stores all the data as a series of co-ordinates and sends them to the second micro-controller 108.
  • the receiver apparatus comprises a receiver RF-SoC as represented by block 106 of Figure 2, at least one micro-controller as represented by block 108 of Figure 2, at least one controller as represented by block 110 of Figure 2.
  • the RF receiver 106 is adapted to receive the radio frequency signals. The data received at the receiver 106 then undergoes filtering and is then sent to second micro-controller 108 for further processing.
  • Data Filtering: To get a smoother transition between two consecutive mouse data points, a median filter is applied on the input data. This filtering is applied on each stroke of the input data.
  • Noise Filtering: The noise filter is used to remove any noise present within a continuous value of any gradient. This gives a good gradient look-up which excludes most of the unwanted part of the drawing, including unintentional jerking of the hand while drawing.
  • the second micro-controller 108 receives the RF signals from the receiver 106 and determines the direction and value of displacement experienced by the hand held device. These values are then passed to a gesture recognition engine.
  • the second micro-controller also has a repository 112 which stores the templates for pre-defined gestures which are later used for comparison by the gesture recognition engine.
  • the automatic recognition of 'gesture based character input' is based on at least multi-factorial analysis or a similar approach that makes a decision from multiple features or parameters.
  • 'recognition of alpha numeric character input' is based on the use of a graph theoretical approach and chain code based approach.
  • the gesture recognition engine of the second micro-controller 108 has at least two different agents. Each of them recognizes the input character with some confidence factor lying in the closed interval [0, 1]. The final decision about the recognized character is made by taking the output obtained from the agent with the highest confidence.
  • the main advantage of the scheme is that one agent works efficiently for curved characters and the other for characters with linear segments. A recognition accuracy of more than 92% is obtained. The recognized output signal is then passed to the controller.
  • the controller 110 is a part of the host device and is adapted to receive the recognized output signal from the second micro-controller 108 and send the signal for operating at least one function on one or more host devices.
  • Micro Electro-Mechanical Systems (MEMS) accelerometer 100 is used to detect the movement of the wireless device, which is placed typically on the fingertip. Every time some movement is detected, the direction of the displacement and the value of acceleration are measured and sent to the host devices using wireless technology (ZigBee, Bluetooth, and the like).
  • the receiver 106 is fitted in the host device which receives the signals from the transmitter 104 then stores all the data as a series of co-ordinates (x,y and z) and sends the signals to a gesture recognition engine in the second micro-controller 108.
  • the difference between two consecutive X or Y co-ordinates throughout the whole input data is taken into account to determine if they differ by a large number <THRESHOLD>, which in turn tells the receiver 106 if the mouse is taken up while drawing. If so, the position of the stroke is stored as <STROKE_INDEX[COUNT]>.
  • the data captured using MEMS sensor 100 is then made to undergo some filtering. It is observed that if the x and y coordinates are stored, there is a huge change in two consecutive points whenever there is a new stroke. Therefore, initially the input data is split into several segments.
  • Compute Area: <COMPUTE_AREA(Gesture_Struct_ADDRESS)> is defined to get the total boundary covered by the input pattern, which helps make the system independent of the size of the input data.
  • Construct Chain Code: <Compare_sign(Gesture_Struct_ADDRESS)> and <assign_chain_code(Gesture_Struct_ADDRESS)> are responsible for calculating the gradient value of each stroke segment of the structure and storing them in a look-up table <look_up[stroke_length]>.
  • a generalized concept of line direction is followed, typically as under [only by way of example]:
  • An important feature of the invention is the recognition module.
  • the method of "multi factorial approach" is used to find out the candidate recognized character.
  • two agents are used to give their opinion about the input character.
  • Each of the agents gives some score indicating their confidence about the recognition.
  • the character with highest confidence is taken as the recognized character.
  • the method is as described below:
  • Agent one is typically based on a chain code based approach. In this approach a chain code is assigned to every segment.
  • Determination of presence of loops: If the stroke count is less than or equal to 1 and the starting and ending co-ordinates of the drawing do not differ by more than a <threshold>, the input pattern is considered to have only one loop, which is justified for the character 'O'.
  • Determination of presence of curves: The presence of a curve in any input data helps to distinguish between characters. According to an embodiment of the present invention, if English characters are selected for making gestures, these inputs are used for distinguishing between the alphabets 'B', 'C', 'D', 'G', 'J', 'O', 'P', 'Q', 'R', 'S', 'U' and the alphabets 'A', 'E', 'F', 'H', 'I', 'K', 'L', 'M', 'N', 'T', 'V', 'W', 'X', 'Y', 'Z'.
  • the distance between every pair of consecutive co-ordinates is computed and checked for whether it falls within a small range; in that case the gesture contains one or more curves.
  • Agent two is typically based on an un-directional, un-weighted graph based approach for recognition. In this approach:
  • each sub-block is marked as 0, 1, 2, ..., 8;
  • dissimilarity_matrix(i,j) = 1 means there is some edge in the input data that is absent in the template;
  • dissimilarity_matrix(i,j) = -1 means there is some edge in the template that is absent in the input data;
  • if there is a (1, -1) pair in two adjacent (horizontally or vertically) positions, this means that the input is written in a different manner than the template;
  • the weighted sum of deviation between the input and the target template can be obtained by (1 - ((insertion*WEIGHT_INSERT + modification*WEIGHT_MODIFY + deletion*WEIGHT_DELETE) / div_factor));
  • the flow chart showing the methodology of controlling at least one device by means of predefined gestures is shown in Figure 3.
  • the typical steps followed are given by: (a) creating a repository for storing discrete data corresponding to sets of stored signals representing predefined gestures as shown by block 1000 of Figure 3;
  • the technical advancements of the present invention include:
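The two-agent, highest-confidence decision rule described in the recognition-module bullets above can be sketched as follows. This is an illustrative sketch only: the function name and the example characters and scores are assumptions, not taken from the patent.

```python
def fuse(agent_outputs):
    """Pick the character reported with the highest confidence.

    agent_outputs: list of (character, confidence) pairs, one per agent,
    with each confidence lying in the closed interval [0, 1].
    """
    character, confidence = max(agent_outputs, key=lambda pair: pair[1])
    return character

# Chain-code agent (strong on linear characters) vs. graph agent (strong
# on curved characters) voting on the same input stroke:
print(fuse([("L", 0.95), ("C", 0.40)]))  # -> L
```

Taking the maximum rather than averaging matches the scheme's stated advantage: each agent only needs to be reliable on its own character class.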

Abstract

A system and a method for providing character input remotely using pre-defined gestures typically via RF signaling to enhance usability through easier human machine interface has been disclosed.

Description

CHARACTER BASED INPUT USING PRE-DEFINED HUMAN BODY GESTURES
FIELD OF INVENTION
This invention relates to the field of remote controlling.
In particular, this invention relates to remote controlling using gestures.
BACKGROUND OF THE INVENTION
Electronic devices are inbuilt with different switches and knobs which can be utilized towards effective functioning of volume controls, switching of channels, menu options and the like. With the advance in technology more features are being added to devices, thus making operation of these devices with switches and knobs less user friendly and less feasible to handle. Moreover, for using knobs and switches users have to go towards the device. Therefore, there was a need to develop a control which would enable users to effectively operate devices remotely. Thus remote controls were developed which enabled users to operate devices from a distance.
Today, at office and home alike there are many electronic devices, each having a separate remote control. Since these devices have many features, the remote controls host many buttons which make them difficult to understand and use; the complex setup of the remote controls in turn also results in users not utilizing the devices. Also, users require a lot of skill to understand and use all the functions provided in the remote controls. Nowadays, digital set-top boxes are becoming more popular. These boxes provide Internet browsing, video chat and movie search along with television. To handle these applications, users sometimes need a keyboard and a mouse. However, it is inconvenient to sit with a keyboard and mouse while watching TV. A better device is required to handle the complex interfaces typical of set-top boxes. To take this usability further, there is a latent need to be able to provide text and character based input remotely to enjoy the benefit of convergent applications and services like web browsing, SMS, IM, email and the like.
Therefore, there was a need to replace the buttons on remote controls, keyboards and mouse with a user friendly and convenient system for controlling devices as well as providing text and character based input.
In the prior art, different methods have been disclosed for replacing the conventional methods of user input as mentioned above, these include recognition of visual and human body gestures. The type of gesture taken for input is matched with the approach used for recognition. Different types of gesture include:
• Head gestures,
• Arm gesture based,
• Sign language gesture,
• Eye gaze behavior.
For instance, US Patent 6747632 B2 discloses a wireless control device, having a housing worn on the body of an operator and capable of detecting and interpreting gestures performed by the user. The wireless device is capable of receiving input independent of position of the user or of a part of the user's body. The housing includes a plurality of optical emitters and optical detectors for emitting and detecting motion or hand gestures of users in three dimensions. The invention also comprises a voice recognition sensor. The invention allows gesture recognition to be used not only for cursor control and character-level data entry, but also for word and phrase level data entry. Invention also has the gesture recognition capabilities whereby the devices can be switched on and off as the user desires. However, optical emitters and detectors only work best when there is a straight unobstructed path between the device and the host. Thus, if the line of sight between a camera and an LED is obscured it can interfere with the controlling process. Parameters such as ambient light or infrared radiation can also make the system less effective. Also, the method used by US Patent 6747632 B2 recognizes the characters given in American Sign Language and a modified version of it. But the problem of this approach is that a user needs to memorize a lot of code/symbols for each character.
Therefore, there is a need for:
• a simple and user friendly system for controlling a plurality of devices
• a system where no line-of-sight will be necessary for sending control signals to devices.
• a system which allows efficient character as well as cursor control to a plurality of devices
• a gesture based approach to make a system which can not only interact with a user by accurately recognizing gestures but which can learn new gestures and update its understanding of gestures it already knows in an online interactive manner.
OBJECT OF THE INVENTION
An object of the present invention is to provide a single user friendly system for controlling and operating a plurality of devices.
Another object of the present invention is to provide a convenient system for controlling a plurality of devices by sensing and interpreting gestures performed by any member of the human body.
Yet another object of the present invention is to provide an inexpensive system for controlling a plurality of devices.
Still another object of the present invention is to provide a system which can be used by the physically challenged to conveniently provide character as well as control inputs to a plurality of devices.
Yet another object of the present invention is to provide a system which is easily installable.
Still another object of the present invention is to provide a system which will work efficiently in any room lighting condition.
Yet another object of the present invention is to provide a system that can be easily operated and does not require application of any skill by users.
Still another object of the present invention is to provide a system whose operation will be independent of the line-of-sight with the devices, thus enabling users to operate the system from any angle.
SUMMARY OF THE INVENTION
The invention envisages a system for controlling at least one device by means of pre-defined gestures.
In accordance with an embodiment of this invention, the system typically comprises a first element which is battery operated, adapted to co-operate with the human body and is located remotely from one or more devices, and a second element co-operating with each of said one or more devices.
Typically, the first element includes at least one motion detecting sensor adapted to sense the motion of gestures created by a member of the human body and to convert said sensed motions to a set of digital signals, a micro-controller adapted to receive said set of digital signals and further adapted to convert said set of digital signals into a set of RF signals representing three dimensional co-ordinates, and an RF transmitter adapted to receive said set of RF signals and transmit them.
Typically, the second element comprises an RF receiver adapted to receive said transmitted set of RF signals, a repository adapted to store discrete data corresponding to sets of stored signals representing pre-defined gestures, a second micro-controller adapted to receive said transmitted set of RF signals, said second micro-controller including a comparator for comparing said transmitted set of RF signals with the sets of stored signals and generating an output in response to matching said transmitted set of RF signals with the sets of stored signals, and at least one controller adapted to receive said output and further adapted to operate at least one functionality of said one or more devices. Preferably, the motion detecting sensor is a MEMS (Micro-Electro-Mechanical system) accelerometer fabricated on a chip. Moreover, the sensor is a three axis sensor chip.
In accordance with another embodiment of the present invention, the sensor is adapted, to be attached to any part of the human body, to be removably placed on the fingertips, to be removably worn in the form of wrist band, to be held in the hand, to be removably worn on the arm or to be removably worn on the leg.
In accordance with yet another embodiment of the present invention, specifically the transmitter and the receiver are an RF transmitter and an RF receiver adapted to transmit and receive data through RF wireless transmission. Typically, the transmitter and the receiver are adapted to respond to frequencies in the 900 MHz to 2.4 GHz range. The transmitter and the receiver are fabricated on a chip.
In accordance with still another embodiment of the present invention, the receiver comprises a data filter adapted to smooth the transmission between two said transmitted set of signals. Specifically, the data filter is a median filter. The receiver also comprises a noise filter adapted to remove unwanted noise present in said transmitted set of signals.
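The median data filter described above can be illustrated with a minimal sliding-window sketch. This is not taken from the patent text: the window size, the pure-Python form and the sample values are assumptions, shown only to make the filtering step concrete.

```python
def median_filter(values, window=3):
    """Smooth a stroke's coordinate stream with a sliding median.

    `window` must be odd; at the edges the window is truncated. A median,
    unlike a mean, discards an isolated spike entirely rather than
    spreading it over its neighbours.
    """
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        neighborhood = sorted(values[lo:hi])
        smoothed.append(neighborhood[len(neighborhood) // 2])
    return smoothed

# A spike at index 2 (e.g. an unintentional jerk of the hand) is suppressed:
print(median_filter([10, 11, 50, 12, 13]))  # -> [11, 11, 12, 13, 13]
```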
In accordance with yet another embodiment of the present invention there is provided a method for controlling at least one device by means of pre-defined gestures comprising the following steps:
(a) creating a repository for storing discrete data corresponding to sets of stored signals representing predefined gestures;
(b) linking the repository with said devices that need to be controlled;
(c) securing a sensor element to a part of the human body;
(d) forming a gesture;
(e) sensing the gesture in the sensor element by means of motion of the human body and converting it into a set of digital signals;
(f) converting said set of digital signals into a set of RF signals representing three dimensional co-ordinates;
(g) transmitting said set of RF signals to a receiver;
(h) receiving said transmitted set of RF signals;
(i) comparing said transmitted set of RF signals with the set of stored signals in the repository;
(j) matching the transmitted set of RF signals with the sets of stored signals;
(k) generating an output representing a match with the set of stored signals; and
(l) receiving the output in order to control at least one functionality of one or more said devices.
Typically, the step of comparing is executed by at least chain code based recognition and by un-directional un-weighted graph based recognition.
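The chain code based recognition named in the comparing step can be illustrated with a standard 8-direction chain code. The patent does not specify its direction numbering, so the convention below (0 = east, codes increasing counter-clockwise in 45° steps) and the sample stroke are assumptions.

```python
import math

def chain_code(dx, dy):
    """Map a segment direction to one of 8 chain codes.

    Convention (assumed): 0 = east, 2 = north, 4 = west, 6 = south,
    odd codes for the diagonals in between.
    """
    angle = math.atan2(dy, dx) % (2 * math.pi)
    return int((angle + math.pi / 8) // (math.pi / 4)) % 8

def encode_stroke(points):
    """Assign a chain code to every consecutive pair of stroke points."""
    return [chain_code(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(points, points[1:])]

# An 'L'-shaped stroke: two segments straight down, then two straight right.
print(encode_stroke([(0, 2), (0, 1), (0, 0), (1, 0), (2, 0)]))  # -> [6, 6, 0, 0]
```

Comparing the resulting code sequence against stored templates is what distinguishes, for example, a predominantly linear 'L' from a curved 'C'.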
BRIEF DESCRIPTION OF THE FIGURES
Other aspects of the invention will become apparent by consideration of the accompanying drawings and their description stated below, which is merely illustrative of a preferred embodiment of the invention and does not limit in any way the nature and scope of the invention in which,
Figure 1 is a simplified schematic of the device containing an RF transmitter;
Figure 2 is a structural diagram of a receiver apparatus containing an RF receiver to be used in conjunction with the transmitter of Figure 1; and
Figure 3 is a structural diagram showing the methodology of controlling at least one device by means of pre-defined gestures.
DETAILED DESCRIPTION
In accordance with the preferred embodiment of the present invention, a wireless device for automatic recognition of 'gesture based input' for controlling a plurality of devices has been envisaged.
In accordance with one other embodiment of the invention, the wireless device helps users select different menus by moving their hand in the air. Typically, users can also send alphanumeric characters to the host devices by just writing the characters in the air. Also, volume and television channels can be changed by some specific gesture with the help of the present invention.
In accordance with yet another embodiment of the present invention, users by moving their hand in the air can send instructions to a personal computer or other devices. In another aspect of the invention a method and apparatus is provided for user input in the form of alphanumeric data to an internet browser using hand gestures provided by users. In accordance with still another embodiment of the present invention, typically users can switch devices on or off; as an example, users can write the characters 'a1' in the air to switch on a particular device and 'aa' to switch on all devices in a room. Similarly, users can write 'b1' to switch off a particular device and 'bb' to switch off all the devices in a room.
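The two-character command codes in the example above (rendered here as 'a1', 'aa', 'b1', 'bb') amount to a small dispatch table once a gesture has been recognized. The sketch below is a hypothetical illustration: the device representation and the `dispatch` function are assumptions, not part of the patent.

```python
def dispatch(gesture, devices, device_id=0):
    """Apply the power command encoded by a recognized two-character gesture.

    'a1' switches on one device, 'aa' all devices; 'b1'/'bb' are the
    corresponding switch-offs, following the patent's example codes.
    """
    if gesture == "aa":
        for d in devices:
            d["on"] = True
    elif gesture == "bb":
        for d in devices:
            d["on"] = False
    elif gesture == "a1":
        devices[device_id]["on"] = True
    elif gesture == "b1":
        devices[device_id]["on"] = False
    return devices

room = [{"name": "tv", "on": False}, {"name": "set-top box", "on": False}]
print(dispatch("aa", room))  # both devices switched on
```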
In accordance with yet another embodiment of the present invention there is provided a system having a wireless device that can be removably worn on the fingertips, removably worn in the form of a wrist band, placed on any part of the human body, or can be a hand held device which can be used as a mouse device and the like. In addition, the invention can be employed as a 3D mouse, where users can move a hand in the air instead of on a table; it can therefore also replace a joystick, cursor pen and 3D game pad.
In accordance with still another embodiment of the present invention there is provided a system typically having a hand held device which can interact with host devices, said host devices including:
• Personal Computers (PC),
• Digital Set Top Boxes (DSTB),
• Notebooks (NB),
• Personal digital assistants (PDA),
• Mobile phone and the like.
Specifically, the hand held device in accordance with this invention will be adapted to communicate with the host devices using at least one of the following protocols, and can work as a standard mouse:
• USB
• RS232
• FireWire
• or any other custom protocol.
In a practical embodiment, the invention typically consists of the following components:
1. Hand held device (First element)
2. Receiver apparatus (Second element)
Hand held device: The hand held device (HHD), as seen in Figure 1, is a battery operated device and comprises at least one sensor as represented by block 100 of Figure 1, at least one micro-controller as represented by block 102 of Figure 1, and a transmitter RF-SoC (Radio Frequency System-on-Chip) as represented by block 104 of Figure 1. The HHD in accordance with this invention is adapted to detect left-right and up-down movements using the movement of the sensor in the Y and Z directions.
Sensor: The sensor 100, in accordance with an embodiment of the present invention, can be removably attached to any part of the human body: it can be removably placed on the fingertips, removably worn in the form of a wrist band, held in the hand, removably worn on the arm or removably worn on the leg, and the like. According to one aspect of the invention, users can typically write characters and numerals of any language just by using the sensor 100 attached, typically, to the hand. The sensor 100 is a micro-electro-mechanical system (MEMS) accelerometer fabricated on a chip. Typically, the MEMS sensor is a 3-axis sensor chip. Every time a movement is detected, the sensor 100 senses the direction of the displacement and the value of acceleration and converts them into digital signals to be processed by the micro-controller 102.
Micro-Controller: The micro-controller 102 is adapted to read the digital signal received from the MEMS sensor 100, process it further, and send the three dimensional co-ordinates (x, y and z) of the gesture to the wireless transmitter 104.
RF Transmitter: The RF transmitter 104 is adapted to receive the three dimensional co-ordinates from the micro-controller 102, convert them into radio frequency signals and transmit the RF data through RF wireless transmission.
Receiver Apparatus: The receiver apparatus is fitted in the host devices and is adapted to receive signals from the transmitter 104 embedded in the HHD, store all the data as a series of co-ordinates and send them to the second micro-controller 108. The receiver apparatus comprises a receiver RF-SoC as represented by block 106 of Figure 2, at least one micro-controller as represented by block 108 of Figure 2, and at least one controller as represented by block 110 of Figure 2.
RF Receiver: The RF receiver 106 is adapted to receive the radio frequency signals. The data received at the receiver 106 then undergoes filtering and is sent to the second micro-controller 108 for further processing.
a) Data filtering: To get a smoother transition between two consecutive mouse data points, a median filter is applied to the input data. This filtering is applied to each stroke of the input data.
b) Noise removal: A noise filter is used to remove any noise present within a continuous value of any gradient. This gives a good gradient look-up which excludes most of the unwanted part of the drawing, including unintentional jerking of the hand while drawing.
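The two filtering stages above can be sketched in a few lines of Python. This is only an illustrative reading of the text, not code from the patent; the window size and minimum run length are assumed values.

```python
import statistics

def median_filter(points, window=3):
    """Smooth a stroke's (x, y) points with a sliding median filter."""
    half = window // 2
    smoothed = []
    for i in range(len(points)):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        smoothed.append((statistics.median(xs), statistics.median(ys)))
    return smoothed

def remove_gradient_noise(codes, min_run=2):
    """Drop gradient-code runs shorter than min_run: likely hand jitter."""
    cleaned, i = [], 0
    while i < len(codes):
        j = i
        while j < len(codes) and codes[j] == codes[i]:
            j += 1
        if j - i >= min_run:          # keep only sustained gradients
            cleaned.extend(codes[i:j])
        i = j
    return cleaned
```

An isolated spike such as `(10, 10)` between `(0, 0)` and `(2, 2)` is pulled back toward its neighbours by the median, and a one-sample gradient run is dropped entirely by the noise filter.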
Second Micro-controller: The second micro-controller 108, receives the RF signals from the receiver 106 and determines the direction and value of displacement experienced by the hand held device. These values are then passed to a gesture recognition engine. The second micro-controller also has a repository 112 which stores the templates for pre-defined gestures which are later used for comparison by the gesture recognition engine.
In particular, the automatic recognition of 'gesture based character input' is based on a multi-factorial analysis, or a similar approach that makes a decision from multiple features or parameters, while 'recognition of alphanumeric character input' is based on a graph theoretical approach and a chain code based approach.
The gesture recognition engine of the second micro-controller 108 has at least two different agents. Each of them recognizes the input character with some confidence factor lying in the closed interval [0, 1]. The final decision about the recognized character is made by taking the output of the agent with the highest confidence. The main advantage of the scheme is that one agent works efficiently for curved characters and the other for characters with linear segments. A recognition accuracy of more than 92% is obtained. The recognized output signal is then passed to the controller.
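The confidence-based fusion of the two agents reduces to taking the maximum-confidence opinion. A minimal sketch, where the agent callables are hypothetical stand-ins for the chain-code and graph agents:

```python
def recognize(strokes, agents):
    """Each agent returns (character, confidence in [0, 1]);
    the final decision is the opinion with the highest confidence."""
    best_char, best_conf = None, -1.0
    for agent in agents:
        char, conf = agent(strokes)
        if conf > best_conf:
            best_char, best_conf = char, conf
    return best_char, best_conf
```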
Controller: The controller 110, is a part of the host device and is adapted to receive the recognized output signal from the second micro-controller 108 and send the signal for operating at least one function on one or more host devices.
Typically, the process of recognition of gestures is performed in the following stages:
1. Data Acquisition:
A Micro Electro-Mechanical Systems (MEMS) accelerometer 100 is used to detect the movement of the wireless device, which is placed typically on the fingertip. Every time some movement is detected, the direction of the displacement and the value of acceleration are measured and sent to the host devices using wireless technology (ZigBee, Bluetooth, and the like). The receiver 106 fitted in the host device receives the signals from the transmitter 104, stores all the data as a series of co-ordinates (x, y and z) and sends the signals to a gesture recognition engine in the second micro-controller 108.
2. Identification of the strokes:
The difference between two consecutive X or Y co-ordinates throughout the whole input data is examined to determine whether they differ by a large number <THRESHOLD>, which in turn tells the receiver 106 whether the mouse was lifted while drawing. If so, the position of the stroke is stored as <STROKE_INDEX[COUNT]>. The data captured using the MEMS sensor 100 then undergoes filtering. It is observed that, when the x and y coordinates are stored, there is a huge change between two consecutive points whenever a new stroke begins. Therefore, the input data is initially split into several segments.
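Stroke identification as described — split wherever two consecutive co-ordinates jump by more than a threshold — can be sketched as follows (the threshold value is illustrative; the patent leaves <THRESHOLD> unspecified):

```python
def split_strokes(points, threshold=40):
    """Split an (x, y) point sequence into strokes wherever two
    consecutive samples jump by more than `threshold` (pen-up)."""
    strokes, current = [], [points[0]]
    for prev, cur in zip(points, points[1:]):
        if abs(cur[0] - prev[0]) > threshold or abs(cur[1] - prev[1]) > threshold:
            strokes.append(current)   # large jump: close current stroke
            current = []
        current.append(cur)
    strokes.append(current)
    return strokes
```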
3. Data Filtering: To get a smooth transition between two consecutive mouse data points, a median filter is applied on the input data. This filtering is applied on each stroke.
4. Compute Area: <COMPUTE_AREA(Gesture_Struct_ADDRESS)> is defined to get the total boundary covered by the input pattern, which helps to make the system independent of the size of the input data.
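The area computation supports size normalisation: once the bounding box is known, every input can be rescaled to a fixed extent. A sketch under that assumption (function names are illustrative, not the patent's <COMPUTE_AREA> routine itself):

```python
def compute_area(points):
    """Bounding box of the input pattern, used to normalise for size."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def normalise(points, size=100.0):
    """Rescale points so the pattern spans a fixed size x size box."""
    min_x, min_y, max_x, max_y = compute_area(points)
    w = max(max_x - min_x, 1e-9)   # guard against zero-width input
    h = max(max_y - min_y, 1e-9)
    return [((p[0] - min_x) * size / w, (p[1] - min_y) * size / h)
            for p in points]
```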
5. Construct Chain Code: <Compare_sign(Gesture_Struct_ADDRESS)> and <assign_chain_code(Gesture_Struct_ADDRESS)> are responsible for calculating the gradient value of each stroke segment of the structure and storing the values in a look-up table <look_up[stroke_length]>. In this case, a generalized concept of line direction is followed, typically as under [only by way of example]:
(Line-direction code table shown as figure imgf000015_0001 in the published application.)
Remove noise: A noise filter is used to remove any noise present within a continuous value of any gradient. This gives a good gradient look-up which excludes most of the unwanted part of the drawing, including unintentional jerking of the hand while drawing.
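Since the patent's line-direction table survives only as an image, the sketch below assumes a standard 8-direction (Freeman) chain code as the generalized direction concept; that choice is an assumption, not taken from the patent:

```python
import math

def chain_code(p, q):
    """Quantise the direction from p to q into one of 8 Freeman chain
    codes (0 = east, counting counter-clockwise in 45-degree steps)."""
    angle = math.atan2(q[1] - p[1], q[0] - p[0])   # -pi .. pi
    return int(round(angle / (math.pi / 4))) % 8

def codes_for_stroke(stroke):
    """Chain code for every consecutive point pair in a stroke."""
    return [chain_code(a, b) for a, b in zip(stroke, stroke[1:])]
```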
6. Recognition:
An important feature of the invention is the recognition module. In accordance with a preferred embodiment of the invention, a "multi-factorial approach" is used to find the candidate recognized character. In this method, two agents are used to give their opinions about the input character. Each agent gives a score indicating its confidence in the recognition. The character with the highest confidence is taken as the recognized character. The method is as described below:
Agent one is typically based on a chain code based approach, in which a chain code is assigned to every segment.
Determination of presence of loops: If the stroke count is less than or equal to 1 and the starting and ending co-ordinates of the drawing do not differ by more than a <threshold>, the input pattern is considered to have only one loop, which is the case for the character 'O'.
Determination of presence of curves: The presence of a curve in any input data helps to distinguish between characters. According to an embodiment of the present invention, if English characters are selected for making gestures, this test distinguishes the alphabets 'B', 'C', 'D', 'G', 'J', 'O', 'P', 'Q', 'R', 'S', 'U' from the alphabets 'A', 'E', 'F', 'H', 'I', 'K', 'L', 'M', 'N', 'T', 'V', 'W', 'X', 'Y', 'Z'. Here, the distance between every pair of consecutive co-ordinates is computed and checked to see whether the distances fall within a small range; in that case the gesture contains one or more curves.
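Agent one's two structural tests — loop presence and curve presence — can be sketched as follows. The thresholds and distance band are assumptions, since the patent gives no numeric values:

```python
import math

def has_loop(strokes, threshold=10):
    """One stroke whose start and end nearly coincide => a loop (e.g. 'O')."""
    if len(strokes) > 1:
        return False
    start, end = strokes[0][0], strokes[0][-1]
    return (abs(start[0] - end[0]) <= threshold
            and abs(start[1] - end[1]) <= threshold)

def has_curve(stroke, band=(0.5, 2.0)):
    """Per the text: consecutive point distances falling within a small
    range indicate a smoothly swept (curved) segment."""
    dists = [math.dist(a, b) for a, b in zip(stroke, stroke[1:])]
    if not dists:
        return False
    return all(band[0] <= d <= band[1] for d in dists)
```

Note the curve criterion is deliberately naive, as in the text; it is one feature feeding a confidence score, not a decision on its own.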
Agent two is typically based on an undirected, unweighted graph based approach for recognition. In this approach:
• the minimum and maximum x and y co-ordinates of the input data are identified;
• they are marked as (max_x, max_y) and (min_x, min_y);
• the entire region is split into 3x3 blocks;
• each sub-block is marked as 0, 1, 2, …, 8;
• the starting and ending position of each code is found and marked;
• further, every code is represented in the form of an adjacency matrix of size 9x9;
• let i = min(start, end) and j = max(start, end);
• if a code starts at position i and ends at position j, adjacency(i,j) is marked as 1;
• from the above, a template graph, template_graph, can be constructed for each character;
• for some characters more than one template can be constructed; further, the dissimilarity_matrix is constructed as graph(i,j) − template_graph(i,j);
• dissimilarity_matrix(i,j) = 1 means there is some edge in the input data that is absent in the template;
• dissimilarity_matrix(i,j) = -1 means there is some edge in the template that is absent in the input data;
• a (1, -1) pair in two adjacent (horizontally or vertically) positions means that the input is written in a different manner than the template;
• count the number of occurrences of insertion, deletion and modification;
• the sum of matching, insertion and modification gives the number of codes present in the input;
• the sum of matching and deletion gives the number of codes in the template;
• the weighted deviation of the input from the target template is obtained as (1 − ((insertion*WEIGHT_INSERT + modification*WEIGHT_MODIFY + deletion*WEIGHT_DELETE) / div_factor));
• div_factor is defined as (WEIGHT_INSERT + WEIGHT_MODIFY + WEIGHT_DELETE)*code_cnt, where code_cnt is the number of codes in the input data; and
• finally, the character that matches best is found and a matching score lying between 0 and 1 is obtained.
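The graph-matching steps above can be sketched end to end: build the 9x9 adjacency matrix from (start block, end block) pairs, diff it against a template, and convert the insertion/deletion/modification counts into a score in [0, 1]. The weights are illustrative, and the adjacent-(1, -1) modification test is omitted for brevity:

```python
def to_adjacency(code_endpoints):
    """code_endpoints: list of (start_block, end_block) pairs, each block
    in 0..8 from the 3x3 grid. Returns a 9x9 undirected adjacency matrix."""
    adj = [[0] * 9 for _ in range(9)]
    for start, end in code_endpoints:
        i, j = min(start, end), max(start, end)
        adj[i][j] = 1
    return adj

def match_score(inp, template, w_insert=1.0, w_modify=1.0, w_delete=1.0):
    """Score in [0, 1]: 1 minus the weighted edit deviation between the
    input graph and the template graph."""
    insertion = deletion = matching = 0
    for i in range(9):
        for j in range(9):
            d = inp[i][j] - template[i][j]
            if d == 1:
                insertion += 1        # edge in input, absent in template
            elif d == -1:
                deletion += 1         # edge in template, absent in input
            elif inp[i][j] == 1:
                matching += 1
    modification = 0                  # adjacent (1, -1) pairs; omitted here
    code_cnt = matching + insertion + modification
    div_factor = (w_insert + w_modify + w_delete) * max(code_cnt, 1)
    return 1 - (insertion * w_insert + modification * w_modify
                + deletion * w_delete) / div_factor
```

An input matched against its own graph scores exactly 1; every missing or extra edge pulls the score down toward 0.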
(See figure imgf000018_0001 of the published application.)
In accordance with an embodiment of the present invention, the flow chart showing the methodology of controlling at least one device by means of predefined gestures is shown in Figure 3. The typical steps followed are:
(a) creating a repository for storing discrete data corresponding to sets of stored signals representing predefined gestures as shown by block 1000 of Figure 3;
(b) linking the repository with said devices that need to be controlled as shown by block 1002 of Figure 3;
(c) securing a sensor element to a part of the human body as shown by block 1004 of Figure 3;
(d) forming a gesture as shown by block 1006 of Figure 3;
(e) sensing the gesture in the sensor element by means of the motion of the human body and converting it into a set of digital signals as shown by block 1008 of Figure 3;
(f) converting said set of digital signals into a set of RF signals representing three dimensional co-ordinates as shown by block 1010 of Figure 3;
(g) transmitting said set of converted signals to a receiver as shown by block 1012 of Figure 3;
(h) receiving said transmitted set of converted signals as shown by block 1014 of Figure 3;
(i) comparing said transmitted set of converted signals with the set of stored signals in the repository as shown by block 1016 of Figure 3;
(j) matching the transmitted set of signals as shown by block 1018 of Figure 3 and generating an output representing a match with the set of stored signals;
(k) generating no output signal in case no match is found, as shown by block 1022 of Figure 3; and
(l) receiving the output in order to control at least one functionality of one or more said devices as shown by block 1020 of Figure 3.
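The tail of the flow chart (blocks 1016 to 1022) — compare against the repository, emit an output on a match, emit nothing otherwise — reduces to a lookup on the receiver side. The repository contents and controller callback below are hypothetical illustrations, not values from the patent:

```python
def handle_gesture(recognised_char, repository, controller):
    """Blocks 1016-1022 of Figure 3: compare the recognised character
    against the repository; on a match, send the output to the controller;
    on no match, generate no output signal."""
    action = repository.get(recognised_char)
    if action is None:
        return None            # no match: no output signal (block 1022)
    controller(action)         # operate device functionality (block 1020)
    return action
```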
The technical advancements of the present invention include:
• Providing a single user-friendly system which can be used to control and operate a plurality of devices.
• Providing a convenient system for controlling a plurality of devices by sensing and interpreting gestures performed by any member of the human body.
• Providing an inexpensive system which is easily installable.
• Providing a system whose operation is independent of the line-of-sight with the devices, thus enabling users to operate the system from any angle.
• Providing a system which will work efficiently in any room lighting condition.
• Providing a system that can be easily operated and does not require application of any skills by users.
• Providing a system that does not require users to memorize a lot of code/symbols for each character input.
• Providing a system which can be used by the physically challenged to conveniently provide character as well as control inputs to a plurality of devices.
While considerable emphasis has been placed herein on the particular features of this invention, it will be appreciated that various modifications can be made, and that many changes can be made in the preferred embodiments without departing from the principles of the invention. These and other modifications in the nature of the invention or the preferred embodiments will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the invention and not as a limitation.

Claims

1. A system for controlling at least one device by means of pre-defined gestures, the said system comprising:
a) a first element adapted to work in conjunction with the human body and located remotely from one or more devices, said first element comprising:
i) at least one motion detecting sensor adapted to sense the motion of gestures created by a member of the human body and to convert said sensed motions to a set of digital signals;
ii) a micro-controller adapted to receive said set of digital signals and further adapted to convert said set of digital signals into a set of RF signals representing three dimensional co-ordinates; and
iii) a transmitter adapted to receive said set of RF signals and transmit them;
b) a second element co-operating with each of said one or more devices, the said second element comprising:
i) a receiver adapted to receive said transmitted set of RF signals;
ii) a repository adapted to store discrete data corresponding to sets of stored signals representing pre-defined gestures;
iii) a second micro-controller adapted to receive said transmitted set of converted signals and convert them to a received set of converted digital signals, said second micro-controller including a comparator for comparing said received set of converted digital signals with the sets of stored signals and generating an output in response to matching said received set of converted digital signals with a set of stored signals; and
iv) at least one controller adapted to receive said output and further adapted to operate at least one functionality of said one or more devices.
2. A system as claimed in claim 1 adapted to provide alphanumeric, symbol and special characters based input remotely using pre-defined human body gestures thereby providing a human machine interface.
3. A system as claimed in claim 1 wherein said gestures are adapted to provide alphanumeric, symbol and special characters based input remotely to said one or more devices.
4. A system as claimed in claim 1 wherein the motion detecting sensor is a MEMS (Micro-Electro-Mechanical system) accelerometer.
5. A system as claimed in claim 1 wherein the motion detecting sensor is fabricated on a chip.
6. A system as claimed in claim 1 wherein the motion detecting sensor is a three axis sensor chip.
7. A system as claimed in claim 1 wherein the motion detecting sensor is adapted to be attached to any part of the human body.
8. A system as claimed in claim 1 wherein the motion detecting sensor is adapted to be removably placed on the fingertips.
9. A system as claimed in claim 1 wherein the motion detecting sensor is adapted to be removably worn in the form of wrist band.
10. A system as claimed in claim 1 wherein the motion detecting sensor is adapted to be held in the hand.
11. A system as claimed in claim 1 wherein the motion detecting sensor is adapted to be removably worn on the arm.
12. A system as claimed in claim 1 wherein the motion detecting sensor is adapted to be removably worn on the leg.
13. A system as claimed in claim 1 wherein the transmitter and the receiver are an RF transmitter and an RF receiver.
14. A system as claimed in claim 1 wherein the transmitter and the receiver are adapted to transmit and receive data through RF wireless transmission.
15. A system as claimed in claim 1 wherein the transmitter and the receiver are adapted to respond to frequencies in the range of 900 MHz to 2.4 GHz.
16. A system as claimed in claim 1 wherein the transmitter and the receiver are fabricated on a chip.
17. A system as claimed in claim 1 wherein the receiver comprises a data filter adapted to smoothen the transmission between two said transmitted set of signals.
18. A system as claimed in claim 1 wherein the receiver comprises a median data filter adapted to smoothen the transmission between two said transmitted set of signals.
19. A system as claimed in claim 1 wherein the receiver comprises a noise filter adapted to remove unwanted noise present in said transmitted set of signals.
20. A system as claimed in claim 1 wherein the first element is operated by a battery.
21. A method for controlling at least one device by means of pre-defined gestures comprising the following steps:
(a) creating a repository for storing discrete data corresponding to sets of stored signals representing predefined gestures;
(b) linking the repository with said devices that need to be controlled;
(c) securing a sensor element to a part of the human body;
(d) forming a gesture;
(e) sensing the gesture in the sensor element by means of the motion of the human body and converting it into a set of digital signals;
(f) converting said set of digital signal into set of RF signals representing three dimensional co-ordinates;
(g) transmitting said set of converted signals to a receiver;
(h) receiving said transmitted set of converted signals;
(i) comparing said transmitted set of converted signals with the set of stored signals in the repository;
(j) matching the transmitted set of signals;
(k) generating an output representing a match with the set of stored signals; and
(l) receiving the output in order to control at least one functionality of one or more said devices.
22. A method as claimed in claim 21 wherein the step of comparing is executed by chain code based recognition.
23. A method as claimed in claim 21 wherein the step of comparing is executed by un-directional un-weighted graph based recognition.
PCT/IN2009/000123 2008-02-27 2009-02-24 Character based input using pre-defined human body gestures WO2009116079A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN400MU2008 2008-02-27
IN400/MUM/2008 2008-02-27

Publications (2)

Publication Number Publication Date
WO2009116079A2 true WO2009116079A2 (en) 2009-09-24
WO2009116079A3 WO2009116079A3 (en) 2011-03-31

Family

ID=41091339

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2009/000123 WO2009116079A2 (en) 2008-02-27 2009-02-24 Character based input using pre-defined human body gestures

Country Status (1)

Country Link
WO (1) WO2009116079A2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012094143A1 (en) * 2011-01-05 2012-07-12 Qualcomm Incorporated Method and apparatus for scaling gesture recognition to physical dimensions of a user
US8929609B2 (en) 2011-01-05 2015-01-06 Qualcomm Incorporated Method and apparatus for scaling gesture recognition to physical dimensions of a user
RU2613038C2 (en) * 2012-08-09 2017-03-14 Tencent Technology (Shenzhen) Company Limited Method for controlling terminal device with use of gesture, and device
US9214043B2 (en) 2013-03-04 2015-12-15 Here Global B.V. Gesture based map annotation
CN109213333A (en) * 2017-07-07 2019-01-15 Lenovo (Singapore) Pte. Ltd. Apparatus and method for converting speech into text and inserting characters using gestures

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4414537A (en) * 1981-09-15 1983-11-08 Bell Telephone Laboratories, Incorporated Digital data entry glove interface device
EP1276038A1 (en) * 2001-07-10 2003-01-15 Hung-Lien Shen Data input method and device for a computer system
DE102005021527A1 (en) * 2005-05-10 2006-11-23 Siemens Ag Characters inputting device for e.g. mobile radio telephone, has input device that is designed as finger or decoration ring and including acceleration sensor to detect movement of input device and to output corresponding acceleration data

Also Published As

Publication number Publication date
WO2009116079A3 (en) 2011-03-31

Similar Documents

Publication Publication Date Title
US10948992B1 (en) Ring human-machine interface
EP3265893B1 (en) Arbitrary surface and finger position keyboard
JP5802667B2 (en) Gesture input device and gesture input method
KR100720335B1 (en) Apparatus for inputting a text corresponding to relative coordinates values generated by movement of a touch position and method thereof
KR100630806B1 (en) Command input method using motion recognition device
KR20190135974A (en) Method and apparatus for optimal control based on motion-voice multi-modal command
US20100238137A1 (en) Multi-telepointer, virtual object display device, and virtual object control method
US20120068925A1 (en) System and method for gesture based control
WO2006068357A1 (en) System for wearable general-purpose 3-dimensional input
CN101901051A (en) Data entry device and device based on the input object of distinguishing
KR20090027048A (en) Apparatus and method for recognizing moving signal
KR101452343B1 (en) Wearable device
US20120005615A1 (en) Method for executing an input by means of a virtual keyboard displayed on a screen
TW201403391A (en) Remote interaction system and control thereof
CN105278699A (en) Easy-wearable gesture identification device
WO2009116079A2 (en) Character based input using pre-defined human body gestures
KR101233793B1 (en) Virtual mouse driving method using hand motion recognition
US10955935B2 (en) Tap device with multi-tap feature for expanded character set
Wang et al. A gesture-based method for natural interaction in smart spaces
US20230236673A1 (en) Non-standard keyboard input system
KR101211808B1 (en) Gesture cognitive device and method for recognizing gesture thereof
CN105242795A (en) Method for inputting English letters by azimuth gesture
KR101451943B1 (en) Method and set-top box for controlling screen
US11009968B1 (en) Bi-directional tap communication device
KR20150118377A (en) Information inputting system and method by movements of finger

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application; Ref document number: 09721393; Country of ref document: EP; Kind code of ref document: A2
NENP Non-entry into the national phase; Ref country code: DE
122 Ep: pct application non-entry in european phase; Ref document number: 09721393; Country of ref document: EP; Kind code of ref document: A2