WO2023161958A1 - Finger-worn device operable as mouse/keyboard and method for realizing same - Google Patents

Finger-worn device operable as mouse/keyboard and method for realizing same

Info

Publication number
WO2023161958A1
WO2023161958A1 PCT/IN2023/050166
Authority
WO
WIPO (PCT)
Prior art keywords
finger
linear acceleration
keyboard
worn device
mouse
Prior art date
Application number
PCT/IN2023/050166
Other languages
French (fr)
Inventor
Renju Parappillil Baby
Dhinesh RAMASAMY
Kaushik MITRA
Original Assignee
INDIAN INSTITUTE OF TECHNOLOGY MADRAS (IIT Madras)
Priority date
Filing date
Publication date
Application filed by INDIAN INSTITUTE OF TECHNOLOGY MADRAS (IIT Madras) filed Critical INDIAN INSTITUTE OF TECHNOLOGY MADRAS (IIT Madras)
Publication of WO2023161958A1 publication Critical patent/WO2023161958A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/22Character recognition characterised by the type of writing
    • G06V30/228Character recognition characterised by the type of writing of three-dimensional handwriting, e.g. writing in the air
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033Indexing scheme relating to G06F3/033
    • G06F2203/0331Finger worn pointing device

Definitions

  • the present invention relates to the field of wearable devices and, in particular, to a finger-worn device controlled by hand gestures that is capable of being operated as a mouse and a keyboard for one or more user devices.
  • head-mounted displays (HMDs)
  • These devices not only perform many basic computing functions, akin to laptops and smartphones, but may also perform unique health-tracking services (such as calorie tracking and sleep monitoring) as a result of being in contact with the user’s body.
  • wearable devices have also been used to operate as either a mouse or a keyboard for a plurality of user devices.
  • These wearable devices generally include an embedded sensor system that may track the motion of the body part on which the wearable device is worn in order to move the mouse pointer.
  • the wearable devices conventionally rely on a tap-kind input or present a virtual keyboard allowing a user to provide input, thereby allowing the user to wirelessly access a user device such as a laptop, computer, tablet, etc.
  • the existing known technology provides finger-worn devices where the motion of the finger may be tracked to move a mouse pointer.
  • such devices are generally positioned at the proximal phalanx of the finger and hence the accuracy of motion detection is limited.
  • the finger-worn devices that may be used as both a mouse and a keyboard generally perform gesture recognition based on a combination of a tap-kind input for the keyboard and an air gesture for the mouse, thereby requiring different processing techniques for gesture recognition.
  • the present disclosure overcomes one or more shortcomings of the prior art and provides additional advantages discussed throughout the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
  • a finger-worn device configured to pair with a user device and function as a mouse/keyboard to said user device.
  • the finger-worn device comprises a sensor housing configured to be placed at distal phalanx of a finger of a user.
  • Said sensor housing comprises at least one motion sensor, placed within the sensor housing such that the at least one motion sensor rests at the tip of the finger pointing towards the user device, and configured to capture the orientation in Euler angles and the linear acceleration along all three axes, produced by the finger movement.
  • said sensor housing may include one or more visual indicators configured to provide a pairing indication of the finger-worn device as the mouse/keyboard with the user device.
  • the finger-worn device further comprises a battery housing operatively coupled to the sensor housing.
  • Said battery housing comprises at least one of a battery, a controller, a Bluetooth unit, a port and one or more input keys.
  • when the finger-worn device is functioning as the keyboard, the controller is configured to detect a plurality of patterns drawn by the user in the air using the at least one motion sensor, and to transform the plurality of detected patterns into valid keyboard inputs by employing a linear-acceleration-based peak sequencing technique.
  • when the finger-worn device is functioning as the mouse, the controller is configured to track an angular movement of the finger, based on a change in orientation of the finger, using the at least one motion sensor, and to transform the angular movement of the finger into a mouse pointer movement.
  • the controller is configured to switch operation between the mouse and the keyboard, when one of the one or more input keys is pressed by the user for a pre-defined duration.
  • the at least one motion sensor is first stabilized by pointing the distal phalanx of the finger in such a manner that the at least one motion sensor is parallel to a ground plane for a predetermined time duration.
  • the plurality of patterns drawn by the user comprises a combination of one or more gesture letters drawn by the user in a two-dimensional plane, and wherein the one or more gesture letters comprises a plurality of line gesture letters and a plurality of curve gesture letters.
  • the controller in combination with one or more filters is configured to recognize linear acceleration values for each detected pattern of the plurality of patterns in the two-dimensional plane.
  • the controller is further configured to plot a linear acceleration curve for each detected pattern, depicting a variation of the linear acceleration values for each detected pattern with time.
  • the one or more filters are configured to filter each of said linear acceleration curve by removing one or more acceleration values below a threshold to obtain a filtered linear acceleration curve for each of said detected pattern.
  • the one or more filters are further configured to remove noise from each filtered linear acceleration curve to obtain noiseless linear acceleration curve depicting sharp peaks for each detected pattern.
  • the controller is further configured to detect one or more peaks from each noiseless linear acceleration curve.
  • the controller is further configured to sequence the one or more peaks in order to map a peak sequence for each detected pattern with a corresponding segment pattern in order to obtain a valid keyboard input.
  • the at least one sensor is first stabilized for a pre-determined time duration in order to detect a pivot point along with an angle subtended by the pivot point on each axis of the two-dimensional plane.
  • the controller is configured to measure a translation of the finger from the pivot point along each axis of the two-dimensional plane by calculating a change in the angle subtended by the pivot point at each axis of the two-dimensional plane and transform the translation into the mouse pointer movement.
  • a method for allowing a finger-worn device to function as a mouse/keyboard to a user device comprises allowing a sensor housing of the finger-worn device to be placed at the distal phalanx of a finger of a user such that at least one motion sensor rests at the tip of the finger pointing towards the user device.
  • the method further comprises pairing the finger-worn device with the user device.
  • the method further comprises providing one or more visual indicators indicating pairing between the finger-worn device, as the mouse/keyboard, and the user device.
  • the method further comprises allowing the at least one motion sensor to capture the orientation in Euler angles and the linear acceleration along all three axes, produced by the finger movement of the user.
  • the method, when functioning as the keyboard, comprises detecting, by a controller, a plurality of patterns drawn by the user in the air using the at least one motion sensor, and transforming, by the controller, the plurality of detected patterns into valid keyboard inputs by employing a linear-acceleration-based peak sequencing technique.
  • the method when functioning as the mouse, comprises tracking, by the controller, an angular movement of the finger based on a change in orientation of the finger using the at least one motion sensor and transforming, by the controller, the angular movement of the finger into a mouse pointer movement.
  • the process of transforming the plurality of patterns into valid keyboard inputs by employing a linear-acceleration-based peak sequencing technique comprises recognizing linear acceleration values for each detected pattern of the plurality of patterns in a two-dimensional plane.
  • the process further comprises plotting, a linear acceleration curve for each detected pattern depicting a variation of the linear acceleration values for each detected pattern with time.
  • the process further comprises filtering each of said linear acceleration curve by removing one or more acceleration values below a threshold to obtain a filtered linear acceleration curve for each of said detected pattern.
  • the process further comprises removing noise from each filtered linear acceleration curve to obtain noiseless linear acceleration curve depicting sharp peaks for each detected pattern.
  • the process further comprises detecting one or more peaks from each noiseless linear acceleration curve.
  • the process further comprises sequencing the one or more peaks in order to map a peak sequence for each detected pattern with a corresponding segment pattern in order to obtain a valid keyboard input.
  • process of transforming the angular movement of the finger into a mouse pointer movement comprises measuring a translation of the finger from a pivot point along each axis of the two-dimensional plane by calculating a change in the angle subtended by the pivot point at each axis of the two- dimensional plane and transforming the translation into the mouse pointer movement.
  • Figure 1A depicts an environment 100 for implementing a finger-worn device 102 in accordance with an embodiment of the present disclosure
  • Figure 1B depicts a side view of the finger-worn device 102, in accordance with an embodiment of the present disclosure
  • Figure 1C depicts a top view of the finger-worn device 102, in accordance with an embodiment of the present disclosure
  • Figure 2 depicts a structural arrangement of the finger-worn device 102, by way of a step diagram, in accordance with an embodiment of the present disclosure
  • Figure 3 illustrates the keyboard cycle 300 to initialize and operate the finger-worn device 102 as a keyboard, in accordance with an embodiment of the present disclosure
  • Figure 4 illustrates one or more gesture letters used by the finger-worn device to generate one or more keyboard functionalities, in accordance with an embodiment of the present disclosure
  • Figure 5 illustrates exemplary gesture segmentation used by the finger-worn device to generate one or more keyboard functionalities, in accordance with an embodiment of the present disclosure
  • Figure 6 illustrates a gesture recognition process 600 used by the finger-worn device to recognize patterns drawn in the air by a user to execute one or more keyboard functionalities, in accordance with an embodiment of the present disclosure
  • Figure 7 illustrates exemplary linear acceleration curves 700a-d for peak detection and sequencing, in accordance with an embodiment of the present disclosure
  • Figure 8 illustrates exemplary filtered linear acceleration curves 800a-d based on the linear acceleration curves 700a-d for peak detection and sequencing, in accordance with an embodiment of the present disclosure
  • Figure 9 illustrates noiseless linear acceleration curves 900a-d based on the filtered linear acceleration curves 800a-d for peak detection and sequencing, in accordance with an embodiment of the present disclosure
  • Figure 10 illustrates the mouse cycle 1000 to initialize and operate the finger-worn device 102 as a mouse, in accordance with an embodiment of the present disclosure
  • Figure 11 illustrates angular movement of the finger-worn device 102 with respect to a pivot point used by the finger-worn device to execute one or more mouse functionalities, in accordance with an embodiment of the present disclosure
  • Figure 12 depicts a method 1200, by way of a flow diagram, for allowing the finger-worn device 102 to function as a mouse/keyboard to the user device 108, in accordance with an embodiment of the present disclosure
  • Figure 12A depicts a method 1200A, by way of a flow diagram, for transforming the plurality of the detected patterns into valid keyboard inputs, in accordance with an embodiment of the present disclosure
  • Figure 12B depicts a method 1200B, by way of a flow diagram, for transforming the angular movement of the finger into a mouse pointer movement.
  • the present disclosure provides a finger-worn device that is paired with a user device in order to operate as a mouse/keyboard to the user device.
  • finger-worn devices comprise an embedded sensor system that may track the motion of the finger in order to move the mouse pointer.
  • the finger-worn devices generally rely on a tap-kind input or present a virtual keyboard allowing a user to provide input, thereby allowing the user to wirelessly access a user device such as a laptop, computer, tablet, etc.
  • such finger-worn devices are generally positioned at the proximal phalanx of the finger and hence the accuracy of motion detection is limited.
  • the finger-worn devices that may be used as both a mouse and a keyboard generally perform gesture recognition based on a combination of a tap-kind input for the keyboard and an air gesture for the mouse, thereby requiring different processing techniques to process the inputs into valid gestures and also hampering user experience.
  • the present disclosure addresses this need and provides a finger-worn device with motion sensors placed at the tip of the finger, thereby ensuring higher accuracy in motion detection. Further, the present disclosure provides the ease of using air gestures for moving the mouse pointer or inputting words, sentences or other keyboard shortcuts, thereby improving user experience. The present disclosure also provides an efficient technique to recognize the air gestures and transform them into valid inputs. The detailed description of the finger-worn device is provided in the subsequent paragraphs.
  • Figure 1A depicts an environment 100 to implement the finger-worn device 102 in accordance with an embodiment of the present disclosure.
  • the environment 100 depicts the finger-worn device 102 being worn on an index finger of a user and configured to operate as a mouse/keyboard for a user device 108.
  • the choice of finger to wear the finger-worn device 102 in Figure 1A is merely exemplary and should not be construed as limiting.
  • Figures 1B and 1C depict the finger-worn device 102 from different orthogonal views. For instance, Figure 1B depicts the finger-worn device 102 from a side view while Figure 1C depicts it from a top view.
  • the finger-worn device 102 comprises a sensor housing 104 and a battery housing 106.
  • the sensor housing 104 is placed at distal phalanx of a finger of the user while the battery housing 106 is placed at proximal phalanx of the finger of the user and the sensor housing 104 and the battery housing 106 are operatively coupled to each other by means of a wire.
  • the battery housing 106 may be placed at finger locations other than the proximal phalanx and may be coupled to the sensor housing 104 by other wired or wireless means.
  • the sensor housing 104 of the finger-worn device 102 as depicted in Figure 2 comprises at least one motion sensor 202 and one or more visual indicators 204.
  • the at least one motion sensor 202 rests at the tip of the finger pointing towards the user device 108 and is configured to capture the orientation of the finger in Euler angles and the linear acceleration along all three axes, produced by the finger movement.
  • the one or more visual indicators 204 comprise light emitting diodes (LEDs) and are configured to provide a pairing indication of the finger-worn device 102 as the mouse/keyboard with the user device 108.
  • the battery housing 106 of the finger-worn device 102 as depicted in Figure 2 comprises a battery 206, a Bluetooth unit 208, a controller 210 and a port 212.
  • the finger-worn device 102 is paired with the user device 108 by using Bluetooth technology facilitated by the Bluetooth unit 208.
  • the port 212 may be a USB charging port configured to charge the finger-worn device 102 by wired means.
  • the finger-worn device 102 may be charged by other wireless means.
  • the finger-worn device 102 as illustrated in Figure 2 works both as a mouse and a keyboard. In one embodiment, when the finger-worn device 102 is initialized, it works as a keyboard. The operation of the finger-worn device 102 may then be switched to the mouse on pressing one of the one or more input keys 110, 112 for a predefined duration or a predefined number of times.
  • the one or more input keys 110, 112 are illustrated in the orthogonal views of the finger-worn device 102 in Figures 1B and 1C. For instance, in one embodiment, the operation of the finger-worn device 102 may be switched between keyboard and mouse on pressing one of the one or more input keys 110, 112 twice.
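The key-based mode switching described here can be sketched as a small state machine. The class, names, and default press count below are illustrative assumptions, not taken from the disclosure:

```python
from enum import Enum

class Mode(Enum):
    KEYBOARD = "keyboard"
    MOUSE = "mouse"

class ModeSwitcher:
    """Toggles between keyboard and mouse operation when an input key
    is pressed a predefined number of times (here: twice)."""

    def __init__(self, presses_to_switch: int = 2):
        self.mode = Mode.KEYBOARD       # device initializes as a keyboard
        self.presses_to_switch = presses_to_switch
        self._press_count = 0

    def on_key_press(self) -> Mode:
        self._press_count += 1
        if self._press_count >= self.presses_to_switch:
            self._press_count = 0
            # toggle between the two operating modes
            self.mode = Mode.MOUSE if self.mode is Mode.KEYBOARD else Mode.KEYBOARD
        return self.mode

switcher = ModeSwitcher()
switcher.on_key_press()                 # first press: still keyboard
assert switcher.mode is Mode.KEYBOARD
switcher.on_key_press()                 # second press: switch to mouse
assert switcher.mode is Mode.MOUSE
```

The same structure would accommodate the hold-for-a-duration variant by counting elapsed press time instead of press events.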
  • the operation of the finger-worn device 102 as a keyboard and a mouse are elaborated in upcoming paragraphs.
  • FIG. 3 illustrates the operational cycle 300 of a finger-worn device 102 when working as the keyboard, in accordance with an embodiment of the present disclosure.
  • the at least one motion sensor 202 needs to be stabilized as illustrated at step 302.
  • the distal phalanx of the finger is pointed in such a manner that the at least one motion sensor 202 is parallel to a ground plane for a pre-determined time duration.
  • the pre-determined time duration for stabilizing the at least one motion sensor 202 may be 200ms.
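The stabilization check (holding the sensor roughly parallel to the ground plane for the pre-determined duration) might be implemented along these lines; the sample rate, angular tolerance, and function name are assumptions for illustration:

```python
def is_stabilized(samples, tolerance_deg=5.0, window_ms=200, sample_period_ms=10):
    """Return True when the sensor has stayed (roughly) parallel to
    the ground plane for the whole stabilization window.

    `samples` is a list of (pitch, roll) Euler angles in degrees,
    newest last; a sensor held flat reads pitch ~= roll ~= 0.
    """
    needed = window_ms // sample_period_ms
    if len(samples) < needed:
        return False                       # not enough history yet
    recent = samples[-needed:]
    return all(abs(p) <= tolerance_deg and abs(r) <= tolerance_deg
               for p, r in recent)

flat = [(1.0, -0.5)] * 20                  # 20 samples x 10 ms = 200 ms flat
tilted = [(1.0, -0.5)] * 19 + [(30.0, 0.0)]
assert is_stabilized(flat) is True
assert is_stabilized(tilted) is False
```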
  • the stabilization indication is provided by the one or more visual indicators 204.
  • the finger-worn device 102 is ready to be used as a keyboard.
  • the user may draw patterns in the air that will be recognized as valid keyboard inputs by the controller 210 by employing a linear-acceleration-based peak sequencing technique.
  • the patterns that may be drawn by the user in the air comprise a combination of one or more gesture letters drawn in a two-dimensional plane such as the y-z plane.
  • the one or more gesture letters comprises a plurality of line gesture letters and a plurality of curve gesture letters.
  • the one or more gesture letters employed in the present disclosure are depicted in Figure 4.
  • the plurality of line gesture letters comprises +y, -y, +z and -z, and the plurality of curve gesture letters comprises +cr, -cr, +cl, -cl, +o and -o, representing half circles, full circles, or curves.
  • the positive and negative signs depict a direction of the gesture letters.
  • the anti-clockwise direction of the curve gesture letters may be represented by a negative sign and the clockwise direction by a positive sign.
  • the one or more gesture letters are considered herein to be the building blocks for various patterns that are further transformed into valid keyboard inputs. For instance, a pattern “A” drawn by the user would comprise three line gesture letters and may map to a keyboard input of “Control+Z”.
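A minimal sketch of how recognized gesture-letter sequences could map to keyboard inputs. Only the “A” → Control+Z example comes from the text; the letter decomposition of “A” and the second table entry are invented for illustration:

```python
# Building-block gesture letters (direction-signed lines and curves).
LINE_LETTERS = {"+y", "-y", "+z", "-z"}
CURVE_LETTERS = {"+cr", "-cr", "+cl", "-cl", "+o", "-o"}

# Hypothetical pattern table: a tuple of gesture letters -> keyboard input.
PATTERN_MAP = {
    ("+z", "-z", "+y"): "Control+Z",   # pattern "A": three line gesture letters
    ("+y",): "Right Arrow",            # assumed mapping for a rightward swipe
}

def decode(letters):
    """Map a sequence of recognized gesture letters to a keyboard input,
    or None when the sequence matches no known pattern."""
    return PATTERN_MAP.get(tuple(letters))

assert decode(["+z", "-z", "+y"]) == "Control+Z"
assert decode(["+o"]) is None          # unrecognized pattern yields no input
```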
  • the linear acceleration values are continuously recognized from the at least one motion sensor 202 as illustrated in step 304.
  • the gesture recognition process 600 based on linear acceleration and peak sequencing technique is illustrated in Figure 6.
  • the gesture recognition process is implemented by the controller 210 to transform the patterns into valid keyboard inputs.
  • linear acceleration curves are plotted by the controller 210 for each pattern depicting the variation of linear acceleration in y and z-directions with time as illustrated in step 602.
  • the linear acceleration curves 700a-d for the patterns “two”, “swipe right”, “square” and “zero” have been illustrated in Figure 7.
  • the plotted linear acceleration curves are filtered by one or more filters to eliminate one or more acceleration values below a threshold, as illustrated in step 604, in order to obtain filtered linear acceleration curves.
  • the filtered linear acceleration curves 800a-d for the patterns “two”, “swipe right”, “square” and “zero” have been illustrated in Figure 8.
  • the filtered linear acceleration curves are subjected to a noise-removal process in order to obtain sharp (or prominent) peaks as illustrated in step 606.
  • the noiseless linear acceleration curves 900a-d for the patterns “two”, “swipe right”, “square” and “zero” have been illustrated in Figure 9.
  • one or more low pass, band pass or high pass filters may be used to carry out the process of noise-removal and the same is not explained for the sake of brevity.
  • Peaks corresponding to each pattern are then detected by the controller 210 from the corresponding noiseless linear acceleration curves as illustrated in step 608.
  • the detected peaks are then sequenced by the controller 210 in order to obtain a peak sequence for mapping the detected pattern with a valid keyboard input as illustrated at step 610.
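Steps 602-610 can be sketched end to end as follows; the threshold value, moving-average smoothing (standing in for the low/band/high-pass filters mentioned later), and the sign-plus-axis labelling of peaks are simplifying assumptions:

```python
def threshold_filter(accel, threshold=0.2):
    """Step 604: zero out acceleration values below the threshold magnitude."""
    return [a if abs(a) >= threshold else 0.0 for a in accel]

def smooth(accel, window=3):
    """Step 606: simple moving-average noise removal."""
    half = window // 2
    return [sum(accel[max(0, i - half): i + half + 1]) /
            len(accel[max(0, i - half): i + half + 1])
            for i in range(len(accel))]

def detect_peaks(accel):
    """Step 608: indices and signs of local extrema in a curve."""
    peaks = []
    for i in range(1, len(accel) - 1):
        if accel[i] > accel[i - 1] and accel[i] > accel[i + 1] and accel[i] > 0:
            peaks.append((i, "+"))
        elif accel[i] < accel[i - 1] and accel[i] < accel[i + 1] and accel[i] < 0:
            peaks.append((i, "-"))
    return peaks

def peak_sequence(accel_y, accel_z):
    """Step 610: merge per-axis peaks into a time-ordered sign sequence."""
    tagged = [(i, s, "y") for i, s in detect_peaks(accel_y)] + \
             [(i, s, "z") for i, s in detect_peaks(accel_z)]
    return [f"{sign}{axis}" for _, sign, axis in sorted(tagged)]

# A "+y" line gesture: accelerate then decelerate along y, nothing along z.
ay = smooth(threshold_filter([0.0, 0.5, 1.0, 0.5, -0.5, -1.0, -0.5, 0.0]))
az = smooth(threshold_filter([0.0] * 8))
assert peak_sequence(ay, az) == ["+y", "-y"]
```

The resulting sequence is then looked up against the per-letter sequences of Table 1 to recover the segment pattern.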
  • peak sequences of each gesture letter are first identified as illustrated in Table 1.
  • the peak sequence is mapped to a corresponding segment pattern [Npt] as illustrated in Figure 5 which is then further mapped to a valid keyboard input.
  • the peak sequences for the patterns “two”, “swipe right”, “square” and “zero” are obtained from the noiseless linear acceleration curves 900a-d as illustrated in Figure 9 and are tabulated in table 2.
  • the peak sequences indicate a variation of acceleration with time along the y and the z axis.
  • the gesture letter “+y” indicates a straight line along the positive y-axis and therefore has a positive acceleration in the positive y-direction and a negative acceleration along the negative y-direction.
  • the only important factor is the position of the peaks and not their amplitude.
  • the peak sequences also help in recognizing whether the gesture letter is a line gesture letter or a curve gesture letter, as a line gesture letter has acceleration peaks along only one axis, whereas a curve gesture letter has acceleration peaks along both axis directions, as may be clearly observed from Table 1.
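This line-versus-curve distinction can be expressed directly in terms of which axes appear in a peak sequence; a minimal sketch, assuming peaks are labelled as sign-plus-axis strings such as "+y":

```python
def classify_letter(peak_sequence):
    """A line gesture letter has acceleration peaks along one axis only,
    while a curve gesture letter has peaks along both axes."""
    axes = {entry[-1] for entry in peak_sequence}   # last char is the axis
    return "line" if len(axes) == 1 else "curve"

assert classify_letter(["+y", "-y"]) == "line"               # e.g. letter +y
assert classify_letter(["+y", "+z", "-y", "-z"]) == "curve"  # e.g. a circle
```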
  • the valid keyboard input is outputted by the controller 210 on the user device 108 based on the mapping performed at step 610.
  • the valid keyboard inputs are burnt to the finger-worn device using the user device 108.
  • the mapping between valid keyboard inputs and custom keyboard shortcuts are done in an external application and then burnt onto the finger-worn device.
  • the at least one motion sensor 202 is again stabilized by pointing the distal phalanx of the finger in such a manner that the at least one motion sensor 202 is parallel to a ground plane for the pre-determined time duration as illustrated in step 306.

Operation of the finger-worn device 102 as a mouse:
  • Figure 10 illustrates the mouse cycle 1000 in accordance with an embodiment of the present disclosure.
  • the at least one motion sensor 202 needs to be stabilized as illustrated at step 1002.
  • the distal phalanx of the finger is pointed in such a manner that the at least one motion sensor 202 is parallel to a ground plane for a predetermined time duration in order to detect a pivot point along with an angle subtended by the pivot point on each axis of the two-dimensional plane as illustrated at step 1004 and depicted in Figure 11.
  • Figure 11 depicts point p to be the pivot point, with the angle subtended at the y-axis as a₀ and the angle subtended at the z-axis as b₀.
  • the pre-determined time duration for stabilizing the at least one motion sensor 202 may be 50-500ms.
  • the stabilization indication is provided by the one or more visual indicators 204.
  • the finger-worn device 102 is ready to be used as a mouse.
  • the controller 210 is configured to track an angular movement of the finger based on a change in orientation of the finger using the at least one motion sensor 202 and transform the angular movement of the finger into a mouse pointer movement.
  • After detecting the pivot point p, when the finger moves from the pivot point, the controller 210 measures a translation of the finger from the pivot point along each axis of the two-dimensional plane by calculating a change in the angle subtended by the pivot point at each axis of the two-dimensional plane. For instance, with reference to Figure 11, the finger moves from the pivot point p to point q, and the angle made by point q with the y-axis becomes a₁ and the angle made with the z-axis becomes b₁. The translation is then measured by the controller 210 from the changes in the subtended angles, (a₁ − a₀) along the y-axis and (b₁ − b₀) along the z-axis.
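Assuming a simple linear gain (the `sensitivity` factor is invented for illustration; the disclosure specifies only that the translation follows the change in subtended angles), the angle-change-to-pointer mapping might look like:

```python
def pointer_translation(a0, b0, a1, b1, sensitivity=25.0):
    """Translate the change in the angles subtended at the pivot point
    (a: y-axis, b: z-axis, in degrees) into a pointer movement in pixels.

    The linear `sensitivity` gain is an illustrative assumption.
    """
    dy = (a1 - a0) * sensitivity   # pointer movement along the y direction
    dz = (b1 - b0) * sensitivity   # pointer movement along the z direction
    return dy, dz

# Finger moves from pivot point p (angles a0, b0) to point q (angles a1, b1).
dy, dz = pointer_translation(a0=10.0, b0=5.0, a1=12.0, b1=4.0)
assert (dy, dz) == (50.0, -25.0)
```

A real controller would typically clamp or non-linearly scale these deltas for comfortable pointer speed.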
  • FIG. 12 depicts a method 1200 for allowing the finger-worn device 102 to function as mouse/ keyboard to a user device 108, in accordance with an embodiment of the present disclosure.
  • the method 1200 may include allowing a sensor housing 104 of the finger-worn device 102 to be placed at the distal phalanx of a finger of a user such that at least one motion sensor 202 rests at the tip of the finger pointing towards the user device 108.
  • the method 1200 may include pairing the finger-worn device 102 with the user device 108.
  • the method 1200 may include providing one or more visual indicators 204 indicating pairing between the finger-worn device 102, as the mouse/keyboard, and the user device 108.
  • the method 1200 may include allowing the at least one motion sensor 202 to capture the orientation in Euler angles and the linear acceleration along all three axes, produced by the finger movement of the user.
  • the method 1200 executes operations at steps 1210 and 1212.
  • the method 1200 may include detecting a plurality of patterns drawn by the user in air using the at least one motion sensor 202.
  • the method 1200 may include transforming the plurality of the detected patterns into valid keyboard inputs by employing a linear-acceleration-based peak sequencing technique. Said process is elaborated in detail in steps 1212-1 to 1212-6 as illustrated in Figure 12A.
  • At step 1212-1, the method 1200A may include recognizing linear acceleration values for each detected pattern of the plurality of patterns in a two-dimensional plane.
  • At step 1212-2, the method 1200A may include plotting a linear acceleration curve for each detected pattern, depicting a variation of the linear acceleration values for each detected pattern with time.
  • At step 1212-3, the method 1200A may include filtering each said linear acceleration curve by removing one or more acceleration values below a threshold to obtain a filtered linear acceleration curve for each said detected pattern.
  • At step 1212-4, the method 1200A may include removing noise from the filtered linear acceleration curves to obtain noiseless linear acceleration curves with prominent peaks.
  • At step 1212-5, the method 1200A may include detecting one or more peaks from each noiseless linear acceleration curve.
  • At step 1212-6, the method 1200A may include sequencing the one or more peaks in each noiseless linear acceleration curve to map the peak sequence to a corresponding segment pattern [Npt] for each pattern, in order to map the detected pattern to a valid keyboard input.
  • When functioning as the mouse, the method 1200 executes operations at steps 1214 and 1216.
  • At step 1214, the method 1200 may include tracking an angular movement of the finger based on a change in orientation of the finger using the at least one motion sensor 202.
  • At step 1216, the method 1200 may include transforming the angular movement of the finger into a mouse pointer movement. This process is elaborated in steps 1216-1 to 1216-2 and illustrated in Figure 12B.
  • At step 1216-1, the method 1200B may include measuring a translation of the finger from a pivot point along each axis of the two-dimensional plane by calculating a change in the angle subtended by the pivot point at each axis of the two-dimensional plane.
  • At step 1216-2, the method 1200B may include transforming the translation into the mouse pointer movement.
  • In one embodiment, the valid keyboard inputs may be burned onto the finger-worn device 102 using the user device 108, such as a smartphone or tablet.
  • The mapping between the plurality of patterns and the valid keyboard inputs may be done using one or more platforms available on the user device 108, and may then be burned onto the finger-worn device 102 using the Bluetooth UART communication protocol.
  • The finger-worn device 102 may also be used to control machines in the public domain, such as consoles in a gaming arena, without touching them. For that, it is integral to communicate the valid keyboard inputs to the external console. This may be done by generating a QR code using one or more platforms available on the user device 108, which may be scanned by the external console using a visual device such as a camera. The authentication key for initiating Bluetooth communication with the finger-worn device 102 may also be read through the QR scanner.
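As an illustrative sketch of this embodiment, the gesture-to-shortcut mapping and the Bluetooth authentication key could be serialized (here as JSON) before being rendered as a QR code for the external console to scan. All field names, the example key map, and the authentication key below are hypothetical assumptions, not values from the disclosure.

```python
import json

def make_pairing_payload(keymap, auth_key):
    # Serialize the gesture-to-keyboard mapping together with the Bluetooth
    # authentication key; the resulting string would then be rendered as a
    # QR code for the external console's camera. Field names are illustrative.
    return json.dumps(
        {"device": "finger-worn-102", "auth_key": auth_key, "keymap": keymap},
        sort_keys=True,
    )

# Hypothetical usage: "square" and "swipe right" are example patterns from
# the disclosure; the shortcuts they map to here are assumptions.
payload = make_pairing_payload({"square": "Ctrl+S", "swipe right": "Right"}, "ab12cd34")
```

Serializing with `sort_keys=True` keeps the payload byte-stable, so the same mapping always yields the same QR code.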


Abstract

Disclosed herein is a finger-worn device (102) configured to function as a mouse/keyboard to a user device (108). The finger-worn device (102) comprises a sensor housing (104) and a battery housing (106) operatively coupled to one another. The sensor housing (104) is placed at a distal phalanx of a finger of a user and comprises one or more visual indicators (204) and at least one motion sensor (202) resting at the tip of the finger, pointing towards the user device (108) and configured to capture orientation in Euler angles and linear acceleration in all three axis directions. The finger-worn device (102) employs a linear acceleration based peak sequencing technique to transform air-drawn patterns into valid keyboard inputs and measures an angular movement of the finger to transform it into a mouse pointer movement.

Description

“FINGER-WORN DEVICE OPERABLE AS MOUSE/KEYBOARD AND METHOD FOR REALIZING SAME”
TECHNICAL FIELD
[001] The present invention relates to the field of wearable devices, and in particular relates to a finger-worn device, controlled by hand gestures, that has the capability of being operated as a mouse and a keyboard for one or more user devices.
BACKGROUND OF INVENTION
[002] The following description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[003] Wearable devices are now being extensively researched and used for various applications. Smartwatches are likely the best-known wearable devices, but many other kinds of wearable devices have emerged, and continue to emerge today. Examples of modern wearable technology available on the market range from head-mounted displays (HMDs) to clothing and jewellery. These devices not only perform many basic computing functions, akin to laptops and smartphones, but may also perform unique health-tracking services (such as calorie tracking and sleep monitoring) as a result of being in contact with the user’s body.
[004] Apart from the above-mentioned applications, wearable devices have also been used to operate either as a mouse or a keyboard to a plurality of user devices. These wearable devices generally include an embedded sensor system that may track the motion of the body part on which the wearable device is worn in order to move the mouse pointer. Further, with respect to use as a keyboard, wearable devices conventionally rely on a tap-kind input or present a virtual keyboard allowing a user to provide input, thereby giving the user the ease of wirelessly accessing a user device such as a laptop, computer, tablet, etc.
[005] Further, the existing known technology provides finger-worn devices where the motion of the finger may be tracked to move a mouse pointer. However, such devices are generally positioned at the proximal phalanx of the finger and hence the accuracy of motion detection is limited. Furthermore, the finger-worn devices that may be used as both a mouse and a keyboard generally perform gesture recognition based on a combination of a tap-kind input for the keyboard and an air gesture for the mouse, thereby requiring different processing techniques for gesture recognition.
[006] There exists therefore a need for a finger-worn device that is compact, is capable of operating as both a mouse and a keyboard, and recognizes air gestures drawn by the user with greater accuracy for an improved user experience.
SUMMARY OF INVENTION
[007] The present disclosure overcomes one or more shortcomings of the prior art and provides additional advantages discussed throughout the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
[008] In one non-limiting embodiment of the present disclosure, a finger-worn device configured to pair with a user device and function as a mouse/keyboard to said user device is disclosed. The finger-worn device comprises a sensor housing configured to be placed at the distal phalanx of a finger of a user. Said sensor housing comprises at least one motion sensor, placed within the sensor housing in a manner that the at least one motion sensor rests at the tip of the finger pointing towards the user device, and configured to capture orientation in Euler angles and linear acceleration in all three axis directions, produced by the finger movement. Further, said sensor housing may include one or more visual indicators configured to provide pairing indication of the finger-worn device as the mouse/keyboard with the user device. The finger-worn device further comprises a battery housing operatively coupled to the sensor housing. Said battery housing comprises at least one of a battery, a controller, a Bluetooth unit, a port and one or more input keys. In one specific embodiment, when the finger-worn device functions as the keyboard, the controller is configured to detect a plurality of patterns drawn by the user in air using the at least one motion sensor, and to transform the plurality of detected patterns into valid keyboard inputs by employing a linear acceleration based peak sequencing technique. Further, in another specific embodiment, when the finger-worn device functions as the mouse, the controller is configured to track an angular movement of the finger based on a change in orientation of the finger using the at least one motion sensor, and to transform the angular movement of the finger into a mouse pointer movement.
[009] In one non-limiting embodiment of the present disclosure, the controller is configured to switch operation between the mouse and the keyboard when one of the one or more input keys is pressed by the user for a pre-defined duration.
[0010] In one non-limiting embodiment of the present disclosure, to operate as a keyboard, the at least one motion sensor is first stabilized by pointing the distal phalanx of the finger in such a manner that the at least one motion sensor is parallel to a ground plane for a predetermined time duration.
[0011] In one non-limiting embodiment of the present disclosure, the plurality of patterns drawn by the user comprises a combination of one or more gesture letters drawn by the user in a two-dimensional plane, and wherein the one or more gesture letters comprises a plurality of line gesture letters and a plurality of curve gesture letters.
[0012] In one non-limiting embodiment of the present disclosure, to transform the plurality of patterns into valid keyboard inputs by employing the linear acceleration based peak sequencing technique, the controller in combination with one or more filters is configured to recognize linear acceleration values for each detected pattern of the plurality of patterns in the two-dimensional plane. The controller is further configured to plot a linear acceleration curve for each detected pattern depicting a variation of the linear acceleration values for each detected pattern with time. The one or more filters are configured to filter each said linear acceleration curve by removing one or more acceleration values below a threshold to obtain a filtered linear acceleration curve for each said detected pattern. The one or more filters are further configured to remove noise from each filtered linear acceleration curve to obtain a noiseless linear acceleration curve depicting sharp peaks for each detected pattern. The controller is further configured to detect one or more peaks from each noiseless linear acceleration curve, and to sequence the one or more peaks in order to map a peak sequence for each detected pattern with a corresponding segment pattern to obtain a valid keyboard input.
[0013] In one non-limiting embodiment of the present disclosure, to operate as a mouse, the at least one sensor is first stabilized for a pre-determined time duration in order to detect a pivot point along with an angle subtended by the pivot point on each axis of the two-dimensional plane.
[0014] In one non-limiting embodiment of the present disclosure, to transform the angular movement of the finger into a mouse pointer movement, the controller is configured to measure a translation of the finger from the pivot point along each axis of the two- dimensional plane by calculating a change in the angle subtended by the pivot point at each axis of the two-dimensional plane and transform the translation into the mouse pointer movement.
[0015] In one non-limiting embodiment of the present disclosure, a method for allowing a finger-worn device to function as mouse/keyboard to a user device is disclosed. The method comprises allowing a sensor housing of the finger-worn device to be placed at the distal phalanx of a finger of a user such that at least one motion sensor rests at the tip of the finger pointing towards the user device. The method further comprises pairing the finger-worn device with the user device. The method further comprises providing one or more visual indicators indicating pairing between the finger-worn device, as the mouse/keyboard, and the user device. The method further comprises allowing the at least one motion sensor to capture orientation in Euler angles and linear acceleration in all three axis directions, produced by the finger movement of the user. In one specific embodiment, when functioning as the keyboard, the method comprises detecting, by a controller, a plurality of patterns drawn by the user in air using the at least one motion sensor, and transforming, by the controller, the plurality of the detected patterns into valid keyboard inputs by employing a linear acceleration based peak sequencing technique. In another specific embodiment, when functioning as the mouse, the method comprises tracking, by the controller, an angular movement of the finger based on a change in orientation of the finger using the at least one motion sensor, and transforming, by the controller, the angular movement of the finger into a mouse pointer movement.
[0016] In one non-limiting embodiment of the present disclosure, the process of transforming the plurality of patterns into valid keyboard inputs by employing the linear acceleration based peak sequencing technique comprises recognizing linear acceleration values for each detected pattern of the plurality of patterns in a two-dimensional plane. The process further comprises plotting a linear acceleration curve for each detected pattern, depicting a variation of the linear acceleration values for each detected pattern with time. The process further comprises filtering each said linear acceleration curve by removing one or more acceleration values below a threshold to obtain a filtered linear acceleration curve for each said detected pattern. The process further comprises removing noise from each filtered linear acceleration curve to obtain a noiseless linear acceleration curve depicting sharp peaks for each detected pattern. The process further comprises detecting one or more peaks from each noiseless linear acceleration curve. The process further comprises sequencing the one or more peaks in order to map a peak sequence for each detected pattern with a corresponding segment pattern in order to obtain a valid keyboard input.
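The process in paragraph [0016] can be sketched end-to-end in Python. The threshold value, the moving-average smoothing window, and the representation of a peak sequence as time-ordered (axis, sign) pairs are illustrative assumptions, not details from the disclosure.

```python
def smooth(curve, window=3):
    # Moving-average noise removal (the denoising step), leaving prominent peaks.
    half = window // 2
    out = []
    for i in range(len(curve)):
        seg = curve[max(0, i - half):i + half + 1]
        out.append(sum(seg) / len(seg))
    return out

def signed_peaks(curve, threshold=0.5):
    # Zero out samples whose magnitude is below the threshold (the filtering
    # step), smooth, then record the time index and sign of each remaining
    # local extremum (the peak-detection step).
    filtered = smooth([a if abs(a) >= threshold else 0.0 for a in curve])
    peaks = []
    for i in range(1, len(filtered) - 1):
        a = abs(filtered[i])
        if filtered[i] != 0 and a > abs(filtered[i - 1]) and a >= abs(filtered[i + 1]):
            peaks.append((i, 1 if filtered[i] > 0 else -1))
    return peaks

def peak_sequence(acc_y, acc_z, threshold=0.5):
    # The sequencing step: merge per-axis peaks into one time-ordered
    # sequence, e.g. [('y', 1), ('y', -1)] for the line gesture letter "+y";
    # this sequence is then matched against a table of segment patterns.
    tagged = [('y', i, s) for i, s in signed_peaks(acc_y, threshold)]
    tagged += [('z', i, s) for i, s in signed_peaks(acc_z, threshold)]
    tagged.sort(key=lambda t: t[1])
    return [(axis, sign) for axis, _, sign in tagged]
```

Only the order and sign of the peaks matter for the mapping, mirroring the observation later in the disclosure that peak position, not amplitude, drives the sequencing.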
[0017] In one non-limiting embodiment of the present disclosure, the process of transforming the angular movement of the finger into a mouse pointer movement comprises measuring a translation of the finger from a pivot point along each axis of the two-dimensional plane by calculating a change in the angle subtended by the pivot point at each axis of the two-dimensional plane, and transforming the translation into the mouse pointer movement.
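As an illustrative sketch of the transform in paragraph [0017]: given the reference angles (a0, b0) captured at stabilization and the current angles (a1, b1), the pointer displacement can be computed as a scaled angular change. The `gain` sensitivity constant is an assumption, not a value from the disclosure.

```python
import math

def pointer_delta(a0, b0, a1, b1, gain=600.0):
    # Map the change in the angles subtended at the pivot point on the two
    # axes (in degrees) to a pointer displacement in pixels. `gain` (pixels
    # per radian) is an assumed sensitivity constant.
    dy = gain * math.radians(a1 - a0)  # horizontal pointer movement
    dz = gain * math.radians(b1 - b0)  # vertical pointer movement
    return dy, dz
```

With no angular change the pointer stays put; each successive orientation sample then contributes a small displacement proportional to the change in subtended angle.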
[0018] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF DRAWINGS
[0019] The embodiments of the disclosure itself, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying drawings. One or more embodiments are now described, by way of example only, with reference to the accompanying drawings in which:
[0020] Figure 1A depicts an environment 100 for implementing a finger-worn device 102, in accordance with an embodiment of the present disclosure;
[0021] Figure 1B depicts a side view of the finger-worn device 102, in accordance with an embodiment of the present disclosure;
[0022] Figure 1C depicts a top view of the finger-worn device 102, in accordance with an embodiment of the present disclosure;
[0023] Figure 2 depicts a structural arrangement of the finger-worn device 102, by way of a step diagram, in accordance with an embodiment of the present disclosure;
[0024] Figure 3 illustrates the keyboard cycle 300 to initialize and operate the finger-worn device 102 as a keyboard, in accordance with an embodiment of the present disclosure;
[0025] Figure 4 illustrates one or more gesture letters used by the finger-worn device to generate one or more keyboard functionalities, in accordance with an embodiment of the present disclosure;
[0026] Figure 5 illustrates exemplary gesture segmentation used by the finger-worn device to generate one or more keyboard functionalities, in accordance with an embodiment of the present disclosure;
[0027] Figure 6 illustrates a gesture recognition process 600 used by the finger-worn device to recognize patterns drawn in air by a user to execute one or more keyboard functionalities, in accordance with an embodiment of the present disclosure;
[0028] Figure 7 illustrates exemplary linear acceleration curves 700a-d for peak detection and sequencing, in accordance with an embodiment of the present disclosure;
[0029] Figure 8 illustrates exemplary filtered linear acceleration curves 800a-d based on the linear acceleration curves 700a-d for peak detection and sequencing, in accordance with an embodiment of the present disclosure;
[0030] Figure 9 illustrates noiseless linear acceleration curves 900a-d based on the filtered linear acceleration curves 800a-d for peak detection and sequencing, in accordance with an embodiment of the present disclosure;
[0031] Figure 10 illustrates the mouse cycle 1000 to initialize and operate the finger-worn device 102 as a mouse, in accordance with an embodiment of the present disclosure;
[0032] Figure 11 illustrates angular movement of the finger-worn device 102 with respect to a pivot point used by the finger-worn device to execute one or more mouse functionalities, in accordance with an embodiment of the present disclosure;
[0033] Figure 12 depicts a method 1200, by way of a flow diagram, for allowing a finger-worn device 102 to function as mouse/keyboard to the user device 108, in accordance with an embodiment of the present disclosure;
[0034] Figure 12A depicts a method 1200A, by way of a flow diagram, for transforming the plurality of the detected patterns into valid keyboard inputs, in accordance with an embodiment of the present disclosure; and
[0035] Figure 12B depicts a method 1200B, by way of a flow diagram, for transforming the angular movement of the finger into a mouse pointer movement.
[0036] The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION
[0037] The foregoing has broadly outlined the features and technical advantages of the present disclosure in order that the detailed description of the disclosure that follows may be better understood. It should be appreciated by those skilled in the art that the conception and specific embodiment disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure.
[0038] The novel features which are believed to be characteristic of the disclosure, both as to its organization and method of operation, together with further objects and advantages will be better understood from the following description when considered in connection with the accompanying Figures. It is to be expressly understood, however, that each of the Figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present disclosure.
[0039] Disclosed herein is a finger-worn device that is paired with a user device in order to operate as a mouse/keyboard to the user device. Conventionally, to operate either as a mouse or a keyboard, finger-worn devices comprise an embedded sensor system that may track the motion of the finger in order to move the mouse pointer. Further, with respect to use as a keyboard, finger-worn devices generally rely on a tap-kind input or present a virtual keyboard allowing a user to provide input, thereby giving the user the ease of wirelessly accessing a user device such as a laptop, computer, tablet, etc. However, such finger-worn devices are generally positioned at the proximal phalanx of the finger and hence the accuracy of motion detection is limited. Furthermore, the finger-worn devices that may be used as both a mouse and a keyboard generally perform gesture recognition based on a combination of a tap-kind input for the keyboard and an air gesture for the mouse, thereby requiring different processing techniques to process the inputs into valid gestures and also hampering user experience.
[0040] The present disclosure understands this need and provides a finger-worn device with motion sensors placed at the tip of the finger, thereby ensuring higher accuracy in motion detection. Further, the present disclosure provides the ease of using air gestures for moving the mouse pointer or inputting words, sentences or other keyboard shortcuts, thereby improving user experience. The present disclosure also provides an efficient technique to recognize the air gestures and transform them into valid inputs. The detailed description of the finger-worn device is provided in the subsequent paragraphs.
[0041] Figure 1A depicts an environment 100 to implement the finger-worn device 102 in accordance with an embodiment of the present disclosure. The environment 100 depicts the finger-worn device 102 being worn on an index finger of a user and configured to operate as a mouse/keyboard for a user device 108. It may be noted that the choice of finger to wear the finger-worn device 102 in Figure 1A is merely exemplary and should not be construed as limiting. Further, Figures 1B and 1C depict the finger-worn device 102 from different orthogonal views. For instance, Figure 1B depicts the finger-worn device 102 from a side view while Figure 1C depicts it from a top view. The various components of the finger-worn device 102 are depicted in Figure 2. As seen from Figure 2, the finger-worn device 102 comprises a sensor housing 104 and a battery housing 106. In accordance with the environment 100 depicted in Figure 1A, the sensor housing 104 is placed at the distal phalanx of a finger of the user while the battery housing 106 is placed at the proximal phalanx of the finger, and the sensor housing 104 and the battery housing 106 are operatively coupled to each other by means of a wire. However, it may be noted by a skilled person that the battery housing 106 may be placed at finger locations other than the proximal phalanx and may be coupled to the sensor housing 104 by other wired or wireless means.
[0042] The sensor housing 104 of the finger-worn device 102 as depicted in Figure 2 comprises at least one motion sensor 202 and one or more visual indicators 204. The at least one motion sensor 202 rests at the tip of the finger pointing towards the user device 108 and is configured to capture the orientation of the finger in Euler angles and linear acceleration in all three axis directions, produced by the finger movement. Further, in one embodiment, the one or more visual indicators 204 comprise light-emitting diodes (LEDs) and are configured to provide pairing indication of the finger-worn device 102 as the mouse/keyboard with the user device 108.
[0043] The battery housing 106 of the finger-worn device 102 as depicted in Figure 2 comprises a battery 206, a Bluetooth unit 208, a controller 210 and a port 212. In one embodiment, the finger-worn device 102 is paired with the user device 108 by using Bluetooth technology facilitated by the Bluetooth unit 208. However, it may be noted by a skilled person that the finger-worn device 102 may be paired with the user device 108 by connectivity means other than Bluetooth. Further, in one embodiment, the port 212 may be a USB charging port configured to charge the finger-worn device 102 by wired means. However, it may be noted by a skilled person that the finger-worn device 102 may be charged by other wireless means.
[0044] The finger-worn device 102 as illustrated in Figure 2 works both as a mouse and a keyboard. In one embodiment, when the finger-worn device 102 is initialized, it works as a keyboard. However, the operation of the finger-worn device 102 may be switched to the mouse on pressing one of the one or more input keys 110, 112 for a predefined duration or for a predefined number of times. The one or more input keys 110, 112 are illustrated in the orthogonal views of the finger-worn device 102 in Figures 1B and 1C. For instance, in one embodiment, the operation of the finger-worn device 102 may be switched between keyboard and mouse on pressing one of the one or more input keys 110, 112 twice. The operation of the finger-worn device 102 as a keyboard and as a mouse is elaborated in the upcoming paragraphs.
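A minimal sketch of this mode-switching behaviour, assuming a double press within a short window toggles the device between keyboard and mouse; the window length, state names, and starting mode bookkeeping are assumptions for illustration.

```python
class ModeSwitch:
    # Toggle between "KEYBOARD" and "MOUSE" when an input key is pressed
    # twice within a short window. Timings are assumed, not from the
    # disclosure.
    DOUBLE_PRESS_WINDOW = 0.4  # seconds, illustrative

    def __init__(self):
        self.mode = "KEYBOARD"  # the device starts as a keyboard
        self._last_press = None

    def on_key_press(self, t):
        # `t` is the press timestamp in seconds.
        if self._last_press is not None and t - self._last_press <= self.DOUBLE_PRESS_WINDOW:
            self.mode = "MOUSE" if self.mode == "KEYBOARD" else "KEYBOARD"
            self._last_press = None  # consume the double press
        else:
            self._last_press = t
        return self.mode
```

The same structure would accommodate the long-press variant by comparing press and release timestamps against the predefined duration instead.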
Operation of the finger-worn device 102 as a keyboard:
[0045] Figure 3 illustrates the operational cycle 300 of the finger-worn device 102 when working as the keyboard, in accordance with an embodiment of the present disclosure. For commencing the operation of the finger-worn device 102 as a keyboard, the at least one motion sensor 202 needs to be stabilized as illustrated at step 302. For stabilizing the at least one motion sensor 202, the distal phalanx of the finger is pointed in such a manner that the at least one motion sensor 202 is parallel to a ground plane for a pre-determined time duration. In one embodiment, the pre-determined time duration for stabilizing the at least one motion sensor 202 may be 200ms. However, it may be noted by a skilled person that the pre-determined time duration described herein is merely exemplary and should not be construed to be limiting. The stabilization indication is provided by the one or more visual indicators 204.
[0046] Once the at least one motion sensor 202 is stabilized, the finger-worn device 102 is ready to be used as a keyboard. During operation as a keyboard, the user may draw patterns in the air that will be recognized as valid keyboard inputs by the controller 210 by employing linear acceleration based peak sequencing technique.
[0047] The patterns in air that may be drawn by the user comprise a combination of one or more gesture letters drawn by the user in a two-dimensional plane such as the y-z plane. In one embodiment, the one or more gesture letters comprise a plurality of line gesture letters and a plurality of curve gesture letters. The one or more gesture letters employed in the present disclosure are depicted in Figure 4. The plurality of line gesture letters comprises +y, -y, +z and -z, and the plurality of curve gesture letters comprises +cr, -cr, +cl, -cl, +o and -o, representing half circles, full circles, or curves. The positive and negative signs depict a direction of the gesture letters. For instance, the anti-clockwise direction of the curve gesture letters may be represented by a negative sign and the clockwise direction of the curve gesture letters may be represented by a positive sign. The patterns drawn in the air by the user are segmented based on the combination of the one or more gesture letters. For instance, as illustrated in Figure 5, the pattern “two” may be segmented as [+cr, -y] with number of segments = 2, comprising two gesture letters viz. “+cr” and “-y”. Similarly, the pattern “square” may be segmented as [-y, -z, +y, +z] with number of segments = 4, comprising four gesture letters viz. “-y”, “-z”, “+y” and “+z”. Further, the pattern “swipe right” may be segmented as [-y] with number of segments = 1, comprising only one gesture letter “-y”, and the pattern “zero” may be segmented as [+o] with number of segments = 1, comprising only one gesture letter “+o”. It may be understood by a skilled person that the one or more gesture letters are considered herein to be the building blocks for various patterns that are further transformed into valid keyboard inputs. For instance, a pattern “A” drawn by the user would comprise three line gesture letters and may map with a keyboard input of “Control+Z”.
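The segmentation examples above can be written down as a small lookup table. The dictionary below reproduces the Figure 5 examples; any mapping of a pattern to a specific shortcut beyond these would be user-configured, and the helper name is an assumption.

```python
# Segment patterns from Figure 5, written as Python lists. A pattern is a
# time-ordered list of gesture letters; [Npt] is the number of segments.
SEGMENTS = {
    "two":         ["+cr", "-y"],             # Npt = 2
    "square":      ["-y", "-z", "+y", "+z"],  # Npt = 4
    "swipe right": ["-y"],                    # Npt = 1
    "zero":        ["+o"],                    # Npt = 1
}

def npt(pattern):
    # Number of segments [Npt] for a pattern.
    return len(SEGMENTS[pattern])
```

A recognized peak sequence would be matched against this table to recover the pattern name before emitting the mapped keyboard input.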
[0048] For each drawn pattern, the linear acceleration values are continuously recognized from the at least one motion sensor 202 as illustrated in step 304. The gesture recognition process 600 based on linear acceleration and peak sequencing technique is illustrated in Figure 6. The gesture recognition process is implemented by the controller 210 to transform the patterns into valid keyboard inputs.
[0049] Based on the linear acceleration values recognized for the patterns drawn by the user in air, linear acceleration curves are plotted by the controller 210 for each pattern depicting the variation of linear acceleration in y and z-directions with time as illustrated in step 602. The linear acceleration curves 700a-d for the patterns “two”, “swipe right”, “square” and “zero” have been illustrated in Figure 7.
[0050] Further, the plotted linear acceleration curves are filtered by one or more filters in order to eliminate one or more acceleration values below a threshold as illustrated in step 604 in order to obtain filtered linear acceleration curves. The filtered linear acceleration curves 800a-d for the patterns “two”, “swipe right”, “square” and “zero” have been illustrated in Figure 8.
[0051] Furthermore, the filtered linear acceleration curves are subjected to a noise-removal process in order to obtain sharp (or prominent) peaks as illustrated in step 606. The noiseless linear acceleration curves 900a-d for the patterns “two”, “swipe right”, “square” and “zero” have been illustrated in Figure 9. Those skilled in the art will appreciate that one or more low pass, band pass or high pass filters may be used to carry out the process of noise-removal and the same is not explained for the sake of brevity.
[0052] Peaks corresponding to each pattern are then detected by the controller 210 from the corresponding noiseless linear acceleration curves as illustrated in step 608.
[0053] The detected peaks are then sequenced by the controller 210 in order to obtain a peak sequence for mapping the detected pattern with a valid keyboard input as illustrated at step 610. In order to map the peak sequence to a valid keyboard input, peak sequences of each gesture letter are first identified as illustrated in Table 1.
Table 1: Peak Sequencing
(Table 1 is reproduced as an image in the original publication.)
[0054] For each detected pattern, the peak sequence is mapped to a corresponding segment pattern [Npt] as illustrated in Figure 5, which is then further mapped to a valid keyboard input. For instance, the peak sequences for the patterns “two”, “swipe right”, “square” and “zero” are obtained from the noiseless linear acceleration curves 900a-d as illustrated in Figure 9 and are tabulated in Table 2.
Table 2: Peak Sequencing
(Table 2 is reproduced as an image in the original publication.)
[0055] The peak sequences indicate a variation of acceleration with time along the y and z axes. For instance, the gesture letter “+y” indicates a straight line along the positive y axis and therefore has a positive acceleration in the positive y-direction and a negative acceleration along the negative y-direction. It may be noted by a skilled person that for the process of peak sequencing, the only important factor is the position of the peaks and not their amplitude. Further, the peak sequences also help in recognizing whether a gesture letter is a line gesture letter or a curve gesture letter, as a line gesture letter has acceleration peaks only in one axis, while a curved gesture letter has acceleration peaks in both axis directions, as may be clearly observed from Table 1.
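The line/curve distinction described in this paragraph can be expressed as a one-line check. Representing a peak sequence as a time-ordered list of (axis, sign) pairs is an illustrative assumption, not a format from the disclosure.

```python
def letter_type(peak_seq):
    # A line gesture letter has acceleration peaks on one axis only, while a
    # curve gesture letter has peaks in both axis directions. `peak_seq` is
    # assumed to be a time-ordered list of (axis, sign) pairs, e.g.
    # [('y', 1), ('y', -1)] for the line gesture letter "+y".
    axes = {axis for axis, _ in peak_seq}
    return "line" if len(axes) == 1 else "curve"
```

Classifying the letter type first narrows the set of candidate segment patterns before the full sequence match.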
[0056] Further, at step 612, the valid keyboard input is outputted by the controller 210 on the user device 108 based on the mapping performed at step 610. In one embodiment, the valid keyboard inputs are burnt to the finger-worn device using the user device 108. The mapping between valid keyboard inputs and custom keyboard shortcuts is done in an external application and then burnt onto the finger-worn device.
[0057] Furthermore, for ending the operation of the finger-worn device 102 as a keyboard, the at least one motion sensor 202 is again stabilized by pointing the distal phalanx of the finger in such a manner that the at least one motion sensor 202 is parallel to a ground plane for the pre-determined time duration, as illustrated in step 306.
Operation of the finger-worn device 102 as a mouse:
[0058] Figure 10 illustrates the mouse cycle 1000 in accordance with an embodiment of the present disclosure. For commencing the operation of the finger-worn device 102 as a mouse, the at least one motion sensor 202 needs to be stabilized, as illustrated at step 1002. For stabilizing the at least one motion sensor 202, the distal phalanx of the finger is pointed in such a manner that the at least one motion sensor 202 is parallel to a ground plane for a pre-determined time duration in order to detect a pivot point along with the angle subtended by the pivot point on each axis of the two-dimensional plane, as illustrated at step 1004 and depicted in Figure 11. In particular, Figure 11 depicts point p to be the pivot point, with the angle subtended at the y-axis as α0 and the angle subtended at the z-axis as β0. In one embodiment, the pre-determined time duration for stabilizing the at least one motion sensor 202 may be 50-500 ms. The stabilization indication is provided by the one or more visual indicators 204.
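The stabilization check at steps 1002-1004 can be sketched as watching the pitch and roll Euler angles stay near level for the pre-determined hold time. The 5° tolerance and 10 ms sample period below are assumptions; the hold time is drawn from the 50-500 ms window mentioned above:

```python
def is_stabilized(euler_samples, tol_deg=5.0, hold_ms=200, sample_ms=10):
    """Return True once the sensor has been held parallel to the ground
    plane (pitch and roll within tol_deg of zero) for hold_ms.

    euler_samples: iterable of (pitch_deg, roll_deg) readings taken every
    sample_ms. tol_deg and sample_ms are illustrative assumptions; hold_ms
    should lie in the 50-500 ms window the disclosure mentions.
    """
    needed = hold_ms // sample_ms  # consecutive level samples required
    run = 0
    for pitch, roll in euler_samples:
        # reset the run whenever the fingertip tilts away from level
        run = run + 1 if abs(pitch) <= tol_deg and abs(roll) <= tol_deg else 0
        if run >= needed:
            return True
    return False
```

Once this returns True, the current orientation can be latched as the pivot point p with its subtended angles, and the visual indicator lit.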
[0059] Once the at least one motion sensor 202 is stabilized, the finger-worn device 102 is ready to be used as a mouse. During operation as a mouse, the controller 210 is configured to track an angular movement of the finger based on a change in orientation of the finger using the at least one motion sensor 202 and to transform the angular movement of the finger into a mouse pointer movement.
[0060] After detecting the pivot point p, when the finger moves from the pivot point, the controller 210 measures a translation of the finger from the pivot point along each axis of the two-dimensional plane by calculating the change in the angle subtended by the pivot point at each axis of the two-dimensional plane. For instance, with reference to Figure 11, when the finger moves from the pivot point p to point q, the angle made by point q with the y-axis becomes α1 and the angle made with the z-axis becomes β1. The translation is measured by the controller 210 as:
Δx = a · (β0 − β1) and Δy = a · (α0 − α1),
where a is a multiplying factor.
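Writing the subtended angles as α (y-axis) and β (z-axis), the translation above can be sketched as follows; the value of the multiplying factor is an assumed sensitivity gain, not one given in the disclosure:

```python
def pointer_delta(alpha0, beta0, alpha1, beta1, a=100.0):
    """Map the change in subtended angles to a pointer translation.

    Implements Δx = a·(β0 − β1), Δy = a·(α0 − α1), where (alpha0, beta0)
    are the angles at the pivot point p and (alpha1, beta1) the angles at
    the current point q. The gain a = 100 is an assumed value.
    """
    dx = a * (beta0 - beta1)
    dy = a * (alpha0 - alpha1)
    return dx, dy
```

For example, a fingertip movement that decreases the z-axis angle by 0.2 rad and increases the y-axis angle by 0.1 rad yields, with the assumed gain, a translation of roughly (20, −10) pointer units.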
[0061] The translations Δx and Δy are transformed by the controller 210 into mouse pointer movement on the user device 108. Further, to end the mouse cycle, the user presses one of the one or more input keys 110, 112, as illustrated at step 1006.
[0062] Figure 12 depicts a method 1200 for allowing the finger-worn device 102 to function as mouse/keyboard to a user device 108, in accordance with an embodiment of the present disclosure.
[0063] The order in which the method 1200 is described is not intended to be construed as a limitation, and any number of the described method steps may be combined in any order to implement the method. Additionally, individual steps may be deleted from the methods without departing from the spirit and scope of the subject matter described.
[0064] At step 1202, the method 1200 may include allowing a sensor housing 104 of the finger worn device 102 to be placed at distal phalanx of a finger of a user such that at least one motion sensor 202 rests at tip of the finger pointing towards the user device 108.
[0065] At step 1204, the method 1200 may include pairing the finger-worn device 102 with the user device 108.
[0066] At step 1206, the method 1200 may include providing one or more visual indicators 204 indicating pairing between the finger-worn device 102, as the mouse/keyboard, and the user device 108.
[0067] At step 1208, the method 1200 may include allowing the at least one motion sensor 202 to capture orientation in Euler angles and linear acceleration along all three axes, produced by the finger movement of the user.
[0068] When functioning as a keyboard, the method 1200 executes operations at steps 1210 and 1212.
[0069] At step 1210, the method 1200 may include detecting a plurality of patterns drawn by the user in air using the at least one motion sensor 202.
[0070] At step 1212, the method 1200 may include transforming the plurality of the detected patterns into valid keyboard inputs by employing a linear acceleration based peak sequencing technique. Said process is elaborated in detail in steps 1212-1 to 1212-6, as illustrated in Figure 12A.
[0071] At step 1212-1, the method 1200A may include recognizing linear acceleration values for each detected pattern of the plurality of patterns in a two-dimensional plane.
[0072] At step 1212-2, the method 1200A may include plotting a linear acceleration curve for each detected pattern, depicting a variation of the linear acceleration values for each detected pattern with time.
[0073] At step 1212-3, the method 1200A may include filtering each of said linear acceleration curve by removing one or more acceleration values below a threshold to obtain a filtered linear acceleration curve for each of said detected pattern.
[0074] At step 1212-4, the method 1200A may include removing noise from the filtered linear acceleration curves to obtain noiseless linear acceleration curves with prominent peaks.
[0075] At step 1212-5, the method 1200A may include detecting one or more peaks from each noiseless linear acceleration curve.
[0076] At step 1212-6, the method 1200A may include sequencing the one or more peaks in each noiseless linear acceleration curve to map the peak sequence with a corresponding segment pattern [Npt] for each pattern in order to map the detected pattern with a valid keyboard input.
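Steps 1212-1 to 1212-6 can be sketched as a small signal-processing pipeline. The moving-average filter, the fixed amplitude threshold, and the local-extremum peak test below are illustrative assumptions; the disclosure does not specify the filter design:

```python
def smooth(xs, k=3):
    """Moving-average noise removal (a stand-in for steps 1212-3/1212-4)."""
    half = k // 2
    out = []
    for i in range(len(xs)):
        window = xs[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

def signed_peaks(xs, threshold):
    """(index, sign) of local extrema whose magnitude exceeds threshold
    (step 1212-5); values below the threshold are ignored (step 1212-3)."""
    peaks = []
    for i in range(1, len(xs) - 1):
        if abs(xs[i]) <= threshold:
            continue
        if xs[i] > 0 and xs[i] >= xs[i - 1] and xs[i] > xs[i + 1]:
            peaks.append((i, +1))
        elif xs[i] < 0 and xs[i] <= xs[i - 1] and xs[i] < xs[i + 1]:
            peaks.append((i, -1))
    return peaks

def peak_sequence(acc_y, acc_z, threshold=0.5):
    """Time-ordered (axis, sign) sequence over both axes (step 1212-6),
    ready to be matched against a segment-pattern table."""
    events = [(i, "y", s) for i, s in signed_peaks(smooth(acc_y), threshold)]
    events += [(i, "z", s) for i, s in signed_peaks(smooth(acc_z), threshold)]
    return [(axis, s) for _, axis, s in sorted(events)]
```

A stroke along +y (a burst of positive y-acceleration followed by a negative one, with the z-axis quiet) then reduces to the sequence [("y", +1), ("y", −1)], i.e. a line gesture letter on a single axis.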
[0077] When functioning as a mouse, the method 1200 executes operations at steps 1214 and 1216.
[0078] At step 1214, the method 1200 may include tracking an angular movement of the finger based on a change in orientation of the finger using the at least one motion sensor 202. At step 1216, the method 1200 may include transforming the angular movement of the finger into a mouse pointer movement. The process of transforming the angular movement of the finger into a mouse pointer movement is elaborated in steps 1216-1 to 1216-2 and illustrated in Figure 12B.
[0079] At step 1216-1, the method 1200B may include measuring a translation of the finger from a pivot point along each axis of the two-dimensional plane by calculating the change in the angle subtended by the pivot point at each axis of the two-dimensional plane.
[0080] At step 1216-2, the method 1200B may include transforming the translation into the mouse pointer movement.
[0081] In one exemplary embodiment, the valid keyboard inputs may be burned onto the finger-worn device 102 using the user device 108, such as a smartphone or tablet. In such a scenario, the mapping between the plurality of patterns and the valid keyboard inputs may be done using one or more platforms available on the user device 108 and may then be burned onto the finger-worn device 102 using the Bluetooth UART communication protocol.
[0082] In another exemplary embodiment, the finger-worn device 102 may be used to control machines in the public domain, such as consoles in a gaming arena, without touching them. For that, it is necessary to communicate the valid keyboard inputs to the external console. This may be done by generating a QR code using one or more platforms available on the user device 108, which may be scanned by the external console using a visual device such as a camera. The authentication key for initiating Bluetooth communication with the finger-worn device 102 may also be read through the QR scanner.
[0083] A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the disclosure.
[0084] When a single device or article is described herein, it will be clear that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be clear that a single device/article may be used in place of the more than one device or article, or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may alternatively be embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the disclosure need not include the device itself.
[0085] Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the disclosure be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the embodiments of the present disclosure are intended to be illustrative, but not limiting, of the scope of the disclosure, which is set forth in the following claims.
[0086] While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
Reference Numerals

Claims

The Claims:
1. A finger-worn device (102) configured to pair with a user device (108) and function as mouse/keyboard to said user device (108), the finger-worn device (102) comprising: a sensor housing (104) configured to be placed at distal phalanx of a finger of a user, said sensor housing (104) comprising: at least one motion sensor (202), placed within the sensor housing (104) in a manner that the at least one motion sensor (202) rests at tip of the finger pointing towards the user device (108) and configured to capture orientation in Euler angles and linear acceleration in all three axes directions, produced by the finger movement, and one or more visual indicators (204) configured to provide pairing indication of the finger-worn device (102) as the mouse/keyboard with the user device (108); and a battery housing (106) operatively coupled to the sensor housing (104), said battery housing (106) comprising a battery (206), a controller (210), a Bluetooth unit (208), a port (212) and one or more input keys (110, 112); wherein, when functioning as the keyboard, the controller (210) is configured to: detect a plurality of patterns drawn by the user in air using the at least one motion sensor (202); and transform the plurality of the detected patterns into valid keyboard inputs by employing linear acceleration based peak sequencing technique; and wherein, when functioning as the mouse, the controller (210) is configured to: track angular movement of the finger based on a change in orientation of the finger using the at least one motion sensor (202); and transform the angular movement of the finger into a mouse pointer movement.
2. The finger-worn device (102) of claim 1, wherein the controller (210) is configured to switch operation between the mouse and the keyboard when one of the one or more input keys (110, 112) is pressed by the user for a pre-defined duration.
3. The finger-worn device (102) of claim 1, wherein to operate as a keyboard, the at least one motion sensor (202) is first stabilized by pointing the distal phalanx of the finger in a manner that the at least one motion sensor (202) is parallel to a ground plane for a pre-determined time duration.
4. The finger-worn device (102) of claim 1, wherein the plurality of patterns drawn by the user comprises a combination of one or more gesture letters drawn by the user in a two-dimensional plane, and wherein the one or more gesture letters comprises a plurality of line gesture letters and a plurality of curve gesture letters.
5. The finger-worn device (102) of claim 1, wherein to transform the plurality of patterns into valid keyboard inputs by employing linear acceleration based peak sequencing technique, the controller (210) in combination with one or more filters is configured to: recognize linear acceleration values for each detected pattern of the plurality of patterns in the two-dimensional plane; plot a linear acceleration curve for each detected pattern depicting a variation of the linear acceleration values for each of said detected pattern with time; filter each of said linear acceleration curve by removing one or more acceleration values below a threshold to obtain a filtered linear acceleration curve for each of said detected pattern; remove noise from each filtered linear acceleration curve to obtain noiseless linear acceleration curve depicting sharp peaks for each of said detected pattern; detect one or more peaks from each of said noiseless linear acceleration curve; and sequence the one or more peaks to map a peak sequence for each detected pattern with a corresponding segment pattern in order to obtain a valid keyboard input.
6. The finger-worn device (102) of claim 1, wherein to operate as a mouse, the at least one sensor (202) is first stabilized for a pre-determined time duration in order to detect a pivot point along with an angle subtended by the pivot point on each axis of the two-dimensional plane.
7. The finger-worn device (102) of claim 1, wherein to transform the angular movement of the finger into a mouse pointer movement, the controller (210) is configured to: measure a translation of the finger from the pivot point along each axis of the two-dimensional plane by calculating a change in the angle subtended by the pivot point at each axis of the two-dimensional plane; and transform the translation into the mouse pointer movement.
8. A method for allowing a finger-worn device (102) to function as mouse/keyboard to a user device (108), the method comprising: allowing a sensor housing (104) of the finger-worn device (102) to be placed at distal phalanx of a finger of a user such that at least one motion sensor (202) rests at tip of the finger pointing towards the user device (108); pairing the finger-worn device (102) with the user device (108); providing one or more visual indicators (204) indicating pairing between the finger-worn device (102), as the mouse/keyboard, and the user device (108); allowing the at least one motion sensor (202) to capture orientation in Euler angles and linear acceleration in all three axes directions, produced by the finger movement of the user; wherein when functioning as the keyboard, the method comprises: detecting, by a controller (210), a plurality of patterns drawn by the user in air using the at least one motion sensor (202); and transforming, by the controller (210), the plurality of the detected patterns into valid keyboard inputs by employing linear acceleration based peak sequencing technique; and wherein when functioning as the mouse, the method comprises: tracking, by the controller (210), an angular movement of the finger based on a change in orientation of the finger using the at least one motion sensor (202); and transforming, by the controller (210), the angular movement of the finger into a mouse pointer movement.
9. The method of claim 8, wherein the process of transforming the plurality of patterns into valid keyboard inputs by employing linear acceleration based peak sequencing technique comprises: recognizing linear acceleration values for each detected pattern of the plurality of patterns in a two-dimensional plane; plotting a linear acceleration curve for each detected pattern depicting a variation of the linear acceleration values for each detected pattern with time; filtering each of said linear acceleration curve by removing one or more acceleration values below a threshold to obtain a filtered linear acceleration curve for each of said detected pattern; removing noise from each filtered linear acceleration curve to obtain noiseless linear acceleration curve depicting sharp peaks for each of said detected pattern; detecting one or more peaks from each noiseless linear acceleration curve; and sequencing the one or more peaks to map a peak sequence for each detected pattern with a corresponding segment pattern in order to obtain a valid keyboard input.
10. The method of claim 8, wherein the process of transforming the angular movement of the finger into a mouse pointer movement, comprises: measuring a translation of the finger from a pivot point along each axis of the two-dimensional plane by calculating a change in the angle subtended by the pivot point at each axis of the two-dimensional plane; and transforming the translation into the mouse pointer movement.
PCT/IN2023/050166 2022-02-22 2023-02-21 Finger-worn device operable as mouse/keyboard and method for realizing same WO2023161958A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202241009250 2022-02-22
IN202241009250 2022-02-22

Publications (1)

Publication Number Publication Date
WO2023161958A1 true WO2023161958A1 (en) 2023-08-31

Family

ID=87765088

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2023/050166 WO2023161958A1 (en) 2022-02-22 2023-02-21 Finger-worn device operable as mouse/keyboard and method for realizing same

Country Status (1)

Country Link
WO (1) WO2023161958A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9891718B2 (en) * 2015-04-22 2018-02-13 Medibotics Llc Devices for measuring finger motion and recognizing hand gestures
CN111221405A (en) * 2018-11-23 2020-06-02 东莞市易联交互信息科技有限责任公司 Gesture control method and device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN, MINGYU ET AL.: "Air-writing recognition-Part I: Modeling and recognition of characters, words, and connecting motions", IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS, vol. 46, no. 3, 11 September 2015 (2015-09-11), pages 403-413, XP011610175, DOI: 10.1109/THMS.2015.2492598 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23759477

Country of ref document: EP

Kind code of ref document: A1