WO2016003365A1 - Wearable input device - Google Patents

Wearable input device

Info

Publication number
WO2016003365A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
input device
computer
microprocessor
wearable
Prior art date
Application number
PCT/SG2014/000321
Other languages
English (en)
Inventor
Kar Kit Bernard LOKE
Original Assignee
Loke Kar Kit Bernard
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Loke Kar Kit Bernard filed Critical Loke Kar Kit Bernard
Priority to PCT/SG2014/000321 priority Critical patent/WO2016003365A1/fr
Publication of WO2016003365A1 publication Critical patent/WO2016003365A1/fr


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves

Definitions

  • the present disclosure generally relates to a wearable input device for communication with a computer. More particularly, the present disclosure describes various embodiments of an input device that is wearable by a user, wherein the input device is able to communicate with a computer or a computing device.
  • the mouse devices are often used in conjunction with the keyboard. Users typically transition from typing on the keyboard to scrolling and clicking with the mouse. Such a transition requires the user to move his hand from the keyboard to the mouse and vice versa. In particular, the user has to lift his hand from the keyboard towards the mouse, thereby interrupting typing on the keyboard. This action can be strenuous to the user. In the long run, the hand or arm of the user may feel tired or cramped after grasping and operating the mouse, especially with long-term clicking of the mouse buttons. It would be beneficial if such stresses were mitigated when using input devices for performing functions on the computer.
  • Some users may prefer to be in a more relaxed position or posture when using their desktop computers. For example, some users may want to lie down on a bed when they are surfing the internet or reading documents online for long periods of time. The users may also be streaming or watching movies on their computer screen. Thus, over the duration of the movie or reading documents, the users do not need to use the keyboard or mouse. There would be some situations where use of the mouse is necessary, such as to switch to another time frame in the movie or to scroll to another page in the documents. In these cases, it would be cumbersome for the user to stretch out towards the mouse surface, which is usually the desktop, when the user is already in a rested position or posture on the bed. It would thus be beneficial if the user could remain in the rested posture and also have an input device to perform certain functions on the computer, especially when the user is not near the desktop.
  • input devices may also be used for gaming consoles such as the Nintendo Wii™.
  • Such devices are typically located in the living room and not on a desktop.
  • the use of gaming consoles usually requires the user to be standing or in a relaxed position, instead of the required seated position when using the mouse devices.
  • Normal mouse devices would not be usable in connection with gaming consoles, because they require a table surface for use thereon.
  • there exist wearable input devices that can be used with computers or computing devices, such as those mentioned above. These devices are portable, can be worn on a part of the body of the user, and do not require any physical surfaces in order to use them. Some of these devices rely on gestures made by the user for operation thereof, and may rely on video or optical tracking of the hand motion and gestures of the user. However, a problem associated with using video or optical tracking is that accuracy is limited and a direct line-of-sight between the user and the sensors is often required.
  • the Mycestro™ is a mouse that a user can wear on his finger and operate using gestures and thumb actions.
  • the input device wirelessly connects to the user's computer, tablet, or other computing device, and allows the user to control it from a distance.
  • the device comprises sensors on its side whereon the user can slide his thumb along the sensors to perform scrolling functions on the computer. On the same side, there are buttons which the user can actuate with his thumb to perform the clicking functions of a normal mouse.
  • a problem associated with the Mycestro™ product is that it is not completely based on gestures to perform functions on the computer. The user still needs to move his thumb to perform scrolling and/or clicking functions on the side of the device. Such motions of the thumb are unnatural and not ergonomic, which may further lead to unnecessary stresses and strains in the user's hand.
  • United States patent number 8,125,448 discloses an input device that is wearable over the palm of the user.
  • the device is operable without the need to contact a mousing surface.
  • the device comprises sensors that detect movements of the user's hand, i.e. gestures, and perform corresponding functions on the computer.
  • the device also comprises actuation mechanisms on its side that allow the user to actuate and perform clicking functions akin to a mouse.
  • a problem associated with this device is that it is not completely based on gestures and the movement of the user's thumb is not ergonomic.
  • the device further comprises a switch that allows the user to deactivate the device when he does not intend to use it but wants to continue wearing it, such as when transiting to typing on the keyboard. Similarly, the actuation of the switch requires the user to move his thumb and is not ergonomic. Additionally, the user may forget to deactivate the device when typing on the keyboard, thus leading to unintended mouse functions being performed on the computer while typing.
  • there is provided a wearable input device for communication with a computer.
  • the wearable input device comprises a body formed for wearing on a hand of a user, a plurality of sensors for detecting gestures made by the user with at least a portion of the hand, and a microprocessor communicatively coupled to the plurality of sensors.
  • the microprocessor is further for transmitting input signals to the computer in response to the detection of the gestures.
  • Each gesture is associated with one of the input signals, and each of the input signals is further associated with one from a plurality of functions performable on the computer.
  • the microprocessor is configured for automatically deactivating the transmission of at least some of the input signals to the computer in response to the microprocessor receiving a deactivation signal.
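  • By way of illustration only, the transmit-and-deactivate behaviour described above can be sketched as follows; the gesture names and signal identifiers are assumptions made for this example, not terms from the disclosure:

```python
from enum import Enum, auto

class Gesture(Enum):
    PAN_LEFT = auto()
    PAN_RIGHT = auto()
    CLICK = auto()
    DEACTIVATE = auto()    # e.g. a deactivation gesture or keyboard use
    REACTIVATE = auto()

# Each gesture maps to one input signal; each signal corresponds to
# one function performable on the computer.
GESTURE_TO_SIGNAL = {
    Gesture.PAN_LEFT: "cursor_left",
    Gesture.PAN_RIGHT: "cursor_right",
    Gesture.CLICK: "mouse_click",
}

class Microprocessor:
    """Minimal model: transmit signals unless a deactivation signal was received."""

    def __init__(self):
        self.active = True

    def on_gesture(self, gesture: Gesture):
        if gesture is Gesture.DEACTIVATE:
            self.active = False    # received the deactivation signal
            return None
        if gesture is Gesture.REACTIVATE:
            self.active = True
            return None
        if not self.active:
            return None            # "suspended mode": nothing is transmitted
        return GESTURE_TO_SIGNAL.get(gesture)
```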
  • the wearable input device is wearable on the hand of the user and usable as an input device to the computer.
  • the user does not need to physically grab and operate the device like a standard mouse.
  • the device also provides functions that are operable based on gestures made by the user's hand, allowing the user to use natural finger movement gestures to operate the device.
  • the device is deactivatable upon the receiving of the deactivation signal by the microprocessor.
  • the user can, on his volition, deactivate or switch off the device when he decides not to use it anymore. For example, the user may want to continue wearing the device while resting on his bed, but does not want unintentional movements of his hand to perform any functions on the computer.
  • the microprocessor receives the deactivation signal in response to the plurality of sensors detecting a deactivation gesture made by the user.
  • the microprocessor is further configured for automatically reactivating the transmission of the at least some of the input signals to the computer in response to the plurality of sensors detecting a reactivation gesture made by the user.
  • the microprocessor receives the deactivation signal when the user is using a second input device communicable with the computer.
  • the microprocessor is preferably further configured for automatically reactivating the transmission of the at least some of the input signals to the computer after the user stops the using of the second input device.
  • the second input device is preferably a keyboard, and the plurality of functions performable on the computer preferably comprises functions of a standard mouse.
  • An advantage of deactivating and reactivating the wearable input device is that the user can readily switch off and on the device by making gestures with his hand, without having to fumble for a switch or button. Further, the user can simultaneously use both the device as a mouse input device and the keyboard as another input device. If the user is using the keyboard, he need not worry about unintentional movement of the mouse cursor caused by his hand movements, because of the deactivation signal. Once the user stops using the keyboard, the device detects that and reactivates the functions of the device.
  • the wearable input device preferably comprises a plurality of operational modes, wherein each operational mode allows a portion of the gestures to be detectable.
  • Each of the plurality of operational modes is selectable by the user with a gesture.
  • An advantage is that the user can select his preferred operational mode, which allows a portion of the gestures to be detectable and thus functional. Gestures outside of this portion are not detectable by the device, as these unintended gestures and their functions may otherwise interfere with the user's operation of the device.
  • the gestures are preferably definable by the user for detection by the plurality of sensors. It is also preferred that the wearable input device comprises a memory module for storing the gestures defined by the user.
  • the plurality of sensors preferably comprises at least one of micro-electromechanical systems (MEMS) sensors and proximity sensors.
  • the plurality of sensors detect the gestures based on at least one of position, direction, displacement, velocity, and acceleration of the at least a portion of the hand.
  • the wearable input device preferably comprises an alert module for providing alerts to the user.
  • the alert module preferably comprises a vibrating mechanism. These alerts could include prompts from the computer and reminders to the user, such as an alert for warning the user of low battery.
  • the wearable input device comprises a battery for powering at least one of the microprocessor and the plurality of sensors. More preferably, the wearable input device comprises a cable interface for transmission of at least one of data and power.
  • the communication of the wearable input device with the computer preferably occurs via one of Wi-Fi, Bluetooth, and USB cable.
  • the wearable input device is preferred to be in the form of a ring wearable on a finger of the hand.
  • the device can thus be worn on the finger and operated wirelessly (or wired with the USB cable) from a more extended range than that of a standard mouse. The user does not need to be seated near the desktop to operate the device, as he would have to be if he were using the standard mouse.
  • FIG. 1 is a perspective view diagram of a wearable input device according to an embodiment of the present disclosure.
  • FIG. 2 is a side view diagram of the wearable input device according to an embodiment of the present disclosure.
  • FIG. 3 is a schematic of the wearable input device according to an embodiment of the present disclosure.
  • FIG. 4 is a top view diagram showing a first gesture with a hand using the wearable input device according to an embodiment of the present disclosure.
  • FIG. 5 is a side view diagram showing a second gesture with the hand using the wearable input device according to an embodiment of the present disclosure.
  • FIG. 6 is a side view diagram showing a third gesture with the hand using the wearable input device according to an embodiment of the present disclosure.
  • FIG. 7 is a top view diagram showing a fourth gesture with the hand using the wearable input device according to an embodiment of the present disclosure.
  • FIG. 8 is a top view diagram showing a fifth gesture with the hand using the wearable input device according to an embodiment of the present disclosure.
  • any recitation of particular numerical values or value ranges is taken to be a recitation of particular approximate numerical values or approximate value ranges.
  • descriptions of embodiments of the present disclosure are limited hereinafter to a wearable input device for communication with a computer, in accordance with the drawings in FIG. 1 to FIG. 8. This, however, does not preclude embodiments of the present disclosure in which fundamental principles prevalent among the various embodiments, such as operational, functional, or performance characteristics, are required.
  • a wearable input device 100 for communication with a computer 500 is described hereinafter.
  • the wearable input device 100 is structured in the form of a ring that is wearable by the user as an accessory on his hand.
  • the device 100 functions as a computer input device or a human interface device (HID) that transmits and receives signals to and from the computer 500 or a computing device.
  • the user is able to operate the device 100 to transmit signals to the computer for performing functions thereon.
  • the device 100 has a ring-like structure comprising a body 102 and a core module 200.
  • the body 102 is a strap that can be fastened to the user such that it remains secure when the user is operating the device 100.
  • the strap 102 is adjustable to fit different users and any means of adjustment may be used, as known by the person having ordinary skill in the art.
  • One of the functions of the proximity sensors (104, 106, and 108) is to detect the position and distance of objects which are proximate to or nearby the proximity sensors (104, 106, and 108).
  • the arrangement of the proximity sensors (104, 106, and 108) is such that the device 100 is symmetrical and the user does not need to orientate it in a particular direction before wearing the device 100. It is also advantageous if the user switches from using the device 100 on the right hand to the left hand, or if the user is left-handed.
  • the core module 200 is disposed on the top of the device 100.
  • FIG. 3 shows a schematic 210 of the relationships between various components of the device 100.
  • the core module 200 comprises a microprocessor 202, micro-electro-mechanical systems (MEMS) sensors 204, and a proximity sensors processor 206.
  • the proximity sensors (104, 106, and 108), MEMS sensors 204, and proximity sensors processor 206 are communicatively coupled to the microprocessor 202.
  • the MEMS sensors comprise a gyroscope and accelerometer for measuring at least one of displacement, velocity, and acceleration. In particular, the gyroscope measures or maintains orientation based on the principles of angular momentum.
  • the proximity sensors processor 206 processes the raw sensors data obtained from parameters measured by the proximity sensors (104, 106, and 108), such as the position and direction of the objects around them.
  • the microprocessor 202 receives the raw data from the MEMS sensors 204 and the proximity sensors processor 206 for subsequent processing.
  • the microprocessor 202 transforms and corrects the raw data to the relevant format for reading by the computer 500, using algorithms and logics programmed within the microprocessor 202 for recognizing gestures made by the user.
  • the microprocessor 202 is responsible for all data processing using the algorithms and logics.
  • the output command will be translated into HID driver format by the microprocessor 202.
  • the processed output data format will be a HID-compliant data format based on a standard mouse. The precision and accuracy of the gesture recognition is largely dependent on the statistical confidence of the algorithm performance.
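  • The disclosure does not reproduce the exact report layout; assuming the common 4-byte boot-protocol mouse report (one button byte followed by signed X, Y, and wheel deltas), the translation into a HID-compliant data format might be sketched as:

```python
import struct

def _clamp(v: int) -> int:
    return max(-127, min(127, v))

def hid_mouse_report(buttons: int, dx: int, dy: int, wheel: int = 0) -> bytes:
    """Pack a boot-protocol style mouse report: button bits, then dx, dy, wheel."""
    return struct.pack("<Bbbb", buttons & 0x07, _clamp(dx), _clamp(dy), _clamp(wheel))

# Example: a single left click is a press report followed by a release report.
press = hid_mouse_report(buttons=0b001, dx=0, dy=0)
release = hid_mouse_report(buttons=0b000, dx=0, dy=0)
```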
  • the core module 200 is powered by a suitably sized and shaped battery 208.
  • the battery 208 may be a button cell lithium battery such as the CR2032 type, or a rechargeable battery.
  • the battery 208 also provides power to the proximity sensors (104, 106, and 108).
  • the battery 208 may be recharged by means of a cable connection between a data interface on the device 100 and the computer 500.
  • the output data is transmitted as input signals to the computer 500 via a connection means 502. Each input signal is associated with a function performable on the computer 500.
  • the connection means 502 may be through wireless means such as Wi-Fi and Bluetooth, or wired means such as USB through the data interface.
  • the data interface allows transmission of at least one of data and power between the device 100 and the computer 500.
  • the device 100 transmits the input signals wirelessly via Bluetooth using a Bluetooth transmitter unit.
  • the Bluetooth transmitter unit may be disposed within the core module 200, or as an independent component separate from the core module 200. It will be apparent to the person having ordinary skill in the art that a Wi-Fi module will be implemented if the device 100 is transmitting the input signals via Wi-Fi.
  • Other connection means known to the skilled person may also be used. Current technology allows for the various components of the core module 200 to be readily available.
  • the miniaturization and mass production of MEMS technology for the sensors, particularly the MEMS sensors 204, allow the MEMS sensors 204 to be small enough to fit into the core module 200 in order for the user to wear the device 100.
  • Current MEMS technology also provides improved detection of the minute movements and articulations of different parts of the fingers and hands, allowing the MEMS sensors 204 to recognize gestures more efficiently and accurately.
  • the low energy consumption of the communication means 502, i.e. Bluetooth, allows the battery to be miniature-sized for integration into the device 100.
  • the ring-like structure of the device 100 allows the device 100 to be worn as an accessory on the hand.
  • the wearable input device 100 can be stored in a user's pocket and thereby improve portability of the device 100.
  • Such features of the wearable input device 100 allow the device 100 to become more suitable for the consumer mass market, and more particularly in terms of cost and form-fit function of a wearable device.
  • FIG. 4 shows the device 100 being worn on a right hand 300 of the user.
  • the preferred embodiment shows the device 100 worn on the index finger 304 of the user.
  • the device 100 can also be worn on the other fingers of the user, i.e. the thumb 302, middle finger 306, ring finger 308, and little finger 310. It is up to the user's preference to decide which finger to wear the device 100.
  • the device 100 may also be worn on the left hand of the user.
  • the device 100 is worn on the innermost section of the index finger 304 of the right hand 300, towards the palm and adjacent to the knuckle, such that the device 100 remains secure on the index finger 304 when the user is operating the device 100.
  • When operating the device 100, the user performs gestures with the fingers (302, 304, 306, 308, and 310) and/or the hand 300.
  • the proximity sensors (104, 106, and 108) and the MEMS sensors 204 detect parameters that are based on these gestures. These parameters include, but are not limited to, position of the index finger 304 and the middle finger 306, as well as the velocity and acceleration of the movement of said fingers.
  • the microprocessor 202 processes these data and transmits input signals to the computer 500.
  • each gesture corresponds or is associated with an input signal and a function performable on the computer 500.
  • the gestures can be separated into basic gestures for performing basic mouse functions on the computer 500, and advanced gestures for performing more advanced functions that may be unique to different software applications running on the computer 500. More details on each gesture and its associated function are described hereinafter.
  • Gestures can be made by the user with at least a portion of the hand 300 for performing functions on the computer 500. These functions include functions that are typical of a standard mouse, such as manipulating a mouse cursor and performing mouse clicks. The combination of movement of fingers (302, 304, 306, 308, and 310) and the hand 300 is thus able to fulfil the various functions of the standard mouse.
  • the hand 300 is free from holding or grasping any object, and is allowed to type and operate above the keyboard or anywhere around the computer 500.
  • the gestures made by the user to perform the computer functions apply natural finger movement, i.e. finger and hand motion similar to those used in everyday activities. Additionally, the user does not need to physically click a button when performing a mouse click function. This natural movement of the fingers and hand thus eliminates finger-tip actuation of physical buttons, surfaces, and/or switches, such as those on standard mice.
  • FIG. 4 shows a first basic gesture, or a horizontal panning gesture. The user can move the mouse cursor towards the left and right of a display screen of the computer 500 by moving the hand 300 along the directions 400a and/or 400b.
  • the rotation of the hand 300 about the wrist towards the left along the direction 400a moves the mouse cursor towards the left on the display screen
  • the rotation of the hand 300 about the wrist towards the right along the direction 400b moves the mouse cursor towards the right on the display screen.
  • the speed of the cursor movement corresponds, or is proportional, to at least one of the angular velocity and angular acceleration of the rotation of the hand 300 about the wrist.
  • the MEMS sensors 204 and proximity sensors are used to detect the parameters associated with the motions.
  • the microprocessor 202 processes the data and transmits an input signal to the computer 500 for moving the mouse cursor (leftwards and rightwards).
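  • A minimal sketch of this proportional mapping, assuming the gyroscope reports a yaw rate in degrees per second; the gain and deadband values are illustrative, not from the disclosure:

```python
def pan_dx(yaw_rate_dps: float, gain: float = 0.8, deadband_dps: float = 2.0) -> int:
    """Map gyroscope yaw rate to horizontal cursor displacement per update."""
    if abs(yaw_rate_dps) < deadband_dps:
        return 0                          # ignore sensor noise near rest
    return round(gain * yaw_rate_dps)     # negative pans left, positive pans right
```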
  • FIG. 5 shows a second basic gesture, or a clicking gesture.
  • the clicking gesture is made by the user to perform a clicking or double-clicking function on the computer 500. If the user is using a standard mouse, the index finger 304 of the user will be slightly curved downwards when actuating the mouse buttons. Thus, to mimic a similar gesture, the index finger 304 is curved downwards when the user is operating the device 100 to perform the clicks.
  • the bottom proximity sensor 108 detects that the index finger 304 is curved downwards because a portion of the index finger 304 is within the line of sight 402 of the bottom proximity sensor 108. With the index finger 304 curved downwards, the user can move or rotate the index finger 304 downwards and upwards once along the direction 404, i.e. one cycle, in order to perform the single-click function. The user can move or rotate the index finger 304 downwards and upwards twice in quick succession along the direction 404, i.e. two cycles, in order to perform the double-click function.
  • the MEMS sensors 204 and proximity sensors are used to detect the parameters associated with the motions.
  • the microprocessor 202 processes the data and transmits input signals to the computer 500 in order to perform the single-click or double-click functions on the computer 500.
  • the user is able to achieve the clicking function of the mouse by using the clicking gesture with the device 100, in contrast to that of finger-tip operated physical buttons, surface or switches in standard mice. By not having to physically actuate buttons with the fingers, the user can reduce the stresses and strains therein.
  • the speed of the rotation or movement along the direction 404 needs to fall within at least one of a predetermined threshold angular velocity range and a predetermined threshold angular acceleration range before the microprocessor 202 is able to register the movement and transmit the relevant input signal.
  • the microprocessor 202 will not transmit the input signal if the movement is below a lower limit of the predetermined range or above an upper limit of the predetermined range.
  • the advantage of this feature is that the microprocessor 202 will not register unintended movements of the index finger 304 that fall outside the predetermined threshold angular velocity range and/or predetermined threshold angular acceleration range, which would result in unintended clicking of icons on the computer 500. Such unintended movement could arise from sudden jerks or spasms of the index finger 304 or the hand 300.
  • Only intentional movement of the index finger 304 within the predetermined threshold angular velocity and/or acceleration ranges allows the microprocessor 202 to register the movement as a single-click or a double-click. In addition, once the microprocessor 202 registers movement of the index finger 304 as being within the predetermined threshold angular velocity and/or acceleration ranges, the microprocessor 202 will cease all cursor movement in any direction. In other words, the microprocessor 202 receives a deactivation signal to deactivate all input signals related to cursor movements. This is to ensure that the clicking functions do not unnecessarily introduce cursor movement which could result in erroneous selection of adjacent icons instead of the intended icons.
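  • The band-pass check on the flick speed might be sketched as below; the numeric limits are placeholders, since the disclosure leaves the ranges unspecified (and user-programmable). A registered click would additionally trigger the cursor-movement deactivation described above.

```python
def is_click(angular_velocity_dps: float,
             lower_dps: float = 60.0,
             upper_dps: float = 400.0) -> bool:
    """Register a click only if the downward flick falls inside the threshold band.

    Movements below the lower limit (slow drift) or above the upper limit
    (sudden jerks or spasms) are ignored.
    """
    return lower_dps <= abs(angular_velocity_dps) <= upper_dps
```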
  • FIG. 6 shows a third basic gesture, or a vertical panning gesture.
  • the user actions of the vertical panning gesture are similar to the user actions of the clicking gesture shown in FIG. 5.
  • the index finger 304 is straightened, such that no portion of the index finger 304 is within the line of sight 402 of the bottom proximity sensor 108. With the index finger 304 straightened, the user can move or rotate the index finger 304 downwards and upwards once along the direction 404 to move the mouse cursor downwards and upwards, respectively.
  • the rotation of the index finger 304 upwards moves the mouse cursor upwards, while the rotation of the index finger 304 downwards moves the mouse cursor downwards.
  • the speed of the cursor movement corresponds, or is proportional, to at least one of the angular velocity and angular acceleration of the rotation of the straightened index finger 304.
  • the MEMS sensors 204 and proximity sensors are used to detect the parameters associated with the motions.
  • the microprocessor 202 processes the data and transmits an input signal to the computer 500 for moving the mouse cursor (upwards and downwards) on the computer 500.
  • the bottom proximity sensor 108 of the device 100 is used for the detection of objects, such as the index finger 304, within its line of sight 402. If an object is detected within the line of sight 402, the device 100 only allows clicking functions and inhibits cursor movement in any direction. If there is no detection of any object within the line of sight 402, the device 100 only allows cursor movements and inhibits clicking functions. Thus, for the gestures described in FIG. 4 and FIG. 6, the index finger 304 needs to be straightened in order for the microprocessor 202 to transmit the input signals for moving the mouse cursor on the display screen of the computer 500.
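  • This mutually exclusive gating can be summarized in a few lines; the function names are assumptions for the sketch:

```python
def allowed_functions(object_in_line_of_sight: bool) -> set:
    """Gate the device's functions on the bottom proximity sensor 108.

    Object detected (index finger curved)    -> clicking only.
    No object detected (finger straightened) -> cursor movement only.
    """
    return {"click"} if object_in_line_of_sight else {"cursor_move"}
```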
  • FIG. 7 shows a fourth basic gesture, which also performs the vertical panning function like the third basic gesture shown in FIG. 6.
  • the user can move the mouse cursor upwards and downwards along the display screen of the computer 500 by moving the hand 300 along the direction 406.
  • the translation of the hand 300 forward moves the mouse cursor upwards
  • the translation of the hand 300 rearwards moves the mouse cursor downwards.
  • the speed of the cursor movement corresponds, or is proportional, to at least one of the velocity and acceleration of the translation of the hand 300.
  • the MEMS sensors 204 and proximity sensors are used to detect the parameters associated with the motions.
  • the microprocessor 202 processes the data and transmits an input signal to the computer 500 for moving the mouse cursor (upwards and downwards) on the computer 500.
  • an advantage of the fourth basic gesture is that it provides users with an alternative method for the vertical panning.
  • Some users may confuse the clicking gesture as depicted in FIG. 5 with the vertical panning gesture as depicted in FIG. 6, because the finger movements are substantially similar; the fourth basic gesture avoids this ambiguity. This also helps the users to avoid unintended vertical panning of the cursor when the user wants to click on an icon. Likewise, it also helps the users to avoid unintended clicking of an icon when the user wants to move the mouse cursor.
  • Another advantage of providing the fourth basic gesture for vertical panning using the device 100 is that it allows users with an injured index finger 304 to perform the vertical panning function.
  • a user who has sprained his index finger 304 will not be able to fully straighten the index finger 304. As such, the user will not be able to make the third basic gesture as depicted in FIG. 6. Therefore, the user needs to rely on the fourth basic gesture as depicted in FIG. 7 to perform the vertical panning function.
  • FIG. 8 shows a fifth basic gesture, or a highlighting gesture.
  • the user can move the mouse cursor to the desired location using the device 100 and the panning gestures described above. For example, the user may intend to highlight a paragraph of text on a document for subsequent copying and pasting.
  • the user moves the mouse cursor to the beginning of the text using the panning gestures, while keeping the index finger 304 and the middle finger 306 separated. Once the mouse cursor is at the beginning of the text, the user then joins the index finger 304 and the middle finger 306 together.
  • the right proximity sensor 106 of the device 100 senses this parameter of closing of the fingers and the microprocessor 202 registers this.
  • the microprocessor 202 processes the data and transmits an input signal to the computer 500 for starting the highlighting of the text.
  • the user continues to move the mouse cursor using the panning gestures, until the mouse cursor reaches the end of the text. Through this motion, the paragraph of text will be highlighted by the user. The user then ends the highlighting by separating the index finger 304 and the middle finger 306. The text remains highlighted as seen on the display screen and the user can perform subsequent functions on the highlighted text, such as copy and paste.
  • the fifth basic gesture of FIG. 8 may also be termed as a drag-and-drop gesture.
  • the user may intend to move an icon from a first area to a second area of the display screen.
  • the user selects the icon by closing the index finger 304 and the middle finger 306.
  • the user then moves the mouse cursor from the first area to the second area using the panning gestures.
  • the icon moves or is dragged along with the mouse cursor from the first area to the second area.
  • the user separates the index finger 304 and the middle finger 306, thereby affixing the icon at the second area, or "dropping" the icon.
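  • In effect, joining the fingers plays the role of a button-down event and separating them a button-up event; a sketch of this mapping, which combined with the panning gestures reproduces a standard mouse drag (the signal names are assumed for illustration):

```python
class HighlightDrag:
    """Fingers joined acts as 'button down'; fingers separated as 'button up'."""

    def __init__(self, send):
        self.send = send                 # callable transmitting an input signal
        self.joined = False

    def update(self, fingers_joined: bool):
        if fingers_joined and not self.joined:
            self.send("button_down")     # begin highlighting / pick up the icon
        elif not fingers_joined and self.joined:
            self.send("button_up")       # end highlighting / drop the icon
        self.joined = fingers_joined
```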
  • there is also a scrolling gesture that can be made by the user. For example, when the user is reading an article with multiple pages, he may prefer to lean back and scroll through the pages. If he were using a standard mouse, he would need to reach out to the mousing surface and click or scroll continually.
  • the user can use the device 100 and perform the scrolling function without having to reach the mousing surface.
  • the user can read the article while leaning back and wearing the device 100.
  • Upon reaching the end of one page of the article, the user can activate the scrolling function by closing the index finger 304 and the middle finger 306 together.
  • the scrolling may be in discrete pages or continuous scrolling, depending on the format of the article.
  • the right proximity sensor 106 of the device 100 senses this parameter of closing of the fingers and the microprocessor 202 registers this.
  • the microprocessor 202 processes the data and transmits an input signal to the computer 500 for starting the scrolling of the article.
  • With the index finger 304 and the middle finger 306 together, the user scrolls through the article by rotating both fingers 304 and 306 downwards.
  • the user can scroll upwards by rotating both fingers 304 and 306 upwards.
  • the motion of rotating the fingers 304 and 306 is similar to the vertical panning gesture as depicted above with reference to FIG. 6.
  • there is a predetermined threshold angle through which the fingers 304 and 306 must rotate before the scrolling function is performed. For example, the fingers 304 and 306 must rotate downwards beyond a predetermined threshold angle of 15 degrees before there is any scrolling of the article.
  • the scrolling function can be stopped by separating the index finger 304 and the middle finger 306.
  • the predetermined threshold angle is programmable or adjustable by the user as different users have different preferences on the appropriate angle to commence the scrolling function.
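  • A sketch of this threshold behaviour, using the 15-degree example above as the default; the steps-per-degree ratio is an assumed tuning value:

```python
class Scroller:
    """Scrolling begins only after the joined fingers rotate past a threshold."""

    def __init__(self, threshold_deg: float = 15.0):
        self.threshold_deg = threshold_deg   # programmable by the user

    def scroll_steps(self, rotation_deg: float) -> int:
        """Positive rotation (downwards) scrolls down; negative scrolls up."""
        if abs(rotation_deg) <= self.threshold_deg:
            return 0
        excess = abs(rotation_deg) - self.threshold_deg
        steps = 1 + int(excess // 5)         # assumed: one step per 5 degrees
        return steps if rotation_deg > 0 else -steps
```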
  • the wearable input device 100 is usable in conjunction with the keyboard.
  • One advantage of the device 100 is that the user can wear the device 100 and perform the standard mouse functions, without having to grab and operate a mouse device located separately from the keyboard.
  • the user wears the device 100 on his index finger 304.
  • the ends of the fingers (302, 304, 306, 308, and 310) of his right hand 300, as well as his left hand, are free to operate on other input devices.
  • the user can efficiently type on the keyboard while wearing the device 100.
  • when the user decides to switch to controlling the mouse cursor, he can simply stop typing on the keyboard and proceed to control the mouse cursor with the device 100 worn on his index finger 304.
  • the right hand 300 and left hand can remain on the keyboard, or hovering slightly above the keyboard, while the user operates the device 100.
  • the user can switch from "keyboard mode" to "mouse mode" without having to move his right hand 300 to another location on the desktop surface. This reduces the distance that the user needs to move his right hand 300 and thereby reduces the stresses and strains in his arm.
  • the wearable input device 100 provides a feature termed as "ghost elimination". This feature configures the microprocessor 202 to automatically deactivate transmission of at least some input signals to the computer 500 in response to the microprocessor 202 receiving a deactivation signal. All commands, functions, and gestures, which are related to the at least some input signals, operated from the wearable input device 100 will cease once the microprocessor 202 receives the deactivation signal, essentially placing the device 100 in a "suspended mode". This "suspended mode" of the device 100 is akin to the user's hand not holding a standard mouse.
  • the user makes a clicking gesture with his hand 300.
  • the movement of the finger 304 along the direction 404 is within the predetermined threshold angular velocity and/or acceleration ranges.
  • the microprocessor 202 registers the clicking gesture and transmits an input signal to the computer 500, the input signal being a clicking function.
  • the microprocessor 202 receives a deactivation signal as a result of the clicking gesture, such that the deactivation signal deactivates the transmission of all input signals related to cursor movement.
  • the microprocessor 202 will not transmit input signals that are associated with the panning gestures. This is to ensure that the clicking functions do not unnecessarily introduce cursor movement which could result in erroneous selection of adjacent icons instead of the intended icons.
  • the deactivation signal is received by the microprocessor 202 when the user is making a deactivation gesture with his hand 300.
  • the plurality of sensors (104, 106, and 108) and the MEMS sensors 204 detect the deactivation gesture and the microprocessor 202 consequently receives the deactivation signal.
  • the receiving of the deactivation signal by the microprocessor 202 places the wearable input device 100 in the "suspended mode", terminating all functionalities therefrom.
  • the deactivation signal may also be received by the microprocessor 202 when the user is using a second input device that is communicatively coupled to the computer 500.
  • the user is operating the wearable input device 100 with his right hand 300 hovering above the keyboard.
  • the deactivation signal is received by the microprocessor 202.
  • the deactivation signal may be transmitted from the computer 500 to the microprocessor 202 in response to the computer 500 detecting that the keyboard is being typed on. Once a key of the keyboard is actuated on by the user, the computer 500 registers this as an input signal from the keyboard and transmits the deactivation signal to the microprocessor 202.
  • the receiving of the deactivation signal by the microprocessor 202 places the wearable input device 100 in the "suspended mode", terminating all functionalities therefrom.
  • the sensors of the device 100 may directly detect that the motions of the user's fingers and hands correspond to that of typing or tapping on a keyboard.
  • the sensors of the device 100 may comprise the MEMS sensors 204 and the proximity sensors (104, 106, and 108), or may also comprise other sensors known to the person having ordinary skill in the art.
  • the sensors of the device 100 Upon detection of the user using the keyboard, the sensors of the device 100 transmit the deactivation signal to the microprocessor 202.
  • the receiving of the deactivation signal by the microprocessor 202 places the wearable input device 100 in the "suspended mode", terminating all functionalities therefrom.
  • the microprocessor 202 automatically deactivates the transmission of the input signals to the computer 500 in response to receiving the deactivation signal, wherein the deactivation signal results from the user making a deactivation gesture and/or typing on the keyboard while wearing the device 100.
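  • A sketch of this "ghost elimination" flow; the idle timeout used to decide that typing has stopped is an assumed parameter, since the disclosure only states that reactivation follows the end of typing:

```python
import time

class GhostElimination:
    """Suspend the ring's mouse signals while the keyboard is in use."""

    def __init__(self, idle_timeout_s: float = 1.0):
        self.idle_timeout_s = idle_timeout_s
        self.last_keystroke = float("-inf")

    def on_keystroke(self):
        """Each keystroke acts as (or renews) the deactivation signal."""
        self.last_keystroke = time.monotonic()

    def mouse_signals_enabled(self) -> bool:
        """Signals resume automatically once typing has been idle long enough."""
        return time.monotonic() - self.last_keystroke > self.idle_timeout_s
```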
  • the user does not need to physically or manually actuate any switch or button to deactivate the transmission of the input signals, as is disclosed in prior art products.
  • the deactivation of the transmission of the input signals occurs automatically in response to the user making a deactivation gesture and/or beginning the use of the second input device, i.e. typing on the keyboard.
  • An advantage is that the user does not need to consciously and/or intentionally actuate a switch or button to "switch off" the device 100.
  • the "switching off" of the device 100 can occur automatically depending on what gesture the user is making or actions the user is doing, i.e. whether the user is using the second input device or keyboard.
  • the functionalities of the wearable input device 100 are automatically reactivated when the microprocessor 202 stops receiving the deactivation signal.
  • the deactivation signal can be terminated when the user is making a reactivation gesture with his hand 300.
  • the plurality of sensors (104, 106, and 108) and the MEMS sensors 204 detect the reactivation gesture and stop the transmission of the deactivation signal.
  • the microprocessor 202 consequently stops receiving the deactivation signal.
  • the termination of the deactivation signal returns the wearable input device 100 into active mode, reactivating all functionalities therefrom.
  • the deactivation signal may also be terminated with the user stopping the use of the second input device, i.e. the typing on the keyboard.
  • the computer 500 detects that the keyboard is no longer being typed on and stops the transmission of the deactivation signal.
  • the sensors of the device 100 may directly detect that the user's fingers and hands do not correspond to the motions of typing or tapping on a keyboard.
  • the sensors of the device 100 stop the transmission of the deactivation signal to the microprocessor 202.
  • the termination of the deactivation signal returns the wearable input device 100 into active mode, reactivating all functionalities therefrom.
  • the microprocessor 202 automatically reactivates the transmission of the input signals to the computer 500 in response to the deactivation signal being no longer received by the microprocessor 202, i.e. the termination of the deactivation signal.
  • the termination of the deactivation signal results from the user stopping the typing on the keyboard while wearing the device 100.
  • the user does not need to physically or manually actuate any switch or button to reactivate the transmission of the input signals, as is disclosed in prior art products.
  • the reactivation of the transmission of the input signals occurs automatically in response to the user stopping the use of the second input device, i.e. stopping the typing on the keyboard.
  • An advantage is that the user does not need to consciously and/or intentionally actuate a switch or button to "switch on" the device 100.
  • the "switching on" of the device 100 can occur automatically depending on what actions the user is doing, i.e. whether the user is using the second input device or keyboard.
  • a more pragmatic advantage of this "ghost elimination" feature is that while the user wearing the device 100 is typing on the keyboard, the movements of his fingers and hands do not cause the mouse cursor to move or click. This allows the user to type on the keyboard freely and efficiently, without having to worry about unintentional movements or clicks of the mouse cursor, because all commands, functions, and gestures from the device 100 are terminated during the typing on the keyboard.
  • Another practical advantage is that the user only needs to commence and stop typing on the keyboard to alternate between "keyboard mode" (i.e. the device 100 is "switched off") and "mouse mode" (i.e. the device 100 is "switched on"), without having to physically or manually actuate a separate switch or button to perform the same alternation of modes.
  • once the user stops typing on the keyboard, the deactivation signal is no longer received by the microprocessor 202. The user can thus continue to use the device 100 normally because all commands, functions, and gestures have been reactivated.
  • the user may make some advanced gestures with the wearable input device 100. These advanced gestures may be applicable to specific functionalities that are unique to various software applications running on the computer 500.
  • the advanced gestures are developed to further complement the other features of the device 100, such as the basic gestures.
  • the advanced gestures are doable by the user with natural hand and finger movements for performing the different functions on the computer 500. More details on the advanced gestures and the functions associated therewith are described hereinafter.
  • the wearable input device 100 provides a first advanced gesture for quickly scrolling across a wide area of the display screen.
  • the gesture is similar to the basic panning gestures depicted in FIG. 4 and FIG. 7.
  • instead of moving the right hand 300 in a smooth motion, the user applies a jerk to the right hand 300 in the intended direction.
  • the microprocessor 202 detects the "jerk" movement of the hand 300 and registers it as the first advanced gesture.
  • the mouse cursor moves quickly in the intended direction beyond the normal range of the hand movement (if performed with the basic panning gestures). Towards the end of the movement, the mouse cursor decelerates and slows down until it remains stationary.
  • An advantage of the first advanced gesture is that it allows the user to pan across the display screen if the user has very limited area for hand movement, such as within a small cluttered desktop space. It also allows the user to pan across multiple display screens quickly without requiring the hand to move through a large area on the desktop space.
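  • The decelerating cursor motion after a jerk resembles an inertial "glide"; a sketch, with illustrative decay values that are not taken from the disclosure:

```python
def glide(initial_speed_px: float, decay: float = 0.9, stop_below: float = 1.0):
    """Yield per-frame cursor displacements after a detected 'jerk'.

    The cursor keeps moving in the jerk's direction and decelerates
    exponentially until it comes to rest.
    """
    speed = initial_speed_px
    while abs(speed) >= stop_below:
        yield round(speed)
        speed *= decay

# Example: total distance covered by a rightward jerk of 40 px/frame.
total_px = sum(glide(40.0))
```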
  • the wearable input device 100 provides a second advanced gesture for waking up the device 100. After a period of non-use, the device 100, and perhaps also the computer 500, may fall into a sleep mode or screensaver mode. One indication of such a sleep mode or screensaver mode is that the mouse cursor is not visible on the display screen.
  • the second advanced gesture wakes the computer 500 and device 100 and returns them to active mode.
  • the user makes the second advanced gesture by forming a fist with his right hand 300, as if he is grabbing a horizontal bar with an overhand grip.
  • the user next rotates his fist upwards about his wrist, and then rotates his fist downwards about his wrist.
  • the angle of rotation between the upward position and the downward position of the fist is at least 90 degrees.
  • the completion of the rotation of his fist completes the second advanced gesture and wakes up the device 100 and/or the computer 500.
  • the user may intentionally want to put the wearable input device 100 and/or computer 500 into the sleep mode or screensaver mode, i.e. locking.
  • the user can make a third advanced gesture for the locking.
  • the user makes the third advanced gesture by forming a fist with his right hand 300, as if he is grabbing a horizontal bar with an overhand grip.
  • the user next rotates his fist by twisting his arm.
  • the angle of rotation is at least 90 degrees in either clockwise or anti-clockwise direction.
  • the completion of the rotation of his fist completes the third advanced gesture and locks the device 100 and/or the computer 500.
  • the unlocking or waking up of the device 100 and/or computer 500 can be done with the second advanced gesture described above.
  • the third advanced gesture, i.e. the locking gesture, may thus serve as the deactivation gesture described above, while the second advanced gesture, i.e. the unlocking gesture, may serve as the reactivation gesture described above which, when made by the user, causes the termination of the deactivation signal, thereby preventing the microprocessor 202 from receiving the deactivation signal, and consequently reactivating the transmission of the input signals to the computer 500.
  • Other advanced gestures are possible to perform other functions.
  • a fourth advanced gesture could be substantially similar to the second basic gesture or clicking gesture as depicted in FIG. 5.
  • the fourth advanced gesture would have a difference in that the index finger 304 is in contact with the middle finger 306, although a substantially similar clicking gesture is made.
  • a fifth advanced gesture would have the user form a fist with his right hand 300, as if he is holding a cup.
  • the next step of the fifth advanced gesture would be for the user to rotate his fist, in either clockwise or anti-clockwise direction, about a vertical axis through his wrist.
  • the user can also configure or program the device 100 with other gestures for performing even more functions on the computer 500. By knowing the nature of the movement of the fingers and hands and the locations of the various sensors of the device 100, the user can define and develop new and different gestures for other functions.
  • the user can also break down the existing gestures into different smaller individual actions, or re-combine these smaller individual actions to form different gestures.
  • the user can thus configure or program the device 100 to perform an extended range of functions on the computer 500 based on these newly defined and developed gestures.
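  • One way to model user-defined gestures is as sequences of smaller primitive actions, as suggested above; the primitive names and the persistence-free library below are assumptions for the sketch, with storage to the device's memory module omitted:

```python
# Hypothetical primitive actions recoverable from the device's sensors.
PRIMITIVES = {"finger_down", "finger_up", "fingers_joined", "fingers_apart",
              "hand_left", "hand_right", "fist"}

class GestureLibrary:
    """Store user-defined gestures as (primitive sequence, function) pairs."""

    def __init__(self):
        self._gestures = {}

    def define(self, name, sequence, function):
        if not set(sequence) <= PRIMITIVES:
            raise ValueError("unknown primitive action")
        self._gestures[name] = (tuple(sequence), function)

    def match(self, observed):
        """Return (name, function) if the observed sequence is a known gesture."""
        for name, (seq, function) in self._gestures.items():
            if tuple(observed) == seq:
                return name, function
        return None

lib = GestureLibrary()
lib.define("drag", ["fingers_joined", "hand_right", "fingers_apart"], "drag_icon")
```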
  • the wearable input device 100 is configurable to be operated in different modes for suiting different user preferences and/or working environments. Each operational mode may allow only a portion of the gestures to perform their associated functions on the computer 500, while ceasing the functions associated with the other gestures.
  • the user may switch between modes by performing certain actions and/or gestures.
  • the user can select his preferred operational mode, which disables unnecessary functional gestures that could otherwise interfere with the user's work or operation of the device 100.
  • the user may want to select an operational mode that only allows clicking and panning, while disabling the highlighting and drag-and-drop functions.
  • Other non-limiting examples of operational modes are described hereinafter.
  • the user can switch from "mouse mode" to "keyboard mode" by typing on the keyboard.
  • the user may also switch to a "touchscreen mode" by making a gesture.
  • the user has a display screen with touch-sensitive surfaces.
  • the "touchscreen mode” is usually activated when the user intends to work in a more comfortable posture and orientation, while wearing the device 100 on his index finger 304.
  • the gesture to switch to the "touchscreen mode” would be, for example, having the user rotate his hand 300 upwards about his wrist until the palm is directly facing the display screen.
  • the position of the palm parallel to the display screen is the optimal position for using the display screen as a touchscreen.
  • the microprocessor 202 will register this as the user's intention to use the device 100 in the "touchscreen mode".
  • all basic gestures of the device 100 function normally. The user may continue to use the touchscreen without his fingers actually contacting the touchscreen.
  • the device 100 allows the user to perform the usual navigational functions on the touchscreen by transmitting input signals to the touchscreen, which functions like the computer 500.
  • the user can operate the touchscreen using the device 100 without any physical touch while maintaining the intuitive input commands unique to touchscreen devices.
  • the user may want to use the touchscreen as per normal, i.e. with physical touch, while wearing the device 100.
  • the user navigates around the touchscreen with his fingers in physical contact with the touchscreen.
  • the movements of his fingers and hand 300 may cause unintended cursor movements on the touchscreen.
  • the user may be able to configure or program the device 100 to cease the functions of the basic gestures if the device 100 is operating in the "touchscreen mode".
  • One reason for this is that the user may not want unintentional movement or clicking of the mouse cursor while he is working on the touchscreen.
  • the user can also configure or program other gestures to perform the switching of the operation modes of the device 100.
  • the user is able to configure or program various features of the wearable input device 100 to suit his preferences.
  • the programmable features include the threshold values or parameters, movements of a gesture, definition and development of new gestures, the function performable by a gesture, methods of switching the operation modes of the device 100, and the algorithms and logics programming of the microprocessor 202.
  • the capability to configure and program the various features allows the user to customize the device 100 to his preferred settings, thereby enhancing user experience.
  • the configuration and programming of at least some of the aforementioned features may be performed by the user on the computer 500 using a software application.
  • the configured or programmed features may be stored or saved in memory.
  • the wearable input device 100 comprises a memory module for storing or saving the configured features, such as the gestures programmed by the user.
  • An advantage of saving these features within the device 100 is that the user can move the device 100 for use with another computer 500, and maintain his desired preferences.
  • the device 100 is intended to be small and portable for ease of moving around, and as such it would be advantageous for the preferences of the user to follow the device 100.
  • the configured or programmed features may be stored or saved in a memory module on the computer 500.
  • the wearable input device 100 of the preferred embodiment further comprises an alert module, such as a vibrating mechanism or a micro-vibrator, for providing alerts to the user.
  • Other types of alerts may also be utilized, such as visual lights and audible sounds, as known to the skilled person.
  • These alerts could include prompts from the computer 500 and reminders to the user.
  • An example of an alert would be to warn the user of the battery 208 running low.
  • Another example would be to alert the user that a gesture made by the user is not recognized by the microprocessor 202 and thus is unable to perform the intended function on the computer 500.
  • while the preferred embodiment specifies the wearable input device 100 to be a ring wearable on the index finger 304 of the user, there can be other alternative embodiments.
  • in an alternative embodiment, the wearable input device 100 is a band wearable across both the index finger 304 and the middle finger 306 of the user.
  • the device 100 may also be usable with a gaming console or other computing devices, instead of a normal personal computer or laptop.
  • the device 100 may be applicable to television controllers, especially those that are internet enabled and/or with a television screen menu. It would be apparent to the person having ordinary skill in the art that the gestures, their associated functions, and other features described above apply analogously to the alternative embodiments, based on the details of the preferred embodiment of the present disclosure.
Embodiments of the present disclosure have been described above. The present disclosure serves to address at least some prior art problems and issues by enhancing the utility value and easing the adoption cycle of a wearable input device that may be classified as a disruptive innovation. As the present disclosure relates to interaction between humans and computers, and also to ergonomics, it would be apparent to the skilled person that Fitts' Law is applicable to embodiments of the present disclosure.
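
For reference, a common statement of Fitts' Law (the Shannon formulation, a standard result in human-computer interaction rather than a detail of this disclosure) predicts the movement time MT needed to acquire a target of width W at distance D as

MT = a + b·log₂(D/W + 1)

where a and b are empirically determined constants. Smaller or more distant on-screen targets therefore take longer to acquire with any pointing device, including one operated by the gestures described here.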

Abstract

The present disclosure concerns a wearable input device for communicating with a computer. The wearable input device comprises a body shaped to be worn on a hand of a user, a plurality of sensors for detecting gestures performed by the user with at least part of the hand, and a microprocessor communicatively coupled to the plurality of sensors. The microprocessor is further for transmitting input signals to the computer in response to detection of the gestures. Each gesture is associated with one of the input signals, and each of the input signals is in turn associated with one of a plurality of functions executable on the computer. The microprocessor is configured to automatically disable the transmission of at least some of the input signals to the computer in response to the microprocessor receiving a deactivation signal. An advantage of the present disclosure is that the wearable input device can be worn on the hand of the user and used as an input device for the computer. The device also provides functions that can be operated based on the gestures performed by the user's hand, thereby allowing the user to use natural finger gestures to operate the device. Furthermore, the device can be deactivated upon receipt of the deactivation signal by the microprocessor. Advantageously, the user can, at his discretion, deactivate the device or power it off when he decides to stop using it.
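
Purely as an illustrative sketch of the deactivation behaviour summarized in the abstract (the class and method names below are invented; no implementation is specified), the microprocessor can simply stop forwarding input signals once the deactivation signal is received:

```python
# Names here are invented; the abstract does not specify an implementation.
class Microprocessor:
    def __init__(self, link):
        self.link = link   # e.g. a wireless transport to the computer
        self.active = True

    def on_deactivation_signal(self):
        # Receiving the deactivation signal automatically disables
        # transmission of at least some input signals.
        self.active = False

    def transmit(self, input_signal):
        if self.active:
            self.link.send(input_signal)
```
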
PCT/SG2014/000321 2014-07-04 2014-07-04 Wearable input device WO2016003365A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/SG2014/000321 WO2016003365A1 (fr) 2014-07-04 2014-07-04 Wearable input device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/SG2014/000321 WO2016003365A1 (fr) 2014-07-04 2014-07-04 Wearable input device

Publications (1)

Publication Number Publication Date
WO2016003365A1 (fr) 2016-01-07

Family

ID=55019734

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2014/000321 WO2016003365A1 (fr) 2014-07-04 2014-07-04 Wearable input device

Country Status (1)

Country Link
WO (1) WO2016003365A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050179644A1 (en) * 2002-03-12 2005-08-18 Gunilla Alsio Data input device
US20050052414A1 (en) * 2003-09-08 2005-03-10 Samsung Electronics Co., Ltd. Pointing apparatus and method
US20080040951A1 (en) * 2005-06-21 2008-02-21 Lawrence Kates System and method for wearable electronics
EP2733578A2 (fr) * 2012-11-20 2014-05-21 Samsung Electronics Co., Ltd User gesture input for a wearable electronic device involving movement of the device
US20140176439A1 (en) * 2012-11-24 2014-06-26 Eric Jeffrey Keller Computing interface system

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018027217A1 (fr) * 2016-08-05 2018-02-08 Avery Dennison Retail Information Services, Llc Wearable NFC device for secure data interaction
KR102391189B1 (ko) Wearable NFC device for secure data interaction
EP3979668A1 (fr) Wearable NFC device for secure data interaction
KR20190037250A (ko) Wearable NFC device for secure data interaction
WO2018151449A1 (fr) Electronic device and methods for determining orientation of the device
EP3538975A4 (fr) Electronic device and methods for determining orientation of the device
US10572012B2 (en) 2017-02-17 2020-02-25 Samsung Electronics Co., Ltd Electronic device for performing gestures and methods for determining orientation thereof
US11237640B2 (en) 2017-06-09 2022-02-01 Microsoft Technology Licensing, Llc Wearable device enabling multi-finger gestures
EP3635511A4 (fr) Wearable device enabling multi-finger gestures
WO2018223397A1 (fr) Wearable device enabling multi-finger gestures
EP4145254A1 (fr) Wearable device enabling multi-finger gestures
EP3441849A1 (fr) Pointing input device for a personal computer based on finger movements
US11842432B2 (en) * 2017-09-29 2023-12-12 Sony Interactive Entertainment Inc. Handheld controller with finger proximity detection
WO2022142807A1 (fr) * 2020-12-30 2022-07-07 Huawei Technologies Co.,Ltd. Wearable devices, methods and media for multi-finger mid-air gesture recognition
US11573642B2 (en) 2020-12-30 2023-02-07 Huawei Technologies Co., Ltd. Wearable devices, methods and media for multi-finger mid-air gesture recognition
CN114123494A (zh) * 2021-11-11 2022-03-01 国网冀北电力有限公司智能配电网中心 Online state estimation method for power distribution terminals
CN114123494B (zh) Online state estimation method for power distribution terminals
CN116523914A (zh) Aneurysm classification and identification apparatus, method, device, and storage medium
CN116523914B (zh) Aneurysm classification and identification apparatus, method, device, and storage medium

Similar Documents

Publication Publication Date Title
WO2016003365A1 (fr) Wearable input device
US10613685B2 (en) Rejection of false turns of rotary inputs for electronic devices
US9041651B2 (en) Multi-touch mouse
KR101793566B1 (ko) Remote controller, information processing method and system
US8125448B2 (en) Wearable computer pointing device
US9008725B2 (en) Strategically located touch sensors in smartphone casing
US10042438B2 (en) Systems and methods for text entry
EP2720129A1 (fr) Strategically located touch sensors in a mobile phone casing
KR20140138361A (ko) Loop-shaped tactile multi-touch input device, gestures, and method therefor
US20150193023A1 (en) Devices for use with computers
JP6194355B2 (ja) Improvements in devices for use with computers
KR102297473B1 (ko) Apparatus and method for providing touch input using the body
JP2011077863A (ja) Remote control device, remote control system, remote control method, and program
US20140253453A1 (en) Computer Display Object Controller
KR200401975Y1 (ko) Computer control device
US20150049020A1 (en) Devices and methods for electronic pointing device acceleration
EP3123277B1 (fr) Computing device
JP2009187153A (ja) Computer input device, computer, information processing system, computer input signal processing method, input signal processing program, and recording medium
US20110224919A1 (en) Airflow-sensing computer cursor generator and airflow-sensing mouse
US20160139628A1 (en) User Programable Touch and Motion Controller
KR100997840B1 (ko) Interface device operable by finger contact
US20110216024A1 (en) Touch pad module and method for controlling the same
KR102659301B1 (ko) Smart ring and smart ring control method
WO2014041548A1 (fr) System and method for controlling cursor behavior
KR20170130989A (ko) Eyeball mouse

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14896369

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14896369

Country of ref document: EP

Kind code of ref document: A1