WO2022204614A1 - Wrist-wearable controller for head-mounted device (HMD) or other user device with camera input - Google Patents

Wrist-wearable controller for head-mounted device (HMD) or other user device with camera input

Info

Publication number
WO2022204614A1
Authority
WO
WIPO (PCT)
Prior art keywords
wrist
user
control device
wearable control
sensors
Application number
PCT/US2022/027315
Other languages
French (fr)
Inventor
Xiang Li
Original Assignee
Innopeak Technology, Inc.
Application filed by Innopeak Technology, Inc.
Priority to PCT/US2022/027315
Publication of WO2022204614A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 - Hand-worn input/output arrangements, e.g. data gloves
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212 - Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 - Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • The present disclosure relates, in general, to methods, systems, and apparatuses for implementing user interface ("UI") controls, and, more particularly, to methods, systems, and apparatuses for implementing a wrist-wearable controller for a head-mounted device ("HMD") or other user device with camera input.
  • The techniques of this disclosure generally relate to tools and techniques for implementing user interface ("UI") controls, and, more particularly, to methods, systems, and apparatuses for implementing a wrist-wearable controller for a head-mounted device ("HMD") or other user device with camera input.
  • a wrist-wearable control device may comprise one or more first sensors that are configured to detect gestures of one or more fingers of a user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device; one or more second sensors that are configured to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation; at least one processor; and a non-transitory computer readable medium communicatively coupled to the at least one processor.
  • the non-transitory computer readable medium may have stored thereon computer software comprising a set of instructions that, when executed by the at least one processor, causes the wrist-wearable control device to: analyze first sensor data to identify a first gesture corresponding to movement of one or more fingers of the user, the first sensor data being received from the one or more first sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; analyze second sensor data to detect motion of the wrist-wearable control device with respect to the at least one axis of rotation among the three axes of rotation, the second sensor data being received from the one or more second sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; and based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device ("HMD") or other user device with camera input, generate first instructions based on the first gesture-based command and send the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device.
  • a method may comprise analyzing, using a computing system, first sensor data to identify a first gesture corresponding to movement of one or more fingers of a user, the first sensor data being received from one or more first sensors disposed on a wrist-wearable control device when the wrist-wearable control device is being worn by the user, the one or more first sensors being configured to detect gestures of the one or more fingers of the user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device.
  • the method may further comprise analyzing, using the computing system, second sensor data to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation, the second sensor data being received from one or more second sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; and based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device (“HMD”) or other user device with camera input, generating, using the computing system, first instructions based on the first gesture-based command and sending, using the computing system, the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device.
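To make the claimed processing flow concrete, below is a minimal Python sketch of the two-stage analysis the method describes: classify a finger micro-gesture from the first (biosensor) data, estimate rotation from the second (IMU) data, and send the resulting instructions to the user device. Everything here is an illustrative assumption (the threshold classifier, the names GestureEvent and control_step, the command strings); the disclosure does not specify an implementation.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class GestureEvent:
    name: str          # e.g., "swipe_up" or "pinch"
    confidence: float  # classifier confidence in [0, 1]

def analyze_first_sensor_data(biosensor_samples: list) -> Optional[GestureEvent]:
    """Stand-in micro-gesture classifier over biosensor (e.g., PPG/EMG) samples.
    A real device would run a trained model; an energy threshold keeps the
    sketch self-contained."""
    energy = sum(s * s for s in biosensor_samples)
    return GestureEvent("swipe_up", 0.9) if energy > 1.0 else None

def analyze_second_sensor_data(imu_samples: list) -> dict:
    """Stand-in motion estimate: mean angular rate about the three axes of rotation."""
    if not imu_samples:
        return {"roll": 0.0, "pitch": 0.0, "yaw": 0.0}
    n = len(imu_samples)
    roll, pitch, yaw = (sum(axis) / n for axis in zip(*imu_samples))
    return {"roll": roll, "pitch": pitch, "yaw": yaw}

def control_step(biosensor_samples, imu_samples, send: Callable[[str], None]) -> None:
    """One pass of the claimed flow: identify a micro-gesture, detect wrist
    motion, resolve a gesture-based command, and send the generated
    instructions to the HMD or other user device."""
    gesture = analyze_first_sensor_data(biosensor_samples)
    motion = analyze_second_sensor_data(imu_samples)
    if gesture is not None and gesture.confidence > 0.8:
        # Combine the micro-gesture with wrist rotation to pick a command.
        if gesture.name == "swipe_up":
            command = "scroll_up" if motion["roll"] >= 0.0 else "scroll_left"
        else:
            command = "select"
        send(command)  # the generated "first instructions" for the UI

control_step([1.2, -0.8, 0.9], [(0.1, 0.0, 0.0)], print)
```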
  • Fig. 1 is a schematic diagram illustrating a system for implementing a wrist- wearable control device for controlling a user interface ("UI") of a head-mounted device (“HMD”) or other user device with camera input, in accordance with various embodiments.
  • Figs. 2A-2P are schematic diagrams illustrating various non-limiting examples of a wrist-wearable control device for controlling a UI of a HMD or other user device with camera input, in accordance with various embodiments.
  • Figs. 3A and 3B are schematic diagrams illustrating various non-limiting examples of the use of a wrist-wearable control device for controlling a UI of a HMD or other user device with camera input, in accordance with various embodiments.
  • Fig. 4 is a flow diagram illustrating a method for implementing a wrist-wearable control device for controlling a UI of a HMD or other user device with camera input, in accordance with various embodiments.
  • Fig. 5 is a block diagram illustrating an example of computer or system hardware architecture, in accordance with various embodiments.
  • Various embodiments provide tools and techniques for implementing user interface ("UI") controls, and, more particularly, methods, systems, and apparatuses for implementing a wrist-wearable controller for a head-mounted device ("HMD") or other user device with camera input.
  • a wrist-wearable control device may comprise one or more first sensors that are configured to detect gestures of one or more fingers of a user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device; one or more second sensors that are configured to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation; at least one processor; and a non-transitory computer readable medium communicatively coupled to the at least one processor.
  • the non-transitory computer readable medium may have stored thereon computer software comprising a set of instructions that, when executed by the at least one processor, causes the wrist-wearable control device to: analyze first sensor data to identify a first gesture corresponding to movement of one or more fingers of the user, the first sensor data being received from the one or more first sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; analyze second sensor data to detect motion of the wrist-wearable control device with respect to the at least one axis of rotation among the three axes of rotation, the second sensor data being received from the one or more second sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; and based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device ("HMD") or other user device with camera input, generate first instructions based on the first gesture-based command and send the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device.
  • the wrist-wearable control device may comprise one of a wristwatch-based wearable control device or a wristband-based wearable control device, or the like, each comprising one or more band portions that may be linked to adjacent band portions by connectors in a manner configured to removably wrap the wrist-wearable control device around a wrist of the user.
  • at least one first sensor among the one or more first sensors and at least one second sensor among the one or more second sensors may be disposed within each of the one or more band portions.
  • the one or more first sensors and the one or more second sensors may each be disposed within only one band portion among the one or more band portions, wherein the band portion containing the one or more first sensors may be one of the same band portion containing the one or more second sensors or a separate band portion from the band portion containing the one or more second sensors.
  • at least one first sensor among the one or more first sensors may be disposed within each of the one or more band portions while the one or more second sensors may be disposed within only one band portion among the one or more band portions.
  • the HMD or other user device with camera input may comprise one of a set of virtual reality ("VR") goggles, a set of augmented reality ("AR") goggles, a set of mixed reality ("MR") goggles, a pair of VR-enabled eyewear, a pair of AR-enabled eyewear, a pair of MR-enabled eyewear, a VR-enabled smartphone mounted in a headset, an AR-enabled smartphone mounted in a headset, a MR-enabled smartphone mounted in a headset, a smart television ("TV") with built-in camera, a smart TV with externally connected camera, a gaming console with built-in camera, a gaming console with externally connected camera, a monitor with built-in camera, a monitor with externally connected camera, or a projector with externally connected camera, and/or the like.
  • the one or more first sensors may each comprise at least one biosensor.
  • the at least one biosensor may comprise at least one of one or more photoplethysmography (“PPG") sensors, one or more electromyography (“EMG”) sensors, one or more sound transducers, or one or more motion transducers, and/or the like, each configured to detect biological characteristics that correspond to motion of soft tissue in or near a wrist of the user on which the wrist-wearable control device is being worn.
  • the soft tissue may comprise at least one of one or more muscles, one or more tendons, or one or more blood vessels, and/or the like, in or near the wrist of the user.
  • the one or more PPG sensors may be configured to optically detect blood volume changes in the one or more blood vessels below skin tissue at or near the wrist of the user corresponding to movement of the one or more fingers of the user.
  • the one or more EMG sensors may be configured to detect electrical activity of the one or more muscles at or near the wrist of the user corresponding to movement of the one or more fingers of the user.
  • the one or more sound transducers may be configured to detect sounds within a body of the user corresponding to movement of the one or more fingers of the user.
  • the one or more motion transducers may be configured to detect motion of the one or more muscles, the one or more tendons, and the one or more blood vessels corresponding to movement of the one or more fingers of the user.
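As a toy illustration of how biosensor data of this sort might be reduced to discrete finger-movement events, the sketch below thresholds a sliding-window RMS envelope of a single EMG channel. The window length, threshold, and all names are invented for illustration; the disclosure does not prescribe a detection algorithm.

```python
import math
from collections import deque

class EmgGestureDetector:
    """Flags candidate finger movement when the RMS envelope of one EMG
    channel crosses a threshold (illustrative values only)."""

    def __init__(self, window: int = 64, threshold: float = 0.3):
        self.samples = deque(maxlen=window)
        self.threshold = threshold

    def push(self, sample: float) -> bool:
        """Feed one sample; return True while movement is detected."""
        self.samples.append(sample)
        rms = math.sqrt(sum(s * s for s in self.samples) / len(self.samples))
        return rms > self.threshold

detector = EmgGestureDetector()
stream = [0.01] * 60 + [0.8] * 20   # quiet baseline, then a burst of activity
events = [detector.push(s) for s in stream]
print(events.index(True))           # first sample index flagged as movement
```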
  • the one or more second sensors may each comprise at least one of an inertial measurement unit (“IMU") sensor or a gyroscope-based sensor, and/or the like.
  • the wrist-wearable control device may further comprise one or more band portions; and a plurality of outward-facing lights disposed on the one or more band portions, wherein the plurality of outward-facing lights may be arranged in a predetermined pattern on the one or more band portions and may serve as tracking points for one or more cameras of the HMD or the other user device to track translation movement of the wrist-wearable control device along three axes relative to the HMD or the other user device, wherein the tracking of translation movement of the wrist-wearable control device along the three axes may serve as additional input for controlling the UI of the HMD or the other user device.
  • the plurality of outward-facing lights may comprise at least one of a plurality of infrared (“IR”) light emitting diodes (“LEDs”) or a plurality of colored LEDs, and/or the like.
  • the first gesture-based command may comprise at least one of a swipe-based command, a drag-based command, a tap command, a double tap command, a point command, a pinch-based command, a clench-based command, a rotate command, a roll command, a pitch command, or a yaw command, and/or the like.
  • the first instructions may comprise at least one of a scroll command, a move command, a select command, a confirm command, a highlight command, a return command, a cancel command, a move cursor command, or a navigate command, and/or the like.
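One straightforward realization of the relationship between these two lists is a lookup table from recognized gesture-based commands to the UI instructions to be generated. The particular pairings below are illustrative assumptions; the disclosure enumerates both sets but does not fix a mapping.

```python
# Hypothetical gesture-to-instruction table; both columns use terms from the
# lists above, but the pairings themselves are assumptions for illustration.
GESTURE_TO_INSTRUCTION = {
    "swipe":      "scroll",
    "drag":       "move",
    "tap":        "select",
    "double_tap": "confirm",
    "point":      "move_cursor",
    "pinch":      "select",
    "clench":     "cancel",
    "rotate":     "navigate",
}

def first_instructions(gesture_command: str) -> str:
    """Resolve a recognized gesture-based command into a UI instruction."""
    return GESTURE_TO_INSTRUCTION.get(gesture_command, "ignore")

print(first_instructions("swipe"))  # -> "scroll"
```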
  • a computing system may analyze first sensor data to identify a first gesture corresponding to movement of one or more fingers of a user, the first sensor data being received from one or more first sensors disposed on a wrist-wearable control device when the wrist-wearable control device is being worn by the user, the one or more first sensors being configured to detect gestures of the one or more fingers of the user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device.
  • the computing system may analyze second sensor data to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation, the second sensor data being received from one or more second sensors disposed on the wrist-wearable control device when the wrist- wearable control device is being worn by the user. Based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device (“HMD”) or other user device with camera input, the computing system may generate first instructions based on the first gesture-based command and may send the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device.
  • the computing system may comprise at least one of one or more processors on the wrist-wearable control device, one or more processors on a mobile device that is communicatively coupled with the wrist-wearable control device, a machine learning system, an artificial intelligence (“AI”) system, a deep learning system, a neural network, a convolutional neural network (“CNN”), or a fully convolutional network (“FCN”), and/or the like.
  • the wrist-wearable control device may comprise one of a wristwatch-based wearable control device or a wristband-based wearable control device, and/or the like, each comprising one or more band portions that may be linked to adjacent band portions by connectors in a manner configured to removably wrap the wrist-wearable control device around a wrist of the user.
  • the one or more first sensors may each comprise at least one biosensor.
  • the at least one biosensor may comprise at least one of one or more photoplethysmography (“PPG") sensors, one or more electromyography (“EMG”) sensors, one or more sound transducers, or one or more motion transducers, and/or the like, each configured to detect biological characteristics that correspond to motion of soft tissue in or near a wrist of the user on which the wrist-wearable control device is being worn.
  • the soft tissue may comprise at least one of one or more muscles, one or more tendons, or one or more blood vessels, and/or the like, in or near the wrist of the user.
  • the one or more PPG sensors may be configured to optically detect blood volume changes in the one or more blood vessels below skin tissue at or near the wrist of the user corresponding to movement of the one or more fingers of the user.
  • the one or more EMG sensors may be configured to detect electrical activity of the one or more muscles at or near the wrist of the user corresponding to movement of the one or more fingers of the user.
  • the one or more sound transducers may be configured to detect sounds within a body of the user corresponding to movement of the one or more fingers of the user.
  • the one or more motion transducers may be configured to detect motion of the one or more muscles, the one or more tendons, and the one or more blood vessels corresponding to movement of the one or more fingers of the user.
  • the one or more second sensors may each comprise at least one of an inertial measurement unit (“IMU") sensor or a gyroscope-based sensor, and/or the like.
  • the wrist-wearable control device may further comprise one or more band portions; and a plurality of outward-facing lights disposed on the one or more band portions, wherein the plurality of outward-facing lights may be arranged in a predetermined pattern on the one or more band portions and may serve as tracking points for one or more cameras of the HMD or the other user device to track translation movement of the wrist-wearable control device along three axes relative to the HMD or the other user device, wherein the tracking of translation movement of the wrist-wearable control device along the three axes may serve as additional input for controlling the UI of the HMD or the other user device.
  • the plurality of outward-facing lights may comprise at least one of a plurality of infrared (“IR”) light emitting diodes (“LEDs”) or a plurality of colored LEDs, and/or the like.
  • Various embodiments provide a wrist-wearable control device for controlling a UI of a HMD or other user device with camera input (collectively, "user device").
  • This enables highly accurate six-degrees-of-freedom ("6DOF") control of the UI of the user device, by combining micro-gesture-based UI control (using the one or more first sensors or biosensors), motion (e.g., rotation and/or linear motion, etc.) detection-based UI control (using the at least one second sensor(s) or IMU sensor(s)), and translation detection-based UI control (using the one or more outward-facing lights as tracking points for camera(s) of the user device to serve as additional UI control input).
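A rough sketch of how such a 6DOF pose might be assembled: 3DOF orientation from the IMU plus 3DOF translation from the camera's tracking of the lights. The ZYX Euler convention and every name here are assumptions for illustration.

```python
import numpy as np

def six_dof_pose(imu_rpy_rad: tuple, camera_translation_m: np.ndarray) -> np.ndarray:
    """Build a 4x4 pose from IMU roll/pitch/yaw (radians) and the translation
    (meters) recovered by the user device's camera."""
    r, p, y = imu_rpy_rad
    cr, sr = np.cos(r), np.sin(r)
    cp, sp = np.cos(p), np.sin(p)
    cy, sy = np.cos(y), np.sin(y)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])  # yaw
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])  # pitch
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])  # roll
    pose = np.eye(4)
    pose[:3, :3] = Rz @ Ry @ Rx         # rotation: 3 degrees of freedom
    pose[:3, 3] = camera_translation_m  # translation: 3 degrees of freedom
    return pose

print(six_dof_pose((0.0, 0.0, np.pi / 2), np.array([0.1, 0.0, 0.5])))
```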
  • Various embodiments as described herein - while embodying (in some cases) software products, computer-performed methods, and/or computer systems - represent tangible, concrete improvements to existing technological areas, including, without limitation, wrist-wearable device technology, gesture control technology, gesture control technology for wrist-wearable devices, biosensor technology for gesture control, motion tracking technology for gesture control, user interface ("UI") control technology, UI control technology for wrist-wearable devices, 3DOF control technology of UI, 6DOF control technology of UI, and/or the like.
  • some embodiments can improve the functioning of user equipment or systems themselves (e.g., wrist-wearable devices, gesture control systems, gesture control systems for wrist-wearable devices, biosensor systems for gesture control, motion tracking systems for gesture control, UI control systems, UI control systems for wrist-wearable devices, 3DOF control systems of UI, 6DOF control systems of UI, etc.), for example, by analyzing, using a computing system, first sensor data to identify a first gesture corresponding to movement of one or more fingers of a user, the first sensor data being received from one or more first sensors disposed on a wrist-wearable control device when the wrist-wearable control device is being worn by the user, the one or more first sensors being configured to detect gestures of the one or more fingers of the user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device; analyzing, using the computing system, second sensor data to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation; and, based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a UI of a HMD or other user device with camera input, generating first instructions based on the first gesture-based command and sending the generated first instructions to the HMD or the other user device as input for controlling the UI, and/or the like.
  • These techniques provide an optimized wrist-wearable control device that allows for 6DOF UI control of another user device, by using micro-gestures of the fingers of the arm on which the wrist-wearable control device is worn, which are detected and identified using biosensors (in some cases, in conjunction with sound/motion transducers (e.g., microphones, etc.) and/or IMU sensors) to monitor, track, and identify such micro-gestures, in conjunction with motion detection (using IMU sensors) and translation detection (using outward-facing lights that are used as tracking points for a camera(s) of the user device being controlled) of the wrist-wearable control device, at least some of which may be observed or measured by users, wrist-wearable device manufacturers, user device manufacturers, and/or universal remote controller manufacturers.
  • Figs. 1-5 illustrate some of the features of the method, system, and apparatus for implementing user interface ("UI") controls, and, more particularly, of methods, systems, and apparatuses for implementing a wrist-wearable controller for a head-mounted device ("HMD") or other user device with camera input, as referred to above.
  • the methods, systems, and apparatuses illustrated by Figs. 1-5 refer to examples of different embodiments that include various components and steps, which can be considered alternatives or which can be used in conjunction with one another in the various embodiments.
  • the description of the illustrated methods, systems, and apparatuses shown in Figs. 1-5 is provided for purposes of illustration and should not be considered to limit the scope of the different embodiments.
  • Fig. 1 is a schematic diagram illustrating a system 100 for implementing a wrist-wearable control device for controlling a user interface ("UI") of a head-mounted device (“HMD”) or other user device with camera input, in accordance with various embodiments.
  • system 100 may comprise a wrist-wearable device 105 that is configured to be worn on a wrist of a user 110.
  • the wrist-wearable device 105 may include, but is not limited to, one of a smart watch, a wrist-wearable display device, a wrist-wearable control device, or other wrist-wearable user device, and/or the like.
  • the wrist-wearable control device may include, without limitation, one of a wristwatch-based wearable control device or a wristband-based wearable control device, and/or the like, each including one or more band portions that may be linked to adjacent band portions by connectors in a manner configured to removably wrap the wrist-wearable control device around a wrist of the user 110 (as shown, e.g., in Figs. 2A-2P, 3A, and 3B, or the like).
  • Wrist-wearable device 105 may include, without limitation, computing system(s) 115, one or more first sensors 120, one or more second sensors 125, display screen 130 (optional), communications system 135, and one or more outward-facing lights 140a-140n (optional; collectively, "outward-facing lights 140" or “lights 140” or the like), and/or the like.
  • outward-facing lights 140 may include, but are not limited to, at least one of a plurality of infrared ("IR") light emitting diodes ("LEDs") or a plurality of colored LEDs (e.g., LEDs emitting light in the visual spectrum that may each be single-color or changeable among multiple colors), and/or the like.
  • computing system 115 may include, without limitation, at least one of one or more processors on the wrist-wearable device (e.g., processor(s) 115a, or the like), one or more processors on a mobile device that is communicatively coupled with the wrist-wearable device (e.g., a processor (not shown) on mobile device 145 that is communicatively coupled with wrist-wearable device 105 via communications system 135, as denoted in Fig. 1 by the lightning bolt symbol, or the like), a signal processing system (e.g., signal processing system 115b, or the like), an artificial intelligence ("AI") system (e.g., AI system 115c, or the like), or other computing system(s) (e.g., other computing system(s) 115d, or the like), and/or the like.
  • the mobile device 145 may include, without limitation, one of a smart phone, a tablet computer, a laptop computer, or a portable gaming device, and/or the like.
  • the AI system 115c and/or the other computing system(s) 115d may include, but are not limited to, a machine learning system, a deep learning system, a neural network, a convolutional neural network ("CNN"), or a fully convolutional network ("FCN"), and/or the like.
  • display device 130 may include, but is not limited to, at least one of a touchscreen display device, a non-touchscreen display device, a projection-based display device, or a holographic display device, and/or the like.
  • the communications system 135 may include wireless communications devices capable of communicating using protocols including, but not limited to, at least one of Bluetooth™ communications protocol, WiFi communications protocol or other 802.11 suite of communications protocols, ZigBee communications protocol, Z-wave communications protocol or other 802.15.4 suite of communications protocols, cellular communications protocol (e.g., 3G, 4G, 4G LTE, 5G, etc.), or other suitable communications protocols, and/or the like.
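Purely for illustration, sending a generated instruction over one of these links (Bluetooth LE) could look like the sketch below, which uses the cross-platform bleak library. The device address and characteristic UUID are invented placeholders; the disclosure does not define a GATT profile.

```python
import asyncio
from bleak import BleakClient

DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"  # placeholder address of the paired device
COMMAND_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"  # placeholder UUID

async def send_instruction(instruction: str) -> None:
    """Write one UI instruction to a hypothetical command characteristic."""
    async with BleakClient(DEVICE_ADDRESS) as client:
        await client.write_gatt_char(COMMAND_CHAR_UUID, instruction.encode())

asyncio.run(send_instruction("scroll_up"))
```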
  • system 100 may further comprise a user device 150, which may include, but is not limited to, at least one of one or more processors 150a, a transceiver 150b, one or more cameras 150c, or a display device 150d, and/or the like.
  • the transceiver 150b and/or the one or more cameras 150c may each be either integrated within the user device 150, external yet communicatively coupled to user device 150, or partially integrated with and partially external yet communicatively coupled to user device 150, and/or the like.
  • the camera(s) 150c may have a field of view ("FOV") 155 that, when directed toward wrist-wearable device 105, may be used to monitor or track position and/or orientation of the wrist-wearable device 105 (in some cases, based at least in part on the outward-facing lights 140 disposed on the wrist-wearable device 105, if any).
  • a UI 160 may be displayed or presented on display device 150d.
  • the wrist-wearable device 105 may communicatively couple with user device 150 via communications system 135 (as denoted in Fig. 1 by the lightning bolt symbol between transceiver 150b of user device 150 and communications system 135 of wrist-wearable device 105), or the like.
  • user device 150 may include, without limitation, a head-mounted device (“HMD”) or other user device with camera input, and/or the like.
  • the HMD may include, but is not limited to, one of a set of virtual reality ("VR") goggles, a set of augmented reality ("AR") goggles, a set of mixed reality ("MR") goggles, a pair of VR-enabled eyewear, a pair of AR-enabled eyewear, a pair of MR-enabled eyewear, a VR-enabled smartphone mounted in a headset, an AR-enabled smartphone mounted in a headset, a MR-enabled smartphone mounted in a headset, and/or the like.
  • the other user device with camera input may include, without limitation, one of a smart television ("TV") with built-in camera, a smart TV with externally connected camera, a gaming console with built-in camera, a gaming console with externally connected camera, a monitor with built-in camera, a monitor with externally connected camera, or a projector with externally connected camera, and/or the like.
  • the one or more first sensors 120 may each include at least one biosensor.
  • the at least one biosensor may include, without limitation, at least one of one or more photoplethysmography (“PPG") sensors, one or more electromyography (“EMG”) sensors, one or more sound transducers, or one or more motion transducers, and/or the like, each configured to detect biological characteristics that correspond to motion of soft tissue in or near a wrist of the user on which the wrist-wearable device is being worn.
  • the soft tissue may include, but is not limited to, at least one of one or more muscles, one or more tendons, or one or more blood vessels, and/or the like, in or near the wrist of the user.
  • the one or more PPG sensors may be configured to optically detect blood volume changes in the one or more blood vessels below skin tissue at or near the wrist of the user corresponding to movement of the one or more fingers of the user.
  • the one or more EMG sensors may be configured to detect electrical activity of the one or more muscles at or near the wrist of the user corresponding to movement of the one or more fingers of the user.
  • the one or more sound transducers (in some cases, including, but not limited to, a microphone, or the like) may be configured to detect sounds within a body of the user corresponding to movement of the one or more fingers of the user.
  • the one or more motion transducers may be configured to detect motion of the one or more muscles, the one or more tendons, and the one or more blood vessels corresponding to movement of the one or more fingers of the user.
  • the one or more second sensors 125 may each include, but are not limited to, at least one of an inertial measurement unit ("IMU") sensor or a gyroscope-based sensor, or similar sensor(s), and/or the like.
  • At least one first sensor 120 among the one or more first sensors 120 and at least one second sensor 125 among the one or more second sensors 125 may be disposed within each of the one or more band portions.
  • the one or more first sensors 120 and the one or more second sensors 125 may each be disposed within only one band portion among the one or more band portions, wherein the band portion containing the one or more first sensors 120 may be one of the same band portion containing the one or more second sensors 125 or a separate band portion from the band portion containing the one or more second sensors 125.
  • at least one first sensor 120 among the one or more first sensors 120 may be disposed within each of the one or more band portions while the one or more second sensors 125 may be disposed within only one band portion among the one or more band portions.
  • the outward-facing lights 140 may be disposed on the one or more band portions, and, in some cases, may be arranged in a predetermined pattern on the one or more band portions and may serve as tracking points for one or more cameras 150c of the user device 150 (as captured within the FOV 155) to track translation movement of the wrist-wearable device 105 along three axes relative to the user device 150. In some cases, the tracking of translation movement of the wrist-wearable device along the three axes may serve as additional input for controlling the UI 160 of the user device 150.
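On the user-device side, one plausible way to recover the band's translation from the tracked lights is to solve a perspective-n-point problem, e.g., with OpenCV. The sketch below assumes a known 3D layout of four LEDs on the band and already-detected 2D LED centroids in the image; the layout values, names, and use of solvePnP are illustrative assumptions, not the disclosed method.

```python
from typing import Optional
import numpy as np
import cv2

# Assumed 3D positions (meters) of four outward-facing LEDs in the band's
# own coordinate frame; a real device would use its calibrated layout.
BAND_POINTS = np.array([
    [0.00,  0.02, 0.0],
    [0.02,  0.00, 0.0],
    [0.00, -0.02, 0.0],
    [-0.02, 0.00, 0.0],
], dtype=np.float64)

def band_translation(image_points: np.ndarray,
                     camera_matrix: np.ndarray,
                     dist_coeffs: np.ndarray) -> Optional[np.ndarray]:
    """Estimate the band's translation relative to the camera from the 2D
    pixel coordinates of the four detected LEDs (ordered as BAND_POINTS)."""
    ok, rvec, tvec = cv2.solvePnP(BAND_POINTS, image_points,
                                  camera_matrix, dist_coeffs)
    return tvec.ravel() if ok else None  # (x, y, z) in camera coordinates
```

Tracking tvec over successive frames yields the translation-along-three-axes input described above.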
  • computing system(s) 115, processor(s) 115a, signal processing system 115b, AI system 115c, other computing system(s) 115d, and/or mobile device 145, or the like, may analyze first sensor data to identify a first gesture corresponding to movement of one or more fingers of a user (e.g., user 110, or the like), the first sensor data being received from one or more first sensors (e.g., first sensor(s) or biosensor(s) 120, or the like) disposed on a wrist-wearable control device (e.g., wrist-wearable device 105, or the like) when the wrist-wearable control device is being worn by the user, the one or more first sensors being configured to detect gestures of the one or more fingers of the user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device.
  • the computing system may analyze second sensor data to detect motion of the wrist-wearable control device (e.g., rotation and/or linear motion, etc.) with respect to at least one axis of rotation among three axes of rotation, the second sensor data being received from one or more second sensors (e.g., second sensor(s) or IMU sensor(s) 125, or the like) disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user.
  • Based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a UI (e.g., UI 160, or the like) of a HMD or other user device with camera input (e.g., user device 150, or the like), the computing system may generate first instructions based on the first gesture-based command and may send the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device.
  • the first gesture-based command may include, without limitation, at least one of a swipe-based command, a drag-based command, a tap command, a double tap command, a point command, a pinch-based command, a clench-based command, a rotate command, a roll command, a pitch command, or a yaw command, and/or the like.
  • the first instructions may include, but are not limited to, at least one of a scroll command, a move command, a select command, a confirm command, a highlight command, a return command, a cancel command, a move cursor command, or a navigate command, and/or the like.
  • the user may start from a relaxed pose of fingers (also referred to herein as "initial pose"; e.g., as shown in Figs. 3A and 3B, or the like), which may be defined by the hand of the user with the fingers (e.g., five fingers) naturally spread out and released or relaxed (i.e., without muscle tension to straighten the fingers or to curl them (e.g., as in a fist, or the like), and the like).
  • For a swipe-up command, a finger (such as, but not limited to, the index finger, or the like) may be lifted, e.g., at a moderate speed (where a moderate speed may refer to a speed that is greater than a slow, deliberate shift in positioning of the finger, or the like), from the initial or relaxed pose, relative to the other fingers, which remain in the initial or relaxed pose. In some cases, finger lifting may be performed at a fast speed (e.g., as defined by visual blurring of the fingers due to speed of movement, or the like).
  • For a swipe-down command, the finger may be lowered, e.g., at a moderate speed, from either the initial or relaxed pose or the lifted position (as described above), relative to the other fingers, which remain in the initial or relaxed pose. In some cases, finger lowering may be performed at a fast speed.
  • From the initial pose, if the user lifts the index finger, the computing system may identify such micro-gesture as a swipe-up command. Conversely, from the initial pose, if the user pulls down the index finger, the computing system may identify such micro-gesture as a swipe-down command.
  • In one non-limiting example, to navigate a page displayed on the user device, the user could use an activation gesture, like a pinch command, to open the page.
  • the pinch command or pinch-based gesture command may include the thumb and forefinger touching or pressing together with minimal or no movement in the other fingers of the hand (not shown). Then, the user can pull down the index finger to trigger the swipe-down function to navigate down the page. If the user lifts the index finger above the initial pose, such gesture will trigger the swipe-up function.
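The activation-then-navigate interaction just described can be modeled as a tiny state machine, sketched below. The event and command names are illustrative assumptions.

```python
from typing import Optional

class PageNavigator:
    """Pinch activates the page; afterwards, index-finger lifts and lowers
    trigger the swipe-up and swipe-down functions, respectively."""

    def __init__(self):
        self.page_active = False

    def on_gesture(self, gesture: str) -> Optional[str]:
        if gesture == "pinch":
            self.page_active = True
            return "open_page"
        if not self.page_active:
            return None  # ignore swipes until the activation gesture
        if gesture == "index_lift":
            return "swipe_up"
        if gesture == "index_lower":
            return "swipe_down"
        return None

nav = PageNavigator()
assert nav.on_gesture("index_lift") is None   # not yet activated
assert nav.on_gesture("pinch") == "open_page"
assert nav.on_gesture("index_lower") == "swipe_down"
```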
  • For a drag-based command, the user may start from the initial or relaxed pose of fingers, similar to that for the swipe-based (or finger-swipe) command, as described above.
  • A finger (such as, but not limited to, the index finger, or the like) may be lifted, e.g., at a slow speed (where slow speed may refer to a speed that is slower than the moderate speed described above, or the like), from the initial or relaxed pose, relative to the other fingers, which remain in the initial or relaxed pose.
  • Likewise, the finger may be lowered, e.g., at a slow speed, from either the initial or relaxed pose or the lifted position (as described above), relative to the other fingers, which remain in the initial or relaxed pose.
  • In some embodiments in which the wrist-wearable device includes the second sensor(s) or IMU sensor(s) (e.g., second sensor(s) or IMU sensor(s) 125, or the like), the lifting and the lowering of the finger may be identified as micro-gestures that result in the computing system (e.g., computing system(s) 115, or the like) causing the UI (e.g., UI 160, or the like) to perform one of scrolling or dragging along a first direction (in response to the finger lifting) or to scroll or drag along a second direction (in response to the finger lowering).
  • the first direction may result in the computing system causing the UI to perform one of a scroll-up function or a drag-up function, while the second direction may result in the computing system causing the UI to perform one of a scroll-down function or a drag-down function, or the like.
  • Alternatively, the first direction may result in the computing system causing the UI to perform one of a scroll-left function or a drag-left function (if using the left hand; or a scroll-right function or a drag-right function if using the right hand), while the second direction may result in the computing system causing the UI to perform one of a scroll-right function or a drag-right function (if using the left hand; or a scroll-left function or a drag-left function if using the right hand), or the like.
  • In some embodiments in which the wrist-wearable device includes the second sensor(s) or IMU sensor(s) (e.g., second sensor(s) or IMU sensor(s) 125, or the like), the lifting and the lowering of the finger (as described above with respect to swipe-based commands and drag-based commands), in conjunction with motion of the user's wrist (as detected by the second sensor(s) or IMU sensor(s) 125, or the like), may be identified as micro-gestures that result in the computing system (e.g., computing system(s) 115, or the like) causing the UI to perform one of scrolling or dragging along a direction based on the degree of supination or pronation of a forearm connected to a wrist of the user on which the wrist-wearable device is being worn, as detected by the second sensor(s) or IMU sensor(s) that tracks motion of the wrist-wearable device.
  • supination of the forearm may refer to rotation of the forearm such that the ventral side of the forearm (or palm) is facing forward and/or upward relative to a sitting or standing position of the body of the user when the user's arm is hanging by the user's side or when the forearm is lifted from the arm-hanging position (regardless of whether the fingers are spread in an open hand or are closed in a fist); pronation of the forearm may refer to rotation of the forearm such that the ventral side of the forearm (or palm) is facing backward and/or downward relative to a sitting or standing position of the body of the user when the user's arm is hanging by the user's side or when the forearm is lifted from the arm-hanging position (regardless of whether the fingers are spread in an open hand or are closed in a fist); and a neutral position of the forearm may refer to rotation of the forearm such that the ventral side of the forearm (or palm) is facing toward the sagittal or median plane of the body of the user (regardless of whether the fingers are spread in an open hand or are closed in a fist).
  • With the forearm in a first rotational position among the supinated, pronated, and neutral positions, the lifting and the lowering of the finger may be identified as micro-gestures that result in the computing system causing the UI to perform one of scrolling or dragging along the vertical direction (i.e., in the up or down direction, similar to the one-dimensional vertical movement embodiment described above), regardless of whether the user's elbow is straight, bent with forearm beside the torso, or bent with forearm in front of the torso.
  • With the forearm in a second rotational position rotated about 90 degrees from the first rotational position, the lifting and the lowering of the finger may be identified as micro-gestures that result in the computing system causing the UI to perform one of scrolling or dragging along the horizontal direction (i.e., in the left or right direction, similar to the one-dimensional horizontal movement embodiment described above), regardless of whether the user's elbow is straight, bent with forearm beside the torso, or bent with forearm in front of the torso.
  • With the forearm at a rotational position between the supinated and pronated positions, the lifting and the lowering of the finger may be identified as micro-gestures that result in the computing system causing the UI to perform one of scrolling or dragging along a diagonal direction based on the degree of rotation between the supinated and pronated positions, regardless of whether the user's elbow is straight, bent with forearm beside the torso, or bent with forearm in front of the torso.
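A compact way to express this orientation-dependent mapping is to let the forearm's rotation angle (from the IMU) rotate the scroll axis, so that intermediate angles produce the diagonal scrolling described above. Which angle corresponds to vertical scrolling is an assumption here, as are all names.

```python
import math

def scroll_vector(finger_delta: float, forearm_roll_deg: float) -> tuple:
    """Map a finger lift (> 0) or lower (< 0) plus the forearm's rotation to a
    2D (horizontal, vertical) scroll direction.
    0 degrees -> pure vertical; 90 degrees -> pure horizontal; between -> diagonal."""
    theta = math.radians(forearm_roll_deg)
    return (finger_delta * math.sin(theta), finger_delta * math.cos(theta))

print(scroll_vector(1.0, 0.0))    # (0.0, 1.0): scroll straight up
print(scroll_vector(-1.0, 45.0))  # diagonal scroll down and to the left
```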
  • In some embodiments, a machine learning system (e.g., AI system 115c or other computing system(s) 115d, or the like) may be used to identify such micro-gestures based on the first and second sensor data.
  • Figs. 2A-2P are schematic diagrams illustrating various non-limiting examples 200 and 200' of a wrist-wearable control device for controlling a UI of a HMD or other user device with camera input, in accordance with various embodiments.
  • Figs. 2A-2H depict various non-limiting examples 200 of a wristband-based wearable control device 205a-205h, respectively.
  • Figs. 2I-2P depict various non-limiting examples 200' of a wristwatch-based wearable control device 205i-205p.
  • In Figs. 2A-2P, the wristband-based wearable control devices 205a-205h or the wristwatch-based wearable control devices 205i-205p may each include, without limitation, a plurality of band portions 210 and a plurality of connectors 215.
  • Each band portion 210 may include, but is not limited to, at least one of one or more first sensors or biosensors 220 (similar to the first sensor(s) or biosensor(s) 120 of Fig. 1, or the like), at least one second sensor(s) or IMU sensor(s) 225 (similar to the second sensor(s) or IMU sensor(s) 125 of Fig. 1, or the like), or one or more outward-facing lights 230 (similar to the one or more outward-facing lights 140a-140n of Fig. 1, or the like), and/or the like, or may be a band portion without any of the first sensor(s) or biosensor(s) 220, the second sensor(s) or IMU sensor(s) 225, or the outward-facing lights 230, or the like.
  • the wristband-based wearable control devices 205a-205h may include, but are not limited to: (a) a wristband-based wearable control device 205a having a plurality of band portions 210a each including one or more first sensors or biosensors 220 and at least one second sensor(s) or IMU sensor(s) 225 (as shown in Fig. 2A); (b) a wristband-based wearable control device 205b having a plurality of band portions 210b each including one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, and one or more outward-facing lights 230 (as shown in Fig. 2B); (c) a wristband-based wearable control device 205c having a plurality of band portions 210c each including one or more first sensors or biosensors 220, and a single band portion 210a including both one or more first sensors or biosensors 220 and at least one second sensor(s) or IMU sensor(s) 225 (as shown in Fig. 2C); (d) a wristband-based wearable control device 205d having a plurality of band portions 210d each including one or more first sensors or biosensors 220 and one or more outward-facing lights 230, and a single band portion 210b including one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, and one or more outward-facing lights 230 (as shown in Fig. 2D); (e) a wristband-based wearable control device 205e having a plurality of band portions 210e without any of the first sensor(s) or biosensor(s) 220, the second sensor(s) or IMU sensor(s) 225, or the outward-facing lights 230, and a single band portion 210a including both one or more first sensors or biosensors 220 and at least one second sensor(s) or IMU sensor(s) 225 (as shown in Fig. 2E); (f) a wristband-based wearable control device 205f having a plurality of band portions 210f each including one or more outward-facing lights 230, and a single band portion 210b including one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, and one or more outward-facing lights 230 (as shown in Fig. 2F); (g) a wristband-based wearable control device 205g having a single band portion 210g including at least one second sensor(s) or IMU sensor(s) 225, a (separate) single band portion 210c including one or more first sensors or biosensors 220, and a plurality of band portions 210e without any of the first sensor(s) or biosensor(s) 220, the second sensor(s) or IMU sensor(s) 225, or the outward-facing lights 230 (as shown in Fig. 2G); and (h) a wristband-based wearable control device 205h having a single band portion 210h including at least one second sensor(s) or IMU sensor(s) 225 and one or more outward-facing lights 230, a (separate) single band portion 210d including one or more first sensors or biosensors 220 and one or more outward-facing lights 230, and a plurality of band portions 210f each including one or more outward-facing lights 230 (as shown in Fig. 2H); and/or the like.
  • Each band portion 210 may be connected to an adjacent band portion(s) 210 via connector(s) 215 in a manner configured to removably wrap the wrist-wearable control device 205 around a wrist of a user (e.g., user 110, or the like) when the wrist-wearable control device 205 is worn by the user.
  • the connectors 215 may include elastic material-based connectors that allow the band portions to be pulled away from each other to slip over and onto (or over and off) the hand and wrist of the user.
  • the connectors 215 may include a plurality of links and a clasp (e.g., made of metal, plastic, or other solid materials), or the like.
  • At least one band portion 210 may also include, without limitation, at least one of a computing system (similar to computing system 115 of Fig. 1, or the like) or a communications system (similar to communications system 135 of Fig. 1, or the like).
  • the wristwatch-based wearable control devices 205i-205p may include, but are not limited to: (i) a wristwatch-based wearable control device 205i having a watch portion 235 and a plurality of band portions 210a each including one or more first sensors or biosensors 220 and at least one second sensor(s) or IMU sensor(s) 225 (as shown in Fig. 2I); (j) a wristwatch-based wearable control device 205j having a watch portion 235 and a plurality of band portions 210b each including one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, and one or more outward-facing lights 230 (as shown in Fig. 2J); (k) a wristwatch-based wearable control device 205k having a watch portion 235, a plurality of band portions 210c each including one or more first sensors or biosensors 220, and a single band portion 210a including both one or more first sensors or biosensors 220 and at least one second sensor(s) or IMU sensor(s) 225 (as shown in Fig. 2K); (l) a wristwatch-based wearable control device 205l having a watch portion 235, a plurality of band portions 210d each including one or more first sensors or biosensors 220 and one or more outward-facing lights 230, and a single band portion 210b including one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, and one or more outward-facing lights 230 (as shown in Fig. 2L); (m) a wristwatch-based wearable control device 205m having a watch portion 235, a plurality of band portions 210e without any of the first sensor(s) or biosensor(s) 220, the second sensor(s) or IMU sensor(s) 225, or the outward-facing lights 230, and a single band portion 210a including both one or more first sensors or biosensors 220 and at least one second sensor(s) or IMU sensor(s) 225 (as shown in Fig. 2M); (n) a wristwatch-based wearable control device 205n having a watch portion 235, a plurality of band portions 210f each including one or more outward-facing lights 230, and a single band portion 210b including one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, and one or more outward-facing lights 230 (as shown in Fig. 2N); (o) a wristwatch-based wearable control device 205o having a watch portion 235, a single band portion 210g including at least one second sensor(s) or IMU sensor(s) 225, a (separate) single band portion 210c including one or more first sensors or biosensors 220, and a plurality of band portions 210e without any of the first sensor(s) or biosensor(s) 220, the second sensor(s) or IMU sensor(s) 225, or the outward-facing lights 230 (as shown in Fig. 2O); and (p) a wristwatch-based wearable control device 205p having a watch portion 235, a single band portion 210h including at least one second sensor(s) or IMU sensor(s) 225 and one or more outward-facing lights 230, a (separate) single band portion 210d including one or more first sensors or biosensors 220 and one or more outward-facing lights 230, and a plurality of band portions 210f each including one or more outward-facing lights 230 (as shown in Fig. 2P); and/or the like.
  • Each band portion 210 (as well as watch portion 235) may be connected to an adjacent band portion(s) 210 via connector(s) 215 in a manner configured to removably wrap the wrist-wearable control device 205 around a wrist of a user (e.g., user 110, or the like) when the wrist-wearable control device 205 is worn by the user.
  • the connectors 215 may include elastic material-based connectors that allow the band portions to be pulled away from each other to slip over and onto (or over and off) the hand and wrist of the user.
  • the connectors 215 may include a plurality of links and a clasp (e.g., made of metal, plastic, or other solid materials), or the like.
  • At least one band portion 210 may also include, without limitation, at least one of a computing system (similar to computing system 115 of Fig. 1, or the like) or a communications system (similar to communications system 135 of Fig. 1, or the like).
  • watch portion 235 (aside from including watch components or smartwatch components) may include, but is not limited to, at least one of a computing system (similar to computing system 115 of Fig. 1, or the like), a communications system (similar to communications system 135 of Fig. 1, or the like), one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, or one or more outward facing lights 230, and/or the like.
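To make the configuration variants above concrete, the following Python sketch models band portions and the sensor/light counts each carries; the class and field names are illustrative assumptions rather than part of the disclosed embodiments, and the example configuration only loosely mirrors device 205j.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BandPortion:
    """One link of the band; the counts below are illustrative only."""
    biosensors: int = 0      # "first" sensors (e.g., PPG/EMG), per the text
    imu_sensors: int = 0     # "second" sensors (e.g., IMU/gyroscope)
    outward_lights: int = 0  # outward-facing tracking lights

@dataclass
class WristWearableDevice:
    has_watch_portion: bool
    band_portions: List[BandPortion] = field(default_factory=list)

    def capabilities(self) -> dict:
        """Which input modalities this band configuration supports."""
        return {
            "micro_gestures": any(b.biosensors for b in self.band_portions),
            "rotation_tracking": any(b.imu_sensors for b in self.band_portions),
            "camera_translation_tracking": any(
                b.outward_lights for b in self.band_portions),
        }

# A configuration loosely mirroring device 205j: every band link carries
# biosensors, an IMU sensor, and outward-facing lights.
device = WristWearableDevice(
    has_watch_portion=True,
    band_portions=[BandPortion(biosensors=1, imu_sensors=1, outward_lights=2)
                   for _ in range(6)],
)
print(device.capabilities())
```

The point of the model is that any variant 205i-205p reduces to a different placement of the same three element types, so downstream logic can query capabilities rather than hard-code a band layout.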
  • Figs. 3A and 3B are schematic diagrams illustrating various non-limiting examples 300 and 300' of the use of a wrist-wearable control device for controlling a UI of a HMD (Fig. 3A) or other user device with camera input (Fig. 3B), in accordance with various embodiments.
  • wrist-wearable device 205 (similar to wrist-wearable devices 105 and 205 of Figs. 1 and 2, or the like) is shown being worn on a wrist 305a of a user (similar to user 110 of Fig. 1, or the like).
  • the wrist-wearable device 205 may include, without limitation, a plurality of band portions 210 and a plurality of connectors 215.
  • in cases where wrist-wearable device 205 comprises a wristwatch-based wearable control device (similar to wristwatch-based wearable control devices 205i-205p of Figs. 2I-2P, or the like), the wrist-wearable device 205 may further include watch portion 235 (not shown in Fig. 3; similar to watch portion 235 of Figs. 2I-2P, or the like).
  • Each band portion 210 may include, but is not limited to, at least one of one or more first sensors or biosensors 220 (not shown in Fig. 3; similar to the first sensor(s) or biosensor(s) 120 and 220 of Figs. 1 and 2, or the like), at least one second sensor(s) or IMU sensor(s) 225 (not shown in Fig. 3; similar to the second sensor(s) or IMU sensor(s) 125 and 225 of Figs. 1 and 2, or the like), or one or more outward facing lights 230 (similar to the one or more outward facing lights 140a-140n and 230 of Figs. 1 and 2, or the like), and/or the like.
  • HMD 310a may include, without limitation, at least one processor (not shown; similar to the one or more processors 150a of Fig. 1, or the like), at least one transceiver (not shown; similar to transceiver 150b of Fig. 1, or the like), one or more cameras (not shown; similar to the one or more cameras 150c of Fig. 1, or the like), display device 315a (similar to display device 150d of Fig. 1, or the like), and UI 320a (similar to UI 160 of Fig. 1, or the like), or the like.
  • HMD 310a may include, but is not limited to, one of a set of virtual reality (“VR”) goggles, augmented reality (“AR”) goggles, a set of mixed reality (“MR”) goggles, a pair of VR-enabled eyewear, a pair of AR-enabled eyewear, a pair of MR-enabled eyewear, a VR-enabled smartphone mounted in a headset, an AR-enabled smartphone mounted in a headset, a MR-enabled smartphone mounted in a headset, and/or the like.
  • user device 310b may include, without limitation, at least one processor (not shown; similar to the one or more processors 150a of Fig. 1, or the like), at least one transceiver (not shown; similar to transceiver 150b of Fig. 1, or the like), one or more cameras 340 (similar to the one or more cameras 150c of Fig. 1, or the like), display screen or device 315b (similar to display device 150d of Fig. 1, or the like), and UI 320b (similar to UI 160 of Fig. 1, or the like), or the like.
  • while camera(s) 340 is depicted in Fig. 3B as being separate from user device 310b, the various embodiments are not so limited, and camera(s) 340 may be integrated within user device 310b.
  • the user device 310b may include, without limitation, one of a smart television ("TV") with built-in camera, a smart TV with externally connected camera, a gaming console with built-in camera, a gaming console with externally connected camera, a monitor with built-in camera, a monitor with externally connected camera, or a projector with externally connected camera, and/or the like.
  • the monitor or projector may also be communicatively coupled with one of a smart phone, a tablet computer, a laptop computer, a desktop computer, or a portable gaming device, and/or the like, that presents content for display on the monitor or projector.
  • UI 320a or 320b (collectively, "UI 320" or the like), which is presented within display device 315a or 315b (collectively, "display device 315" or the like), may display or present content (in this case, a menu screen with a plurality of system icons 325 (in this case, icons of a connected VR headset and mobile device, or the like) and/or a plurality of icons 330 associated with a corresponding plurality of software applications ("apps"), or the like, although not limited to such).
  • the content that may be displayed or presented may include icons, text, graphics, objects, thumbnails, or picture-in-picture ("PIP")-type mini-windows of media content including, but not limited to, video content, image content, representations of audio content (e.g., album cover art, artist photograph or poster, etc.), game content, presentation content, etc.
  • the one or more first sensors or biosensors 220 may detect biological characteristics that correspond to motion of soft tissue in or near a wrist of the user on which the wrist-wearable device is being worn, as described in detail above with respect to Fig. 1.
  • the one or more first sensors or biosensors 220 may be used to track, monitor, and/or identify gesture-based commands, including, without limitation, at least one of: a swipe-based command (e.g., with finger 305c lifting away from or lowering toward thumb 305d, at moderate speed, with other fingers of hand 305b remaining in the initial or relaxed pose, as described above with respect to Fig. 1, or the like); a drag-based command (e.g., with finger 305c lifting away from or lowering toward thumb 305d, at slow speed, with other fingers of hand 305b remaining in the initial or relaxed pose, as described above with respect to Fig. 1, or the like); a tap command (e.g., with finger 305c lowering toward thumb 305d, at fast speed, or with finger 305c crooked or curved, with other fingers of hand 305b remaining in the initial or relaxed pose, or the like); a double tap command (e.g., similar to the tap command, but with a pair of such gestures in quick succession, or the like); a point command (e.g., with finger 305c straightening, with other fingers of hand 305b remaining in the initial or relaxed pose, or the like); or a pinch-based command (e.g., with finger 305c moving toward and touching or pressing against thumb 305d, with other fingers of hand 305b remaining in the initial or relaxed pose, as described above with respect to Fig. 1, or the like); and/or the like.
  • the wrist-wearable device 205 may generate instructions based on the tracked, monitored, and/or identified gesture-based commands, the instructions including, but not limited to, at least one of a scroll command, a move command, a select command, a confirm command, a highlight command, a return command, or a cancel command, and/or the like.
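A minimal dispatch from identified gesture-based commands to generated UI instructions might look like the following sketch; the particular gesture-to-instruction pairing is an assumption, since the embodiments list candidate commands and instructions without fixing a one-to-one mapping.

```python
# Hypothetical mapping from identified gesture-based commands to the UI
# instructions listed above; the pairing itself is an assumption.
GESTURE_TO_INSTRUCTION = {
    "swipe": "scroll",
    "drag": "move",
    "tap": "select",
    "double_tap": "confirm",
    "point": "highlight",
    "pinch": "select",
    "clench": "cancel",
}

def instruction_for(gesture: str) -> str:
    """Translate a tracked gesture-based command into a UI instruction."""
    try:
        return GESTURE_TO_INSTRUCTION[gesture]
    except KeyError:
        raise ValueError(f"unrecognized gesture: {gesture!r}")

print(instruction_for("swipe"))  # -> "scroll"
```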
  • the at least one second sensor(s) or IMU sensor(s) 225 may detect motion of the wrist-wearable device 205 with respect to at least one axis of rotation among three axes of rotation (e.g., rotation Rx, Ry, and/or Rz (also referred to as pitch, roll, and/or yaw, respectively) with respect to the X-axis, Y-axis, and/or Z-axis, respectively, as shown in Fig. 3, or the like).
  • the Y-axis may be parallel to at least one of a longitudinal extension of the user's arm or a main axis of the wrist-wearable device 205, and/or the like, while the X-axis may lie within a plane that is parallel to a plane defined by one of a dorsal side of the forearm or a ventral side of the forearm, and the Z-axis may be parallel to a line that is perpendicular to the plane defined by the one of the dorsal side of the forearm or the ventral side of the forearm.
  • supination or pronation may refer to rotation Ry with respect to the Y-axis, with the direction of rotation depending on which arm the wrist-wearable device is worn.
  • the wrist-wearable device 205 may generate instructions based on the detected motion of the wrist-wearable device 205 with respect to at least one axis of rotation among three axes of rotation, the instructions including, but not limited to, at least one of a move cursor command or a navigate command, and/or the like.
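As a rough illustration of motion detection-based control, the sketch below integrates IMU angular rates into a cursor delta; the axis-to-cursor assignments, gain, and deadband values are assumptions, not parameters given in the embodiments.

```python
def cursor_delta_from_imu(gyro_dps, dt, gain=8.0, deadband_dps=1.5):
    """Map wrist angular rates (deg/s about the X, Y, Z axes above) to a
    2D cursor delta. Pitch (about X) drives vertical motion and yaw
    (about Z) drives horizontal motion; roll (about Y, i.e., forearm
    supination/pronation) is ignored in this sketch."""
    rx, _ry, rz = gyro_dps
    # Deadband suppresses sensor noise and unintentional micro-movement.
    rx = 0.0 if abs(rx) < deadband_dps else rx
    rz = 0.0 if abs(rz) < deadband_dps else rz
    dx = gain * rz * dt   # yaw   -> horizontal cursor movement
    dy = -gain * rx * dt  # pitch -> vertical cursor movement
    return dx, dy

# One 10 ms IMU sample: slight pitch up, stronger yaw to one side.
print(cursor_delta_from_imu((12.0, 3.0, 25.0), dt=0.01))  # -> (2.0, -0.96)
```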
  • the one or more outward facing lights 230 may be disposed on the one or more band portions 210, and, in some cases, may be arranged in a predetermined pattern on the one or more band portions and may serve as tracking points for one or more cameras 340 of user device 310a or 310b (collectively, "user device 310" or the like) to track translation movement of the wrist-wearable device 205 along the three axes (e.g., the X-axis, Y-axis, and/or Z-axis, or the like) relative to the user device 310.
  • the tracking of translation movement of the wrist-wearable device along the three axes may serve as additional input for controlling the UI 320 displayed within display device 315 of user device 310.
  • the combination of the one or more first sensors or biosensors 220, the at least one second sensor(s) or IMU sensor(s) 225, and the one or more outward facing lights 230 enables highly accurate six-degrees-of-freedom ("6DOF") control of the UI 320 displayed within display device 315 of user device 310, by combining micro gesture-based UI control (using the one or more first sensors or biosensors 220), motion detection-based UI control (using the at least one second sensor(s) or IMU sensor(s) 225), and translation detection-based UI control (using the one or more outward facing lights 230 as tracking points for camera(s) 340 of the user device 310 to serve as additional UI control input).
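One way to picture the combination is a fusion step that integrates IMU-derived rotation while letting camera-tracked light positions supply translation, as in the following sketch; a production system would filter both streams (e.g., complementary or Kalman filtering) rather than overwrite state directly, and all names here are illustrative.

```python
from dataclasses import dataclass

@dataclass
class SixDofState:
    # Translation (tracked by the user device's camera via the lights)
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    # Rotation (tracked by the on-band IMU)
    pitch: float = 0.0
    roll: float = 0.0
    yaw: float = 0.0

def fuse(state, imu_rates_dps, dt, led_position=None):
    """One fusion step: integrate IMU angular rates into orientation and,
    when the camera reports a position for the outward-facing lights,
    adopt it as the translation estimate."""
    rx, ry, rz = imu_rates_dps
    state.pitch += rx * dt
    state.roll += ry * dt
    state.yaw += rz * dt
    if led_position is not None:  # lights visible to camera(s) 340
        state.x, state.y, state.z = led_position
    return state

state = fuse(SixDofState(), imu_rates_dps=(10.0, 0.0, 5.0), dt=0.01,
             led_position=(0.10, 0.02, 0.45))
print(state)
```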
  • in some cases, there is some overlap between two or more of the micro gesture-based UI control, the motion detection-based UI control, and the translation detection-based UI control.
  • 6DOF control may be used to navigate within the UI 320.
  • navigating and interacting within the UI 320 may include selecting or highlighting icons, text, graphics, objects, thumbnails, or picture-in-picture ("PIP")-type mini-windows of media content, and/or the like (collectively, "UI objects 330" or the like) (as depicted, e.g., by bounding box 335 being superimposed over app icons in Fig. 3, or the like), moving the selection or highlighting field (e.g., bounding box 335, or the like) from one UI object to another, and/or the like, based at least in part on the 6DOF UI control.
  • navigating and interacting within the UI 320 may include scrolling or paging through menus, pages, windows, and/or other UI display regions, or the like, as well as responding to prompts (with micro gesture-based acceptance or dismissal input commands, or with micro gesture-based "yes" or "no" input commands, or with micro gesture-based "OK" or "cancel" input commands, etc., in response to the prompts), or the like.
  • the combination of the one or more first sensors or biosensors 220, the at least one second sensor(s) or IMU sensor(s) 225, and the one or more outward facing lights 230 may also complement and/or supplement gesture-based inputs or commands, e.g., in the case that conditions or circumstances provide less than ideal tracking or detection of one or more of the corresponding gesture-based commands. For example, under low light or dark conditions (or extremely bright conditions), camera(s) 340 may encounter issues with properly tracking the positions and/or orientations of the one or more outward facing lights 230.
  • in another example, the wrist-wearable device 205, and thus the one or more outward facing lights 230, may move outside the field of view of the camera(s) 340 (which is more likely in the case of HMD embodiments).
  • in such cases, the micro gesture-based control and/or the motion detection-based control (which constitute three-degrees-of-freedom ("3DOF") control) may still enable navigating and interacting within the UI 320.
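The graceful degradation described here can be sketched as a simple mode selector; the criteria for each input being "ok" (light levels, field of view, signal quality) are left abstract and are assumptions.

```python
def control_mode(led_tracking_ok, imu_ok, biosensors_ok):
    """Pick the richest control mode the current conditions allow."""
    if led_tracking_ok and imu_ok and biosensors_ok:
        return "6DOF"   # gestures + rotation + camera-tracked translation
    if imu_ok or biosensors_ok:
        return "3DOF"   # micro gestures and/or rotation still usable
    return "idle"

# Lights out of the camera's view (e.g., the HMD case): fall back to 3DOF.
print(control_mode(led_tracking_ok=False, imu_ok=True, biosensors_ok=True))
```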
  • multiple wrist-wearable devices 205 may be used (one on each arm of the user) to provide additional 6DOF control of the UI 320, which may be useful for immersive experiences (e.g., game play or AR/VR/MR experiences, or the like), in which one wrist-wearable device may be used for navigating within an environment displayed or presented in the UI 320, while the other wrist-wearable device may be used for navigating and interacting with UI objects or sub-UIs within said environment within the UI 320, or the like.
  • a machine learning system (e.g., machine learning system 115b, AI system 115c, or other computing system(s) 115d of Fig. 1, or the like) may be used to further enhance operation of the wrist-wearable device 205 by learning and fine-tuning the detection of the micro-gestures, the motion-based gesture detection, and the translation-based gesture detection, and by learning to distinguish instances of intended gesture control by the user from regular movements not intended as gesture control, and/or the like, adapting to the user's patterns of use over time, or the like.
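A lightweight stand-in for such per-user adaptation is a nearest-centroid classifier whose centroids drift toward each wearer's confirmed gestures, as below; the feature vectors and learning rate are illustrative assumptions, not the disclosed learning method.

```python
import numpy as np

class AdaptiveGestureClassifier:
    """Nearest-centroid gesture classifier that keeps adapting to the
    wearer: each confirmed detection nudges that gesture's centroid
    toward the new feature vector."""

    def __init__(self, learning_rate=0.05):
        self.centroids = {}  # gesture name -> feature centroid (ndarray)
        self.lr = learning_rate

    def classify(self, features):
        """Return the nearest known gesture, or None before any training."""
        if not self.centroids:
            return None
        features = np.asarray(features, dtype=float)
        return min(self.centroids,
                   key=lambda g: np.linalg.norm(features - self.centroids[g]))

    def update(self, gesture, features):
        """Blend a confirmed example into that gesture's centroid."""
        features = np.asarray(features, dtype=float)
        if gesture not in self.centroids:
            self.centroids[gesture] = features
        else:
            c = self.centroids[gesture]
            self.centroids[gesture] = (1 - self.lr) * c + self.lr * features

clf = AdaptiveGestureClassifier()
clf.update("tap", [0.9, 0.1, 0.3, 0.0])
clf.update("swipe", [0.1, 0.8, 0.2, 0.5])
print(clf.classify([0.85, 0.15, 0.25, 0.05]))  # -> "tap"
```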
  • FIG. 4 is a flow diagram illustrating a method 400 for implementing a wrist- wearable control device for controlling a UI of a HMD or other user device with camera input, in accordance with various embodiments.
  • While the systems, examples, or embodiments 100, 200, 200', 300, and 300' of Figs. 1, 2A-2H, 2I-2P, 3A, and 3B can operate according to the method 400 illustrated by Fig. 4 (e.g., by executing instructions embodied on a computer readable medium), the systems, examples, or embodiments 100, 200, 200', 300, and 300' of Figs. 1, 2A-2H, 2I-2P, 3A, and 3B can each also operate according to other modes of operation and/or perform other suitable procedures.
  • method 400 at block 405, may comprise receiving, using a computing system, first sensor data from one or more first sensors disposed on a wrist-wearable control device when the wrist-wearable control device is being worn by a user.
  • the one or more first sensors may be configured to detect gestures of one or more fingers of the user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device.
  • method 400 may comprise analyzing, using the computing system, the first sensor data to identify a first gesture corresponding to movement of the one or more fingers of the user.
  • the computing system may include, without limitation, at least one of one or more processors on the wrist- wearable control device, one or more processors on a mobile device that is communicatively coupled with the wrist-wearable control device, a machine learning system, an artificial intelligence (“AI") system, a deep learning system, a neural network, a convolutional neural network (“CNN”), or a fully convolutional network (“FCN”), and/or the like.
  • the one or more first sensors may each include at least one biosensor
  • the at least one biosensor may include, without limitation, at least one of one or more photoplethysmography (“PPG") sensors, one or more electromyography (“EMG”) sensors, one or more sound transducers, or one or more motion transducers, and/or the like, each configured to detect biological characteristics that correspond to motion of soft tissue in or near a wrist of the user on which the wrist-wearable control device is being worn.
  • the soft tissue may include, but is not limited to, at least one of one or more muscles, one or more tendons, or one or more blood vessels, and/or the like, in or near the wrist of the user.
  • the one or more PPG sensors may be configured to optically detect blood volume changes in the one or more blood vessels below skin tissue at or near the wrist of the user corresponding to movement of the one or more fingers of the user.
  • the one or more EMG sensors may be configured to detect electrical activity of the one or more muscles at or near the wrist of the user corresponding to movement of the one or more fingers of the user.
  • the one or more sound transducers may be configured to detect sounds within a body of the user corresponding to movement of the one or more fingers of the user.
  • the one or more motion transducers may be configured to detect motion of the one or more muscles, the one or more tendons, and the one or more blood vessels corresponding to movement of the one or more fingers of the user.
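To illustrate how such biosensor streams might be prepared for gesture identification, the sketch below windows synchronized PPG and EMG signals and computes simple per-window features; the sampling rate, window size, and feature choices are assumptions, not device specifications.

```python
import numpy as np

def window_features(ppg, emg, fs=200, win_s=0.25):
    """Slice synchronized PPG and EMG streams into fixed, non-overlapping
    windows and compute simple per-window features."""
    n = int(fs * win_s)
    feats = []
    for start in range(0, min(len(ppg), len(emg)) - n + 1, n):
        p = np.asarray(ppg[start:start + n], dtype=float)
        e = np.asarray(emg[start:start + n], dtype=float)
        feats.append([
            np.mean(e ** 2),  # EMG energy: overall muscle activity
            np.ptp(e),        # EMG peak-to-peak amplitude
            np.mean(p),       # PPG baseline (blood volume level)
            np.ptp(p),        # PPG peak-to-peak (blood volume swing)
        ])
    return np.array(feats)

# 2 seconds of synthetic data at 200 Hz -> 8 windows of 4 features each.
rng = np.random.default_rng(0)
print(window_features(rng.normal(size=400), rng.normal(size=400)).shape)
```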
  • Method 400 may comprise receiving, using the computing system, second sensor data from one or more second sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user.
  • the one or more second sensors may be configured to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation.
  • method 400 may comprise analyzing, using the computing system, the second sensor data to detect motion of the wrist-wearable control device with respect to the at least one axis of rotation among the three axes of rotation.
  • the one or more second sensors may each include, without limitation, at least one of an inertial measurement unit (“IMU") sensor or a gyroscope-based sensor, and/or the like.
  • the wrist-wearable control device may include, but is not limited to, one of a wristwatch-based wearable control device or a wristband-based wearable control device, or the like, each including one or more band portions that may be linked to adjacent band portions by connectors in a manner configured to removably wrap the wrist-wearable control device around a wrist of the user.
  • at least one first sensor among the one or more first sensors and at least one second sensor among the one or more second sensors may be disposed within each of the one or more band portions.
  • the one or more first sensors and the one or more second sensors may each be disposed within only one band portion among the one or more band portions, wherein the band portion containing the one or more first sensors may be one of the same band portion containing the one or more second sensors or a separate band portion from the band portion containing the one or more second sensors.
  • at least one first sensor among the one or more first sensors may be disposed within each of the one or more band portions while the one or more second sensors may be disposed within only one band portion among the one or more band portions.
  • Method 400 may further comprise, at block 425, determining, using the computing system, whether the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device (“HMD”) or other user device with camera input. If so, method 400 may further comprise: generating, using the computing system, first instructions based on the first gesture-based command (block 430); and sending, using the computing system, the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device (block 435).
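Blocks 425-435, together with the preceding receive/analyze steps, can be summarized as a single control-loop pass, as in the hedged sketch below; the `compute` and `user_device` objects and their methods are hypothetical stand-ins, not a defined API.

```python
def method_400_step(compute, first_sensor_data, second_sensor_data, user_device):
    """One pass through method 400: analyze biosensor data for a gesture,
    analyze IMU data for rotation, match the pair to a gesture-based
    command (block 425), then generate and send instructions
    (blocks 430 and 435)."""
    gesture = compute.identify_gesture(first_sensor_data)
    motion = compute.detect_rotation(second_sensor_data)
    command = compute.match_command(gesture, motion)
    if command is not None:
        instructions = compute.build_instructions(command)  # block 430
        user_device.send(instructions)                      # block 435
    return command

# Tiny stubs so the sketch runs end to end.
class DemoCompute:
    def identify_gesture(self, data): return "pinch"
    def detect_rotation(self, data): return (0.0, 0.0, 12.0)
    def match_command(self, gesture, motion): return ("select", motion)
    def build_instructions(self, command): return {"command": command}

class DemoDevice:
    def send(self, instructions): print("to HMD:", instructions)

method_400_step(DemoCompute(), b"ppg/emg bytes", b"imu bytes", DemoDevice())
```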
  • the HMD or other user device with camera input may include, without limitation, one of a set of virtual reality (“VR”) goggles, augmented reality (“AR”) goggles, a set of mixed reality (“MR”) goggles, a pair of VR-enabled eyewear, a pair of AR- enabled eyewear, a pair of MR-enabled eyewear, a VR-enabled smartphone mounted in a headset, an AR-enabled smartphone mounted in a headset, a MR-enabled smartphone mounted in a headset, a smart television (“TV”) with built-in camera, a smart TV with externally connected camera, a gaming console with built-in camera, a gaming console with externally connected camera, a monitor with built-in camera, a monitor with externally connected camera, or a projector with externally connected camera, and/or the like.
  • the first gesture-based command may include, but is not limited to, at least one of a swipe-based command, a drag-based command, a tap command, a double tap command, a point command, a pinch-based command, a clench-based command, a rotate command, a roll command, a pitch command, or a yaw command, and/or the like.
  • the first instructions may include, without limitation, at least one of a scroll command, a move command, a select command, a confirm command, a highlight command, a return command, a cancel command, a move cursor command, or a navigate command, and/or the like.
  • method 400 may further comprise, at block 440, activating, using the computing system, a plurality of outward-facing lights disposed on one or more band portions of the wrist-wearable control device that are arranged in a predetermined pattern on the one or more band portions and that serve as tracking points for one or more cameras of the HMD or the other user device to track translation movement of the wrist- wearable control device along three axes relative to the HMD or the other user device.
  • the plurality of outward-facing lights may include, but is not limited to, at least one of a plurality of infrared (“IR”) light emitting diodes (“LEDs”) or a plurality of colored LEDs, and/or the like.
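On the camera side, tracking the outward-facing lights can be as simple as thresholding bright spots and following their centroid between frames, as sketched below; the threshold, pixel-to-millimeter scale, and single-centroid simplification are assumptions (a real implementation would match the full predetermined pattern and estimate depth as well).

```python
import numpy as np

def led_centroid(ir_frame, threshold=0.8):
    """Return the centroid of bright pixels in a normalized IR frame,
    or None when the lights are out of view or washed out."""
    bright = ir_frame > threshold
    if not bright.any():
        return None
    ys, xs = np.nonzero(bright)
    return np.array([xs.mean(), ys.mean()])

def translation_delta(prev_centroid, centroid, mm_per_px=0.5):
    """Approximate in-plane translation between frames (scale assumed)."""
    if prev_centroid is None or centroid is None:
        return None
    return (centroid - prev_centroid) * mm_per_px

# Two synthetic frames: one bright light moving 10 px to the right.
frame1 = np.zeros((120, 160)); frame1[60, 80] = 1.0
frame2 = np.zeros((120, 160)); frame2[60, 90] = 1.0
print(translation_delta(led_centroid(frame1), led_centroid(frame2)))  # [5. 0.]
```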
  • Fig. 5 is a block diagram illustrating an example of computer or system hardware architecture, in accordance with various embodiments.
  • Fig. 5 provides a schematic illustration of one embodiment of a computer system 500 of the service provider system hardware that can perform the methods provided by various other embodiments, as described herein, and/or can perform the functions of computer or hardware system (i.e., wrist-wearable devices 105, 205a-205p, and 205, computing system 115, display screen 130, and user devices 150, 310a, and 310b, etc.), as described above.
  • Fig. 5 is meant only to provide a generalized illustration of various components, of which one or more (or none) of each may be utilized as appropriate. Fig. 5, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
  • the computer or hardware system 500 - which might represent an embodiment of the computer or hardware system (i.e., wrist-wearable devices 105, 205a-205p, and 205, computing system 115, display screen 130, and user devices 150, 310a, and 310b, etc.), described above with respect to Figs. 1-4 - is shown comprising hardware elements that can be electrically coupled via a bus 505 (or may otherwise be in communication, as appropriate).
  • the hardware elements may include one or more processors 510, including, without limitation, one or more general-purpose processors and/or one or more special-purpose processors (such as microprocessors, digital signal processing chips, graphics acceleration processors, and/or the like); one or more input devices 515, which can include, without limitation, a mouse, a keyboard, and/or the like; and one or more output devices 520, which can include, without limitation, a display device, a printer, and/or the like.
  • the computer or hardware system 500 may further include (and/or be in communication with) one or more storage devices 525, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, or a solid-state storage device such as a random access memory ("RAM") and/or a read-only memory ("ROM"), which can be programmable, flash-updateable, and/or the like.
  • Such storage devices may be configured to implement any appropriate data stores, including, without limitation, various file systems, database structures, and/or the like.
  • the computer or hardware system 500 might also include a communications subsystem 530, which can include, without limitation, a modem, a network card (wireless or wired), an infra-red communication device, a wireless communication device and/or chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a WWAN device, cellular communication facilities, etc.), and/or the like.
  • the communications subsystem 530 may permit data to be exchanged with a network (such as the network described below, to name one example), with other computer or hardware systems, and/or with any other devices described herein.
  • the computer or hardware system 500 will further comprise a working memory 535, which can include a RAM or ROM device, as described above.
  • the computer or hardware system 500 also may comprise software elements, shown as being currently located within the working memory 535, including an operating system 540, device drivers, executable libraries, and/or other code, such as one or more application programs 545, which may comprise computer programs provided by various embodiments (including, without limitation, hypervisors, VMs, and the like), and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein.
  • one or more procedures described with respect to the method(s) discussed above might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
  • a set of these instructions and/or code might be encoded and/or stored on a non-transitory computer readable storage medium, such as the storage device(s) 525 described above. In some cases, the storage medium might be incorporated within a computer system, such as the system 500.
  • the storage medium might be separate from a computer system (i.e., a removable medium, such as a compact disc, etc.), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon.
  • These instructions might take the form of executable code, which is executable by the computer or hardware system 500 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer or hardware system 500 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.) then takes the form of executable code.
  • some embodiments may employ a computer or hardware system (such as the computer or hardware system 500) to perform methods in accordance with various embodiments of the invention. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer or hardware system 500 in response to processor 510 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 540 and/or other code, such as an application program 545) contained in the working memory 535. Such instructions may be read into the working memory 535 from another computer readable medium, such as one or more of the storage device(s) 525.
  • The terms "machine readable medium" and "computer readable medium," as used herein, refer to any medium that participates in providing data that causes a machine to operate in some fashion.
  • various computer readable media might be involved in providing instructions/code to processor(s) 510 for execution and/or might be used to store and/or carry such instructions/code (e.g., as signals).
  • a computer readable medium is a non-transitory, physical, and/or tangible storage medium.
  • a computer readable medium may take many forms, including, but not limited to, non-volatile media, volatile media, or the like.
  • Non-volatile media includes, for example, optical and/or magnetic disks, such as the storage device(s) 525.
  • Volatile media includes, without limitation, dynamic memory, such as the working memory 535.
  • a computer readable medium may take the form of transmission media, which includes, without limitation, coaxial cables, copper wire, and fiber optics, including the wires that comprise the bus 505, as well as the various components of the communication subsystem 530 (and/or the media by which the communications subsystem 530 provides communication with other devices).
  • transmission media can also take the form of waves (including without limitation radio, acoustic, and/or light waves, such as those generated during radio-wave and infra-red data communications).
  • Common forms of physical and/or tangible computer readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read instructions and/or code.
  • Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 510 for execution.
  • the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer.
  • a remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer or hardware system 500.
  • These signals, which might be in the form of electromagnetic signals, acoustic signals, optical signals, and/or the like, are all examples of carrier waves on which instructions can be encoded, in accordance with various embodiments of the invention.
  • the communications subsystem 530 (and/or components thereof) generally will receive the signals, and the bus 505 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 535, from which the processor(s) 510 retrieves and executes the instructions.
  • the instructions received by the working memory 535 may optionally be stored on a storage device 525 either before or after execution by the processor(s) 510.

Abstract

Novel tools and techniques are provided for implementing a wrist-wearable controller for a head-mounted device ("HMD") or other user device with camera input (collectively, "user device"). In various embodiments, a computing system may analyze first sensor data to identify gestures corresponding to movement of a user's finger(s) and second sensor data to detect motion of a wrist-wearable control device with respect to one or more axes of rotation, the first and second sensor data being received from first and second sensors, respectively, that are disposed on the wrist-wearable control device when it is being worn by the user. If the detected motion in conjunction with the gestures corresponds to gesture-based commands for controlling a user interface ("UI") of a user device, the computing system may generate and send instructions based on the gesture-based commands to the user device, the instructions serving as input for controlling the UI of the user device.

Description

WRIST-WEARABLE CONTROLLER FOR HEAD-MOUNTED DEVICE (HMD) OR OTHER USER DEVICE WITH CAMERA INPUT
COPYRIGHT STATEMENT
[0001] A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
FIELD
[0002] The present disclosure relates, in general, to methods, systems, and apparatuses for implementing user interface ("UI") controls, and, more particularly, to methods, systems, and apparatuses for implementing a wrist-wearable controller for a head-mounted device ("HMD") or other user device with camera input.
BACKGROUND
[0003] Most conventional controllers for HMDs or other user devices are designed as handheld control devices. Such conventional handheld control devices, however, always occupy one or both of the user's hands for UI interaction. Although such devices may be suitable for virtual reality ("VR") gaming, or the like, they are not suitable or user-friendly for other use cases, such as daily use with augmented reality ("AR") or mixed reality ("MR") devices or for outdoor use when it is inconvenient to carry extra handheld controllers.
Further, for situations in which they are out of camera view or where visual occlusion occurs, such conventional controllers will fail to accurately provide hand gesture recognition.
[0004] Hence, there is a need for more robust and scalable solutions for implementing UI controls.
SUMMARY
[0005] The techniques of this disclosure generally relate to tools and techniques for implementing user interface ("UI") controls, and, more particularly, to methods, systems, and apparatuses for implementing a wrist-wearable controller for a head-mounted device ("HMD") or other user device with camera input.
[0006] In an aspect, a wrist-wearable control device may comprise one or more first sensors that are configured to detect gestures of one or more fingers of a user, when the wrist- wearable control device is being worn by the user, without any fingers of the user touching the wrist- wearable control device; one or more second sensors that are configured to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation; at least one processor; and a non-transitory computer readable medium communicatively coupled to the at least one processor. The non-transitory computer readable medium may have stored thereon computer software comprising a set of instructions that, when executed by the at least one processor, causes the wrist-wearable control device to: analyze first sensor data to identify a first gesture corresponding to movement of one or more fingers of the user, the first sensor data being received from the one or more first sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; analyze second sensor data to detect motion of the wrist-wearable control device with respect to the at least one axis of rotation among the three axes of rotation, the second sensor data being received from the one or more second sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; and based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device ("HMD") or other user device with camera input, generate first instructions based on the first gesture-based command and send the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device.
[0007] In another aspect, a method may comprise analyzing, using a computing system, first sensor data to identify a first gesture corresponding to movement of one or more fingers of a user, the first sensor data being received from one or more first sensors disposed on a wrist-wearable control device when the wrist- wearable control device is being worn by the user, the one or more first sensors being configured to detect gestures of the one or more fingers of the user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device. The method may further comprise analyzing, using the computing system, second sensor data to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation, the second sensor data being received from one or more second sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; and based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device ("HMD") or other user device with camera input, generating, using the computing system, first instructions based on the first gesture-based command and sending, using the computing system, the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device.
[0008] Various modifications and additions can be made to the embodiments discussed without departing from the scope of the invention. For example, while the embodiments described above refer to particular features, the scope of this invention also includes embodiments having different combinations of features and embodiments that do not include all of the above-described features.
[0009] The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] A further understanding of the nature and advantages of particular embodiments may be realized by reference to the remaining portions of the specification and the drawings, in which like reference numerals are used to refer to similar components. In some instances, a sub-label is associated with a reference numeral to denote one of multiple similar components. When reference is made to a reference numeral without specification to an existing sub-label, it is intended to refer to all such multiple similar components.
[0011] Fig. 1 is a schematic diagram illustrating a system for implementing a wrist-wearable control device for controlling a user interface ("UI") of a head-mounted device ("HMD") or other user device with camera input, in accordance with various embodiments.
[0012] Figs. 2A-2P are schematic diagrams illustrating various non-limiting examples of a wrist-wearable control device for controlling a UI of a HMD or other user device with camera input, in accordance with various embodiments.
[0013] Figs. 3A and 3B are schematic diagrams illustrating various non-limiting examples of the use of a wrist-wearable control device for controlling a UI of a HMD or other user device with camera input, in accordance with various embodiments.
[0014] Fig. 4 is a flow diagram illustrating a method for implementing a wrist-wearable control device for controlling a UI of a HMD or other user device with camera input, in accordance with various embodiments.
[0015] Fig. 5 is a block diagram illustrating an example of computer or system hardware architecture, in accordance with various embodiments.
DETAILED DESCRIPTION
[0016] Overview
[0017] Various embodiments provide tools and techniques for implementing user interface ("UI") controls, and, more particularly, methods, systems, and apparatuses for implementing a wrist-wearable controller for a head-mounted device ("HMD") or other user device with camera input.
[0018] In various embodiments, a wrist-wearable control device may comprise one or more first sensors that are configured to detect gestures of one or more fingers of a user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device; one or more second sensors that are configured to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation; at least one processor; and a non-transitory computer readable medium communicatively coupled to the at least one processor. The non- transitory computer readable medium may have stored thereon computer software comprising a set of instructions that, when executed by the at least one processor, causes the wrist- wearable control device to: analyze first sensor data to identify a first gesture corresponding to movement of one or more fingers of the user, the first sensor data being received from the one or more first sensors disposed on the wrist-wearable control device when the wrist- wearable control device is being worn by the user; analyze second sensor data to detect motion of the wrist-wearable control device with respect to the at least one axis of rotation among the three axes of rotation, the second sensor data being received from the one or more second sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; and based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device ("HMD") or other user device with camera input, generate first instructions based on the first gesture-based command and send the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device.
[0019] In some embodiments, the wrist-wearable control device may comprise one of a wristwatch-based wearable control device or a wristband-based wearable control device, or the like, each comprising one or more band portions that may be linked to adjacent band portions by connectors in a manner configured to removably wrap the wrist-wearable control device around a wrist of the user. In some instances, at least one first sensor among the one or more first sensors and at least one second sensor among the one or more second sensors may be disposed within each of the one or more band portions. Alternatively, the one or more first sensors and the one or more second sensors may each be disposed within only one band portion among the one or more band portions, wherein the band portion containing the one or more first sensors may be one of the same band portion containing the one or more second sensors or a separate band portion from the band portion containing the one or more second sensors. Alternatively, at least one first sensor among the one or more first sensors may be disposed within each of the one or more band portions while the one or more second sensors may be disposed within only one band portion among the one or more band portions.
[0020] According to some embodiments, the HMD or other user device with camera input may comprise one of a set of virtual reality ("VR") goggles, augmented reality ("AR") goggles, a set of mixed reality ("MR") goggles, a pair of VR-enabled eyewear, a pair of AR-enabled eyewear, a pair of MR-enabled eyewear, a VR-enabled smartphone mounted in a headset, an AR-enabled smartphone mounted in a headset, a MR-enabled smartphone mounted in a headset, a smart television ("TV") with built-in camera, a smart TV with externally connected camera, a gaming console with built-in camera, a gaming console with externally connected camera, a monitor with built-in camera, a monitor with externally connected camera, or a projector with externally connected camera, and/or the like.
[0021] Merely by way of example, in some cases, the one or more first sensors may each comprise at least one biosensor, the at least one biosensor may comprise at least one of one or more photoplethysmography ("PPG") sensors, one or more electromyography ("EMG") sensors, one or more sound transducers, or one or more motion transducers, and/or the like, each configured to detect biological characteristics that correspond to motion of soft tissue in or near a wrist of the user on which the wrist-wearable control device is being worn. In some instances, the soft tissue may comprise at least one of one or more muscles, one or more tendons, or one or more blood vessels, and/or the like, in or near the wrist of the user. In some cases, the one or more PPG sensors may be configured to optically detect blood volume changes in the one or more blood vessels below skin tissue at or near the wrist of the user corresponding to movement of the one or more fingers of the user. In some instances, the one or more EMG sensors may be configured to detect electrical activity of the one or more muscles at or near the wrist of the user corresponding to movement of the one or more fingers of the user. In some cases, the one or more sound transducers may be configured to detect sounds within a body of the user corresponding to movement of the one or more fingers of the user. In some instances, the one or more motion transducers may be configured to detect motion of the one or more muscles, the one or more tendons, and the one or more blood vessels corresponding to movement of the one or more fingers of the user.
[0022] In some embodiments, the one or more second sensors may each comprise at least one of an inertial measurement unit ("IMU") sensor or a gyroscope-based sensor, and/or the like.
[0023] According to some embodiments, the wrist-wearable control device may further comprise one or more band portions; and a plurality of outward-facing lights disposed on the one or more band portions, wherein the plurality of outward-facing lights may be arranged in a predetermined pattern on the one or more band portions and may serve as tracking points for one or more cameras of the HMD or the other user device to track translation movement of the wrist-wearable control device along three axes relative to the HMD or the other user device, wherein the tracking of translation movement of the wrist-wearable control device along the three axes may serve as additional input for controlling the UI of the HMD or the other user device. In some cases, the plurality of outward-facing lights may comprise at least one of a plurality of infrared ("IR") light emitting diodes ("LEDs") or a plurality of colored LEDs, and/or the like.
[0024] In some embodiments, the first gesture-based command may comprise at least one of a swipe-based command, a drag-based command, a tap command, a double tap command, a point command, a pinch-based command, a clench-based command, a rotate command, a roll command, a pitch command, or a yaw command, and/or the like. In some instances, the first instructions may comprise at least one of a scroll command, a move command, a select command, a confirm command, a highlight command, a return command, a cancel command, a move cursor command, or a navigate command, and/or the like.
[0025] In another aspect, a computing system may analyze first sensor data to identify a first gesture corresponding to movement of one or more fingers of a user, the first sensor data being received from one or more first sensors disposed on a wrist-wearable control device when the wrist-wearable control device is being worn by the user, the one or more first sensors being configured to detect gestures of the one or more fingers of the user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device. The computing system may analyze second sensor data to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation, the second sensor data being received from one or more second sensors disposed on the wrist-wearable control device when the wrist- wearable control device is being worn by the user. Based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device ("HMD") or other user device with camera input, the computing system may generate first instructions based on the first gesture-based command and may send the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device.
[0026] In some embodiments, the computing system may comprise at least one of one or more processors on the wrist-wearable control device, one or more processors on a mobile device that is communicatively coupled with the wrist-wearable control device, a machine learning system, an artificial intelligence ("AI") system, a deep learning system, a neural network, a convolutional neural network ("CNN"), or a fully convolutional network ("FCN"), and/or the like. In some instances, the wrist-wearable control device may comprise one of a wristwatch-based wearable control device or a wristband-based wearable control device, and/or the like, each comprising one or more band portions that may be linked to adjacent band portions by connectors in a manner configured to removably wrap the wrist-wearable control device around a wrist of the user.
[0027] According to some embodiments, the one or more first sensors may each comprise at least one biosensor, the at least one biosensor may comprise at least one of one or more photoplethysmography ("PPG") sensors, one or more electromyography ("EMG") sensors, one or more sound transducers, or one or more motion transducers, and/or the like, each configured to detect biological characteristics that correspond to motion of soft tissue in or near a wrist of the user on which the wrist-wearable control device is being worn. In some cases, the soft tissue may comprise at least one of one or more muscles, one or more tendons, or one or more blood vessels, and/or the like, in or near the wrist of the user. In some cases, the one or more PPG sensors may be configured to optically detect blood volume changes in the one or more blood vessels below skin tissue at or near the wrist of the user corresponding to movement of the one or more fingers of the user. In some instances, the one or more EMG sensors may be configured to detect electrical activity of the one or more muscles at or near the wrist of the user corresponding to movement of the one or more fingers of the user. In some cases, the one or more sound transducers may be configured to detect sounds within a body of the user corresponding to movement of the one or more fingers of the user. In some instances, the one or more motion transducers may be configured to detect motion of the one or more muscles, the one or more tendons, and the one or more blood vessels corresponding to movement of the one or more fingers of the user.
[0028] In some embodiments, the one or more second sensors may each comprise at least one of an inertial measurement unit ("IMU") sensor or a gyroscope-based sensor, and/or the like.
[0029] According to some embodiments, the wrist-wearable control device may further comprise one or more band portions; and a plurality of outward-facing lights disposed on the one or more band portions, wherein the plurality of outward-facing lights may be arranged in a predetermined pattern on the one or more band portions and may serve as tracking points for one or more cameras of the HMD or the other user device to track translation movement of the wrist-wearable control device along three axes relative to the HMD or the other user device, wherein the tracking of translation movement of the wrist-wearable control device along the three axes may serve as additional input for controlling the UI of the HMD or the other user device. In some cases, the plurality of outward-facing lights may comprise at least one of a plurality of infrared ("IR") light emitting diodes ("LEDs") or a plurality of colored LEDs, and/or the like.
[0030] In the various aspects described herein, a wrist-wearable control device is provided for controlling a UI of a HMD or other user device with camera input (collectively, "user device"). This enables highly accurate six-degrees-of-freedom ("6DOF") control of the UI of the user device, by combining micro gesture-based UI control (using the one or more first sensors or biosensors), motion (e.g., rotation and/or linear motion, etc.) detection-based UI control (using the at least one second sensor(s) or IMU sensor(s)), and translation detection-based UI control (using the one or more outward facing lights as tracking points for camera(s) of the user device to serve as additional UI control input).
[0031] These and other aspects of the system and method for implementing a wrist-wearable control device for controlling a UI of a HMD or other user device with camera input are described in greater detail with respect to the figures.
[0032] The following detailed description illustrates a few embodiments in further detail to enable one of skill in the art to practice such embodiments. The described examples are provided for illustrative purposes and are not intended to limit the scope of the invention.
[0033] In the following description, for the purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the described embodiments. It will be apparent to one skilled in the art, however, that other embodiments of the present invention may be practiced without some of these details. In other instances, some structures and devices are shown in block diagram form. Several embodiments are described herein, and while various features are ascribed to different embodiments, it should be appreciated that the features described with respect to one embodiment may be incorporated with other embodiments as well. By the same token, however, no single feature or features of any described embodiment should be considered essential to every embodiment of the invention, as other embodiments of the invention may omit such features.
[0034] Unless otherwise indicated, all numbers used herein to express quantities, dimensions, and so forth should be understood as being modified in all instances by the term "about." In this application, the use of the singular includes the plural unless specifically stated otherwise, and use of the terms "and" and "or" means "and/or" unless otherwise indicated. Moreover, the use of the term "including," as well as other forms, such as "includes" and "included," should be considered non-exclusive. Also, terms such as "element" or "component" encompass both elements and components comprising one unit and elements and components that comprise more than one unit, unless specifically stated otherwise.
[0035] Various embodiments as described herein - while embodying (in some cases) software products, computer-performed methods, and/or computer systems - represent tangible, concrete improvements to existing technological areas, including, without limitation, wrist-wearable device technology, gesture control technology, gesture control technology for wrist-wearable devices, biosensor technology for gesture control, motion tracking technology for gesture control, user interface ("UI") control technology, UI control technology for wrist-wearable devices, 3DOF control technology of UI, 6DOF control technology of UI, and/or the like. In other aspects, some embodiments can improve the functioning of user equipment or systems themselves (e.g., wrist-wearable devices, gesture control systems, gesture control systems for wrist-wearable devices, biosensor systems for gesture control, motion tracking systems for gesture control, UI control systems, UI control systems for wrist-wearable devices, 3DOF control systems of UI, 6DOF control systems of UI, etc.), for example, by analyzing, using a computing system, first sensor data to identify a first gesture corresponding to movement of one or more fingers of a user, the first sensor data being received from one or more first sensors disposed on a wrist-wearable control device when the wrist-wearable control device is being worn by the user, the one or more first sensors being configured to detect gestures of the one or more fingers of the user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device; analyzing, using the computing system, second sensor data to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation, the second sensor data being received from one or more second sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; and based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device ("HMD") or other user device with camera input, generating, using the computing system, first instructions based on the first gesture-based command and sending, using the computing system, the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device; and/or the like.
[0036] In particular, to the extent any abstract concepts are present in the various embodiments, those concepts can be implemented as described herein by devices, software, systems, and methods that involve novel functionality (e.g., steps or operations), such as, by combining micro gesture-based UI control (using the one or more first sensors or biosensors), motion detection-based UI control (using the at least one second sensor(s) or IMU sensor(s)), and translation detection-based UI control (using the one or more outward facing lights as tracking points for camera(s) of the user device to serve as additional UI control input), and/or the like, to name a few examples, that extend beyond mere conventional computer processing operations. These functionalities can produce tangible results outside of the implementing computer system, including, merely by way of example, an optimized wrist-wearable control device that allows for 6DOF UI control of another user device, by using micro-gestures of the fingers on the arm on which the wrist-wearable control device is worn that are detected and identified using biosensors (in some cases, in conjunction with sound/motion transducers (e.g., microphones, etc.) and/or IMU sensors) to monitor, track, and identify such micro-gestures in conjunction with motion detection (using IMU sensors) and translation detection (using outward-facing lights that are used as tracking points for a camera(s) of the user device being controlled) of the wrist-wearable control device, at least some of which may be observed or measured by users, wrist-wearable device manufacturers, user device manufacturers, and/or universal remote controller manufacturers.
[0037] Some Embodiments
[0038] We now turn to the embodiments as illustrated by the drawings. Figs. 1-5 illustrate some of the features of the method, system, and apparatus for implementing user interface ("UI") controls, and, more particularly, to methods, systems, and apparatuses for implementing a wrist-wearable controller for a head-mounted device ("HMD") or other user device with camera input, as referred to above. The methods, systems, and apparatuses illustrated by Figs. 1-5 refer to examples of different embodiments that include various components and steps, which can be considered alternatives or which can be used in conjunction with one another in the various embodiments. The description of the illustrated methods, systems, and apparatuses shown in Figs. 1-5 is provided for purposes of illustration and should not be considered to limit the scope of the different embodiments.
[0039] With reference to the figures, Fig. 1 is a schematic diagram illustrating a system 100 for implementing a wrist-wearable control device for controlling a user interface ("UI") of a head-mounted device ("HMD") or other user device with camera input, in accordance with various embodiments.
[0040] In the non-limiting embodiment of Fig. 1, system 100 may comprise a wrist-wearable device 105 that is configured to be worn on a wrist of a user 110. In some cases, the wrist-wearable device 105 may include, but is not limited to, one of a smart watch, a wrist-wearable display device, a wrist-wearable control device, or other wrist-wearable user device, and/or the like. According to some embodiments, the wrist-wearable control device may include, without limitation, one of a wristwatch-based wearable control device or a wristband-based wearable control device, and/or the like, each including one or more band portions that may be linked to adjacent band portions by connectors in a manner configured to removably wrap the wrist-wearable control device around a wrist of the user 110 (as shown, e.g., in Figs. 2A-2P, 3A, and 3B, or the like). Wrist-wearable device 105 may include, without limitation, computing system(s) 115, one or more first sensors 120, one or more second sensors 125, display screen 130 (optional), communications system 135, and one or more outward-facing lights 140a-140n (optional; collectively, "outward-facing lights 140" or "lights 140" or the like), and/or the like. According to some embodiments, outward facing lights 140 may include, but are not limited to, at least one of a plurality of infrared ("IR") light emitting diodes ("LEDs") or a plurality of colored LEDs (e.g., LEDs emitting light in the visual spectrum that may each be single-color or changeable among multiple colors), and/or the like.
[0041] In some embodiments, computing system 115 may include, without limitation, at least one of one or more processors on the wrist-wearable device (e.g., processor(s) 115a, or the like), one or more processors on a mobile device that is communicatively coupled with the wrist-wearable device (e.g., processor (not shown) on mobile device 145 that is communicatively coupled with wrist-wearable device 105 via communications system 135 (as denoted in Fig. 1 by the lightning bolt symbol between mobile device 145 and communications system 135), or the like), a signal processing system (e.g., signal processing system 115b, or the like), an artificial intelligence ("AI") system (e.g., AI system 115c, or the like), or other computing system(s) 115d, and/or the like. In some cases, the AI system 115c and/or the other computing system(s) 115d may include, but is not limited to, a machine learning system, a deep learning system, a neural network, a convolutional neural network ("CNN"), or a fully convolutional network ("FCN"), and/or the like. In some instances, the mobile device 145 may include, without limitation, one of a smart phone, a tablet computer, a laptop computer, or a portable gaming device, and/or the like.
[0042] Merely by way of example, in some cases, display screen 130 may include, but is not limited to, at least one of a touchscreen display device, a non-touchscreen display device, a projection-based display device, or a holographic display device, and/or the like. In some instances, the communications system 135 may include wireless communications devices capable of communicating using protocols including, but not limited to, at least one of Bluetooth™ communications protocol, WiFi communications protocol, or other 802.11 suite of communications protocols, ZigBee communications protocol, Z-wave communications protocol, or other 802.15.4 suite of communications protocols, cellular communications protocol (e.g., 3G, 4G, 4G LTE, 5G, etc.), or other suitable communications protocols, and/or the like.
[0043] In some embodiments, system 100 may further comprise a user device 150, which may include, but is not limited to, at least one of one or more processors 150a, a transceiver 150b, one or more cameras 150c, or a display device 150d, and/or the like. In some instances, the transceiver 150b and/or the one or more cameras 150c may each be either integrated within the user device 150, external yet communicatively coupled to user device 150, or partially integrated with and partially external yet communicatively coupled to user device 150, and/or the like. In some cases, the camera(s) 150c may have a field of view ("FOV") 155 that, when directed toward wrist-wearable device 105, may be used to monitor or track position and/or orientation of the wrist-wearable device 105 (in some cases, based at least in part on the outward facing lights 140 disposed on the wrist-wearable device 105, if any). In some instances, a UI 160 may be displayed or presented on display device 150d. In some cases, the wrist-wearable device 105 may communicatively couple with user device 150 via communications system 135 (as denoted in Fig. 1 by the lightning bolt symbol between transceiver 150b of user device 150 and communications system 135 of wrist-wearable device 105), or the like. In some instances, user device 150 may include, without limitation, a head-mounted device ("HMD") or other user device with camera input, and/or the like. In some cases, the HMD may include, but is not limited to, one of a set of virtual reality ("VR") goggles, a set of augmented reality ("AR") goggles, a set of mixed reality ("MR") goggles, a pair of VR-enabled eyewear, a pair of AR-enabled eyewear, a pair of MR-enabled eyewear, a VR-enabled smartphone mounted in a headset, an AR-enabled smartphone mounted in a headset, or a MR-enabled smartphone mounted in a headset, and/or the like. In some instances, the other user device with camera input may include, without limitation, one of a smart television ("TV") with built-in camera, a smart TV with externally connected camera, a gaming console with built-in camera, a gaming console with externally connected camera, a monitor with built-in camera, a monitor with externally connected camera, or a projector with externally connected camera, and/or the like.
[0044] According to some embodiments, the one or more first sensors 120 may each include at least one biosensor. In some instances, the at least one biosensor may include, without limitation, at least one of one or more photoplethysmography ("PPG") sensors, one or more electromyography ("EMG") sensors, one or more sound transducers, or one or more motion transducers, and/or the like, each configured to detect biological characteristics that correspond to motion of soft tissue in or near a wrist of the user on which the wrist-wearable device is being worn. In some cases, the soft tissue may include, but is not limited to, at least one of one or more muscles, one or more tendons, or one or more blood vessels, and/or the like, in or near the wrist of the user. In some cases, the one or more PPG sensors may be configured to optically detect blood volume changes in the one or more blood vessels below skin tissue at or near the wrist of the user corresponding to movement of the one or more fingers of the user. In some instances, the one or more EMG sensors may be configured to detect electrical activity of the one or more muscles at or near the wrist of the user corresponding to movement of the one or more fingers of the user. In some cases, the one or more sound transducers (in some cases, including, but not limited to, a microphone, or the like) may be configured to detect sounds within a body of the user corresponding to movement of the one or more fingers of the user. In some instances, the one or more motion transducers (in some cases, including, but not limited to, a microphone, or the like) may be configured to detect motion of the one or more muscles, the one or more tendons, and the one or more blood vessels corresponding to movement of the one or more fingers of the user.

[0045] Merely by way of example, in some cases, the one or more second sensors 125 may each include, but is not limited to, at least one of an inertial measurement unit ("IMU") sensor or a gyroscope-based sensor, or similar sensor(s), and/or the like. In some instances, at least one first sensor 120 among the one or more first sensors 120 and at least one second sensor 125 among the one or more second sensors 125 may be disposed within each of the one or more band portions. Alternatively, the one or more first sensors 120 and the one or more second sensors 125 may each be disposed within only one band portion among the one or more band portions, wherein the band portion containing the one or more first sensors 120 may be one of the same band portion containing the one or more second sensors 125 or a separate band portion from the band portion containing the one or more second sensors 125. Alternatively, at least one first sensor 120 among the one or more first sensors 120 may be disposed within each of the one or more band portions while the one or more second sensors 125 may be disposed within only one band portion among the one or more band portions.
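Merely by way of illustration, such biosensor streams might be reduced to compact feature vectors before any gesture identification is attempted. The following minimal sketch (in Python, using NumPy) shows one such reduction; the window shape, channel layout, and feature choices are assumptions made for this sketch and are not mandated by the embodiments above.

```python
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """Reduce one window of biosensor samples (shape: [n_samples, n_channels],
    e.g., interleaved PPG/EMG channels) to a small feature vector.

    The features below (mean absolute value, RMS, zero crossings) are common
    choices for EMG-style signals; they are illustrative, not normative.
    """
    mav = np.mean(np.abs(window), axis=0)        # mean absolute value per channel
    rms = np.sqrt(np.mean(window ** 2, axis=0))  # root-mean-square energy
    signs = np.signbit(window).astype(np.int8)   # sign of each sample
    zc = np.sum(np.diff(signs, axis=0) != 0, axis=0)  # zero-crossing counts
    return np.concatenate([mav, rms, zc.astype(float)])

# e.g., features = extract_features(samples)  # samples: a 200 x 4 array
```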
[0046] In some instances, the outward-facing lights 140 may be disposed on the one or more band portions, and, in some cases, may be arranged in a predetermined pattern on the one or more band portions and may serve as tracking points for one or more cameras 150c of the user device 150 (as captured within the FOV 155) to track translation movement of the wrist-wearable device 105 along three axes relative to the user device 150. In some cases, the tracking of translation movement of the wrist-wearable device along the three axes may serve as additional input for controlling the UI 160 of the user device 150.
[0047] In operation, computing system(s) 115, processor(s) 115a, signal processing system 115b, AI system 115c, other computing system(s) 115d, and/or mobile device 145, or the like (collectively, "computing system" or the like) may analyze first sensor data to identify a first gesture corresponding to movement of one or more fingers of a user (e.g., user 110, or the like), the first sensor data being received from one or more first sensors (e.g., first sensor(s) or biosensor(s) 120, or the like) disposed on a wrist-wearable control device (e.g., wrist-wearable device 105, or the like) when the wrist-wearable control device is being worn by the user, the one or more first sensors being configured to detect gestures of the one or more fingers of the user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device. The computing system may analyze second sensor data to detect motion of the wrist-wearable control device (e.g., rotation and/or linear motion, etc.) with respect to at least one axis of rotation among three axes of rotation, the second sensor data being received from one or more second sensors (e.g., second sensor(s) or IMU sensor(s) 125, or the like) disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user. Based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a UI (e.g., UI 160, or the like) of a HMD or other user device with camera input (e.g., user device 150, or the like), the computing system may generate first instructions based on the first gesture-based command and may send the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device.
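By way of a non-limiting illustration, the operational flow just described might be sketched as follows; the classifier, command table, and link objects are hypothetical stand-ins for whatever gesture model, command mapping, and transport an implementation actually uses.

```python
def quantize(angle_deg: float, step: float = 45.0) -> int:
    """Coarsely bucket a rotation angle so it can key a lookup table."""
    return int(round(angle_deg / step) * step)

def control_step(first_sensor_window, imu_rotation, gesture_classifier,
                 command_table, link):
    """One pass of the operation described above (a sketch, not normative).

    first_sensor_window: biosensor samples (first sensor data)
    imu_rotation: (rx, ry, rz) rotation estimate (second sensor data)
    gesture_classifier / command_table / link: hypothetical stand-ins
    """
    # Identify a micro-gesture of the fingers from the biosensor data.
    gesture = gesture_classifier.identify(first_sensor_window)

    # Combine the gesture with the detected rotation about the Y axis
    # (supination/pronation) to resolve a gesture-based command, if any.
    rx, ry, rz = imu_rotation
    command = command_table.get((gesture, quantize(ry)))

    # Generate and send instructions to the HMD or other user device,
    # where they serve as input for controlling the UI.
    if command is not None:
        link.send({"instruction": command})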
[0048] In some embodiments, the first gesture-based command may include, without limitation, at least one of a swipe-based command, a drag-based command, a tap command, a double tap command, a point command, a pinch-based command, a clench-based command, a rotate command, a roll command, a pitch command, or a yaw command, and/or the like. In some instances, the first instructions may include, but are not limited to, at least one of a scroll command, a move command, a select command, a confirm command, a highlight command, a return command, a cancel command, a move cursor command, or a navigate command, and/or the like.
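As a purely illustrative example, such a vocabulary could be represented as a simple mapping from identified gesture-based commands to UI instructions; the pairings below are assumptions made for this sketch, since the embodiments above leave the mapping open.

```python
# Hypothetical pairings; any combination of the commands and instructions
# listed above could be used in a given implementation.
COMMAND_TO_INSTRUCTION = {
    "swipe_up":   "scroll_up",
    "swipe_down": "scroll_down",
    "drag_up":    "move_up",
    "drag_down":  "move_down",
    "tap":        "select",
    "double_tap": "confirm",
    "point":      "move_cursor",
    "pinch":      "select",
    "clench":     "cancel",
    "roll":       "navigate",
}
```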
[0049] According to some embodiments, for the swipe-based gesture (or finger-swipe) command (not shown), the user may start from a relaxed pose of fingers (also referred to herein as "initial pose"; e.g., as shown in Figs. 3A and 3B, or the like), which may be defined by the hand of the user with the fingers (e.g., five fingers) naturally spread out and released or relaxed (i.e., without muscle tension either to straighten the fingers or to curl them (e.g., as in a fist, or the like), and the like). For a swipe-up command, a finger (such as, but not limited to, the index finger, or the like) may be lifted, e.g., at a moderate speed (herein, moderate speed may refer to a speed that is greater than a slow, deliberate shift in positioning of the finger, or the like), from the initial or relaxed pose or from a lowered position (as described below), relative to the other fingers, which remain in the initial or relaxed pose. In some cases, finger lifting may be performed at a fast speed (e.g., as defined by visual blurring of the fingers due to speed of movement, or the like). For a swipe-down command, the finger may be lowered, e.g., at a moderate speed, from either the initial or relaxed pose or the lifted position (as described above), relative to the other fingers, which remain in the initial or relaxed pose. In some cases, finger lowering may be performed at a fast speed.
[0050] In some embodiments, from the initial pose, if the user lifts the index finger, the computing system may identify such micro-gesture as a swipe-up command. Conversely, from the initial pose, if the user pulls down the index finger, the computing system may identify such micro-gesture as a swipe-down command. Taking page navigation functionality as an example, the user could use an activation gesture, like a pinch command, to open the page. Herein, the pinch command or pinch-based gesture command may include the thumb and forefinger touching or pressing together with minimal or no movement in the other fingers of the hand (not shown). Then, the user can pull down the index finger to trigger the swipe-down function to navigate down the page. If the user lifts the index finger above the initial pose, such a gesture will trigger the swipe-up function.
[0051] According to some embodiments, for the drag-based gesture (or finger-drag) command (not shown), the user may start from the initial or relaxed pose of fingers, similar to that for the swipe-based gesture (or finger-swipe) command, as described above. For a drag-up command, a finger (such as, but not limited to, the index finger, or the like) may be lifted, e.g., at a slow speed (herein, slow speed may refer to a speed that is slower than the moderate speed described above, or the like), from the initial or relaxed pose or from a lowered position (as described below), relative to the other fingers, which remain in the initial or relaxed pose. For a drag-down command, the finger may be lowered, e.g., at a slow speed, from either the initial or relaxed pose or the lifted position (as described above), relative to the other fingers, which remain in the initial or relaxed pose.
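The swipe/drag distinction above turns only on the speed of the same lift or lower motion; a minimal sketch of that decision might look as follows, with the numeric threshold being an illustrative assumption rather than a value taken from these embodiments.

```python
def classify_lift_lower(direction: str, speed: float) -> str:
    """Map a finger lift/lower plus an estimated speed to a command name.

    direction: "lift" or "lower"; speed: finger speed in arbitrary units.
    The SLOW boundary is assumed for this sketch: below it, the motion is
    the slow, deliberate shift of a drag; at or above it (moderate or
    fast), the motion is treated as a swipe, per the description above.
    """
    SLOW = 0.2  # assumed threshold (arbitrary units)
    kind = "drag" if speed < SLOW else "swipe"
    return f"{kind}_{'up' if direction == 'lift' else 'down'}"

# e.g., classify_lift_lower("lower", 0.1) -> "drag_down"
#       classify_lift_lower("lift", 1.3)  -> "swipe_up"
```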
[0052] In the case that the UI (e.g., UI 160, or the like) provides only one dimension along which a user may scroll or drag within the UI (regardless of whether or not the wrist-wearable device includes the second sensor(s) or IMU sensor(s) (e.g., second sensor(s) or IMU sensor(s) 125, or the like)), then the lifting and the lowering of the finger (as described above with respect to swipe-based commands and drag-based commands) may be identified as micro-gestures that result in the computing system (e.g., computing system(s) 115, or the like) causing the UI to perform one of scrolling or dragging along a first direction (in response to the finger lifting) or to scroll or drag along a second direction (in response to the finger lowering). For example, if the UI only provides vertical movement, then the first direction may result in the computing system causing the UI to perform one of a scroll-up function or a drag-up function, while the second direction may result in the computing system causing the UI to perform one of a scroll-down function or a drag-down function, or the like. Similarly, if the UI only provides horizontal movement, then the first direction may result in the computing system causing the UI to perform one of a scroll-left function or a drag-left function (if using the left hand; or a scroll-right function or a drag-right function if using the right hand), while the second direction may result in the computing system causing the UI to perform one of a scroll-right function or a drag-right function (if using the left hand; or a scroll-left function or a drag-left function if using the right hand), or the like.
[0053] In the case that the UI (e.g., UI 160, or the like) provides two dimensions along which a user may scroll or drag within the UI, and in the case that the wrist-wearable device includes the second sensor(s) or IMU sensor(s) (e.g., second sensor(s) or IMU sensor(s) 125, or the like), then the lifting and the lowering of the finger (as described above with respect to swipe-based commands and drag-based commands) in conjunction with motion of the user's wrist (as detected by the second sensor(s) or IMU sensor(s) 125, or the like) may be identified as micro-gestures that result in the computing system (e.g., computing system(s) 115, or the like) causing the UI to perform one of scrolling or dragging along a direction based on the degree of supination or pronation of a forearm connected to a wrist of the user on which the wrist-wearable device is being worn, as detected by the second sensor(s) or IMU sensor(s) that tracks motion of the wrist-wearable device. Herein, supination of the forearm may refer to rotation of the forearm such that the ventral side of the forearm (or palm) is facing forward and/or upward relative to a sitting or standing position of the body of the user when the user's arm is hanging by the user's side or when the forearm is lifted from the arm-hanging position (regardless of whether the fingers are spread in an open hand or are closed in a fist), while pronation of the forearm may refer to rotation of the forearm such that the ventral side of the forearm (or palm) is facing backward and/or downward relative to a sitting or standing position of the body of the user when the user's arm is hanging by the user's side or when the forearm is lifted from the arm-hanging position (regardless of whether the fingers are spread in an open hand or are closed in a fist), and a neutral position of the forearm may refer to rotation of the forearm such that the ventral side of the forearm (or palm) is facing toward the sagittal or median plane, which divides the body into the right and left parts along the midline of the body, and is partway or halfway between the supinated and pronated positions.
[0054] In a non-limiting example, if the forearm is pronated (or fully pronated), then the lifting and the lowering of the finger may be identified as micro-gestures that result in the computing system causing the UI to perform one of scrolling or dragging along the vertical direction (i.e., in the up or down direction, similar to the one-dimensional vertical movement embodiment described above), regardless of whether the user's elbow is straight, bent with forearm beside the torso, or bent with forearm in front of the torso. Similarly, if the forearm is in the neutral position such that the ventral side of the forearm (or palm) is facing toward the sagittal or median plane, then the lifting and the lowering of the finger may be identified as micro-gestures that result in the computing system causing the UI to perform one of scrolling or dragging along the horizontal direction (i.e., in the left or right direction, similar to the one-dimensional horizontal movement embodiment described above), regardless of whether the user's elbow is straight, bent with forearm beside the torso, or bent with forearm in front of the torso. If the forearm is between the supinated and pronated positions (but not at either the pronated or neutral positions), then the lifting and the lowering of the finger may be identified as micro-gestures that result in the computing system causing the UI to perform one of scrolling or dragging along a diagonal direction based on the degree of rotation between the supinated and pronated positions, regardless of whether the user's elbow is straight, bent with forearm beside the torso, or bent with forearm in front of the torso.
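A compact way to express this angle-to-direction mapping is as a unit vector whose components weight horizontal and vertical scrolling. The sketch below assumes a convention of 0° for the neutral (palm-toward-midline) position and 90° for full pronation; the convention and clamping are assumptions made for illustration.

```python
import math

def scroll_direction(pronation_deg: float):
    """Map forearm rotation to (horizontal, vertical) scroll weights.

    Assumed convention: 0 deg = neutral position (horizontal scrolling),
    90 deg = fully pronated (vertical scrolling); intermediate angles give
    the diagonal scrolling described above.
    """
    theta = math.radians(max(0.0, min(90.0, pronation_deg)))
    return (math.cos(theta), math.sin(theta))

# A finger lift scrolls along this vector; a finger lower scrolls opposite.
# e.g., scroll_direction(45.0) -> roughly (0.707, 0.707), a diagonal.
```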
[0055] In some embodiments, a machine learning system (e.g., AI system 115c or other computing system(s) 115d, or the like) may be used to further enhance operation of the wrist-wearable device by learning and fine-tuning the detection of the micro-gestures, the motion-based gesture detection, the translation-based gesture detection, and/or instances of intended gesture control by the user as distinguished from regular movements not intended to be gesture control movements, and/or the like, adapting to use by the user over time, or the like.
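One hedged way to realize such adaptation is incremental (online) learning over the gesture classifier. The sketch below uses scikit-learn's SGDClassifier purely as an example of a model supporting partial_fit; the label set, features, and "implicit confirmation" signal are assumptions made for illustration.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Assumed gesture label set for this sketch.
CLASSES = np.array(["swipe_up", "swipe_down", "drag_up", "drag_down",
                    "pinch", "no_gesture"])

clf = SGDClassifier(loss="log_loss")

def adapt(features: np.ndarray, confirmed_label: str) -> None:
    """Fine-tune on a gesture the user effectively confirmed (e.g., one the
    user did not immediately undo), so detection adapts to this user's
    movements over time."""
    clf.partial_fit(features.reshape(1, -1),
                    np.array([confirmed_label]),
                    classes=CLASSES)
```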
[0056] These and other functions of the system 100 (and its components) are described in greater detail below with respect to Figs. 2-4.
[0057] Figs. 2A-2P (collectively, "Fig. 2") are schematic diagrams illustrating various non-limiting examples 200 and 200' of a wrist-wearable control device for controlling a UI of a HMD or other user device with camera input, in accordance with various embodiments. Figs. 2A-2H depict various non-limiting examples 200 of wristband-based wearable control devices 205a-205h, respectively. Figs. 2I-2P depict various non-limiting examples 200' of wristwatch-based wearable control devices 205i-205p.

[0058] In Fig. 2, the wristband-based wearable control devices 205a-205h or the wristwatch-based wearable control devices 205i-205p (both sets of devices collectively, "wrist-wearable control devices 205" or the like) may each include, without limitation, a plurality of band portions 210 and a plurality of connectors 215. Each band portion 210 may include, but is not limited to, at least one of one or more first sensors or biosensors 220 (similar to the first sensor(s) or biosensor(s) 120 of Fig. 1, or the like), at least one second sensor(s) or IMU sensor(s) 225 (similar to the second sensor(s) or IMU sensor(s) 125 of Fig. 1, or the like), one or more outward facing lights 230 (similar to the one or more outward facing lights 140a-140n of Fig. 1, or the like), and/or the like, or may be a band portion without any of the first sensor(s) or biosensor(s) 220, the second sensor(s) or IMU sensor(s) 225, or the outward facing lights 230, or the like.
[0059] For instance, the wristband-based wearable control devices 205a-205h may include, but are not limited to: (a) a wristband-based wearable control device 205a having a plurality of band portions 210a each including one or more first sensors or biosensors 220 and at least one second sensor(s) or IMU sensor(s) 225 (as shown in Fig. 2A); (b) a wristband-based wearable control device 205b having a plurality of band portions 210b each including one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, and one or more outward facing lights 230 (as shown in Fig. 2B); (c) a wristband-based wearable control device 205c having a plurality of band portions 210c each including one or more first sensors or biosensors 220, and a single band portion 210a including both one or more first sensors or biosensors 220 and at least one second sensor(s) or IMU sensor(s) 225 (as shown in Fig. 2C); (d) a wristband-based wearable control device 205d having a plurality of band portions 210d each including one or more first sensors or biosensors 220 and one or more outward facing lights 230, and a single band portion 210b including one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, and one or more outward facing lights 230 (as shown in Fig. 2D); (e) a wristband-based wearable control device 205e having a plurality of band portions 210e without any of the first sensor(s) or biosensor(s) 220, the second sensor(s) or IMU sensor(s) 225, or the outward facing lights 230, and a single band portion 210a including both one or more first sensors or biosensors 220 and at least one second sensor(s) or IMU sensor(s) 225 (as shown in Fig. 2E); (f) a wristband-based wearable control device 205f having a plurality of band portions 210f each including one or more outward facing lights 230, and a single band portion 210b including one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, and one or more outward facing lights 230 (as shown in Fig. 2F); (g) a wristband-based wearable control device 205g having a single band portion 210g including at least one second sensor(s) or IMU sensor(s) 225, a (separate) single band portion 210c including one or more first sensors or biosensors 220, and a plurality of band portions 210e without any of the first sensor(s) or biosensor(s) 220, the second sensor(s) or IMU sensor(s) 225, or the outward facing lights 230 (as shown in Fig. 2G); and (h) a wristband-based wearable control device 205h having a single band portion 210h including at least one second sensor(s) or IMU sensor(s) 225 and one or more outward facing lights 230, a (separate) single band portion 210d including one or more first sensors or biosensors 220 and one or more outward facing lights 230, and a plurality of band portions 210f each including one or more outward facing lights 230 (as shown in Fig. 2H); and/or the like.
[0060] Each band portion 210 may be connected to an adjacent band portion(s) 210 via connector(s) 215 in a manner configured to removably wrap the wrist-wearable control device 205 around a wrist of a user (e.g., user 110, or the like) when the wrist-wearable control device 205 is worn by the user. In some instances, the connectors 215 may include elastic material-based connectors that allow the band portions to be pulled away from each other to slip over and onto (or over and off) the hand and wrist of the user. Alternatively, the connectors 215 may include a plurality of links and a clasp (e.g., made of metal, plastic, or other solid materials), or the like. Although not shown in Figs. 2A-2H, at least one band portion 210 may also include, without limitation, at least one of a computing system (similar to computing system 115 of Fig. 1, or the like) or a communications system (similar to communications system 135 of Fig. 1, or the like).
[0061] Similarly, the wristwatch-based wearable control devices 205i-205p may include, but are not limited to: (i) a wristwatch-based wearable control device 205i having a watch portion 235, a plurality of band portions 210a each including one or more first sensors or biosensors 220 and at least one second sensor(s) or IMU sensor(s) 225 (as shown in Fig. 2I); (j) a wristwatch-based wearable control device 205j having a watch portion 235, a plurality of band portions 210b each including one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, and one or more outward facing lights 230 (as shown in Fig. 2J); (k) a wristwatch-based wearable control device 205k having a watch portion 235, a plurality of band portions 210c each including one or more first sensors or biosensors 220, and a single band portion 210a including both one or more first sensors or biosensors 220 and at least one second sensor(s) or IMU sensor(s) 225 (as shown in Fig. 2K); (l) a wristwatch-based wearable control device 205l having a watch portion 235, a plurality of band portions 210d each including one or more first sensors or biosensors 220 and one or more outward facing lights 230, and a single band portion 210b including one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, and one or more outward facing lights 230 (as shown in Fig. 2L); (m) a wristwatch-based wearable control device 205m having a watch portion 235, a plurality of band portions 210e without any of the first sensor(s) or biosensor(s) 220, the second sensor(s) or IMU sensor(s) 225, or the outward facing lights 230, and a single band portion 210a including both one or more first sensors or biosensors 220 and at least one second sensor(s) or IMU sensor(s) 225 (as shown in Fig. 2M); (n) a wristwatch-based wearable control device 205n having a watch portion 235, a plurality of band portions 210f each including one or more outward facing lights 230, and a single band portion 210b including one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, and one or more outward facing lights 230 (as shown in Fig. 2N); (o) a wristwatch-based wearable control device 205o having a watch portion 235, a single band portion 210g including at least one second sensor(s) or IMU sensor(s) 225, a (separate) single band portion 210c including one or more first sensors or biosensors 220, and a plurality of band portions 210e without any of the first sensor(s) or biosensor(s) 220, the second sensor(s) or IMU sensor(s) 225, or the outward facing lights 230 (as shown in Fig. 2O); and (p) a wristwatch-based wearable control device 205p having a watch portion 235, a single band portion 210h including at least one second sensor(s) or IMU sensor(s) 225 and one or more outward facing lights 230, a (separate) single band portion 210d including one or more first sensors or biosensors 220 and one or more outward facing lights 230, and a plurality of band portions 210f each including one or more outward facing lights 230 (as shown in Fig. 2P); and/or the like.
[0062] Each band portion 210 (as well as watch portion 235) may be connected to an adjacent band portion(s) 210 via connector(s) 215 in a manner configured to removably wrap the wrist-wearable control device 205 around a wrist of a user (e.g., user 110, or the like) when the wrist-wearable control device 205 is worn by the user. In some instances, the connectors 215 may include elastic material-based connectors that allow the band portions to be pulled away from each other to slip over and onto (or over and off) the hand and wrist of the user. Alternatively, the connectors 215 may include a plurality of links and a clasp (e.g., made of metal, plastic, or other solid materials), or the like. Although not shown in Figs. 2I-2P, at least one band portion 210 may also include, without limitation, at least one of a computing system (similar to computing system 115 of Fig. 1, or the like) or a communications system (similar to communications system 135 of Fig. 1, or the like).

[0063] Although not shown in Figs. 2I-2P, watch portion 235 (aside from including watch components or smartwatch components) may include, but is not limited to, at least one of a computing system (similar to computing system 115 of Fig. 1, or the like), a communications system (similar to communications system 135 of Fig. 1, or the like), one or more first sensors or biosensors 220, at least one second sensor(s) or IMU sensor(s) 225, or one or more outward facing lights 230, and/or the like.
[0064] These and other functions of the examples 200 and 200' (and their components) are described in greater detail herein with respect to Figs. 1, 3, and 4.
[0065] Figs. 3A and 3B (collectively, "Fig. 3") are schematic diagrams illustrating various non-limiting examples 300 and 300' of the use of a wrist-wearable control device for controlling a UI of a HMD (Fig. 3A) or other user device with camera input (Fig. 3B), in accordance with various embodiments.
[0066] In Fig. 3, wrist-wearable device 205 (similar to wrist-wearable devices 105 and 205 of Figs. 1 and 2, or the like) is shown being worn on a wrist 305a of a user (similar to user 110 of Fig. 1, or the like). The wrist-wearable device 205 may include, without limitation, a plurality of band portions 210 and a plurality of connectors 215. In the case that wrist-wearable device 205 comprises a wristwatch-based wearable control device (similar to wristwatch-based wearable control devices 205i-205p of Figs. 2I-2P, or the like), the wrist-wearable device 205 may further include watch portion 235 (not shown in Fig. 3; similar to watch portion 235 of Figs. 2I-2P, or the like). Each band portion 210 (and, in some cases, each watch portion 235, as well) may include, but is not limited to, at least one of one or more first sensors or biosensors 220 (not shown in Fig. 3; similar to the first sensor(s) or biosensor(s) 120 and 220 of Figs. 1 and 2, or the like), at least one second sensor(s) or IMU sensor(s) 225 (not shown in Fig. 3; similar to the second sensor(s) or IMU sensor(s) 125 and 225 of Figs. 1 and 2, or the like), or one or more outward facing lights 230 (similar to the one or more outward facing lights 140a-140n and 230 of Figs. 1 and 2, or the like), and/or the like.
[0067] With reference to the non-limiting embodiment 300 of Fig. 3A, HMD 310a (similar to user device 150 of Fig. 1, or the like) may include, without limitation, at least one processor (not shown; similar to the one or more processors 150a of Fig. 1, or the like), at least one transceiver (not shown; similar to transceiver 150b of Fig. 1, or the like), one or more cameras (not shown; similar to the one or more cameras 150c of Fig. 1, or the like), display device 315a (similar to display device 150d of Fig. 1, or the like), and UI 320a (similar to UI 160 of Fig. 1, or the like), or the like. In some instances, HMD 310a may include, but is not limited to, one of a set of virtual reality ("VR") goggles, a set of augmented reality ("AR") goggles, a set of mixed reality ("MR") goggles, a pair of VR-enabled eyewear, a pair of AR-enabled eyewear, a pair of MR-enabled eyewear, a VR-enabled smartphone mounted in a headset, an AR-enabled smartphone mounted in a headset, or a MR-enabled smartphone mounted in a headset, and/or the like.
[0068] Similarly, in the non-limiting embodiment 300' of Fig. 3B, user device 310b (similar to user device 150 of Fig. 1, or the like) may include, without limitation, at least one processor (not shown; similar to the one or more processors 150a of Fig. 1, or the like), at least one transceiver (not shown; similar to transceiver 150b of Fig. 1, or the like), one or more cameras 340 (similar to the one or more cameras 150c of Fig. 1, or the like), display screen or device 315b (similar to display device 150d of Fig. 1, or the like), and UI 320b (similar to UI 160 of Fig. 1, or the like), or the like. Although camera(s) 340 is depicted in Fig. 3B as an external yet communicatively coupled camera(s), the various embodiments are not so limited, and camera(s) 340 may be integrated within user device 310b. In some instances, the user device 310b may include, without limitation, one of a smart television ("TV") with built-in camera, a smart TV with externally connected camera, a gaming console with built-in camera, a gaming console with externally connected camera, a monitor with built-in camera, a monitor with externally connected camera, or a projector with externally connected camera, and/or the like. In some cases, the monitor or projector may also be communicatively coupled with one of a smart phone, a tablet computer, a laptop computer, a desktop computer, or a portable gaming device, and/or the like, that presents content for display on the monitor or projector.
[0069] In Figs. 3A and 3B, UI 320a or 320b (collectively, "UI 320" or the like), which is presented within display device 315a or 315b (collectively, "display device 315" or the like), may display or present content (in this case, a menu screen with a plurality of system icons 325 (in this case, icons of connected VR headset and mobile device, or the like) and/or a plurality of icons 330 associated with a corresponding plurality of software applications ("apps"), or the like, although not limited to such). In some cases (although not shown), the content that may be displayed or presented may include icons, text, graphics, objects, thumbnails, or picture-in-picture ("PIP")-type mini-windows of media content including, but not limited to, video content, image content, representations of audio content (e.g., album cover art, artist photograph or poster, etc.), game content, presentation content, etc.
[0070] In some embodiments, the one or more first sensors or biosensors 220 (including, but not limited to, at least one of PPG sensor(s), EMG sensor(s), sound transducer(s), or motion transducer(s), and/or the like) may detect biological characteristics that correspond to motion of soft tissue in or near a wrist of the user on which the wrist-wearable device is being worn, as described in detail above with respect to Fig. 1. In some cases, the one or more first sensors or biosensors 220 may be used to track, monitor, and/or identify gesture-based commands, including, without limitation, at least one of a swipe-based command (e.g., with finger 305c lifting away from or lowering toward thumb 305d, at moderate speed, with other fingers of hand 305b remaining in the initial or relaxed pose, as described above with respect to Fig. 1, or the like), a drag-based command (e.g., with finger 305c lifting away from or lowering toward thumb 305d, at slow speed, with other fingers of hand 305b remaining in the initial or relaxed pose, as described above with respect to Fig. 1, or the like), a tap command (e.g., with finger 305c lowering toward thumb 305d, at fast speed, or with finger 305c crooked or curved, with other fingers of hand 305b remaining in the initial or relaxed pose, or the like), a double tap command (e.g., similar to the tap command, but with a pair of such gestures in quick succession, or the like), a point command (e.g., with finger 305c straightening, with other fingers of hand 305b remaining in the initial or relaxed pose, or the like), a pinch-based command (e.g., with finger 305c moving toward and touching or pressing against thumb 305d, with other fingers of hand 305b remaining in the initial or relaxed pose, as described above with respect to Fig. 1, or the like), or a clench-based command (e.g., with finger 305c and other fingers of hand 305b moving toward thumb 305d in a squeezing motion, without the fingers touching thumb 305d, or the like), and/or the like. In some instances, the wrist-wearable device 205 may generate instructions based on the tracked, monitored, and/or identified gesture-based commands, the instructions including, but not limited to, at least one of a scroll command, a move command, a select command, a confirm command, a highlight command, a return command, or a cancel command, and/or the like.
[0071] According to some embodiments, the at least one second sensor(s) or IMU sensor(s) 225 (including, but not limited to, at least one of an inertial measurement unit ("IMU") sensor or a gyroscope-based sensor, and/or the like) may detect motion of the wrist-wearable device 205 with respect to at least one axis of rotation among three axes of rotation (e.g., rotation Rx, Ry, and/or Rz (also referred to as pitch, roll, and/or yaw, respectively) with respect to the X-axis, Y-axis, and/or Z-axis, respectively, as shown in Fig. 3, or the like). In some instances, the Y-axis may be parallel to at least one of a longitudinal extension of the user's arm or a main axis of the wrist-wearable device 205, and/or the like, while the X-axis may lie within a plane that is parallel to a plane defined by one of a dorsal side of the forearm or a ventral side of the forearm, and the Z-axis may be parallel to a line that is perpendicular to the plane defined by the one of the dorsal side of the forearm or the ventral side of the forearm. With reference to Figs. 1 and 3, supination or pronation may refer to rotation Ry with respect to the Y-axis, with direction of rotation being dependent on which arm the wrist-wearable device is worn. In some instances, the wrist-wearable device 205 may generate instructions based on the detected motion of the wrist-wearable device 205 with respect to at least one axis of rotation among the three axes of rotation, the instructions including, but not limited to, at least one of a move cursor command or a navigate command, and/or the like.
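For concreteness, the rotation estimate (Rx, Ry, Rz) can be recovered from an IMU orientation quaternion with a standard conversion. The sketch below assumes the axis convention described above (Y along the forearm, so Ry captures supination/pronation), a w-x-y-z quaternion order, and one common Tait-Bryan convention; other conventions are equally possible.

```python
import math

def quaternion_to_rxyz(w: float, x: float, y: float, z: float):
    """Convert a unit quaternion to rotations (Rx, Ry, Rz) in radians about
    the X, Y, and Z axes (pitch, roll, and yaw as labeled above)."""
    rx = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    sin_ry = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))  # clamp for safety
    ry = math.asin(sin_ry)  # Ry: supination/pronation of the forearm
    rz = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return rx, ry, rz
```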
[0072] In some embodiments, the one or more outward facing lights 230 may be disposed on the one or more band portions 210, and, in some cases, may be arranged in a predetermined pattern on the one or more band portions and may serve as tracking points for one or more cameras 340 of user device 310a or 310b (collectively, "user device 310" or the like) to track translation movement of the wrist-wearable device 205 along the three axes (e.g., the X-axis, Y-axis, and/or Z-axis, or the like) relative to the user device 310. In some cases, the tracking of translation movement of the wrist-wearable device along the three axes may serve as additional input for controlling the UI 320 displayed within display device 315 of user device 310.
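On the camera side, one conventional way to recover the device's translation from such tracking points is a perspective-n-point solve. The sketch below uses OpenCV's solvePnP; the LED layout coordinates and the four-point pattern are illustrative assumptions, not a layout specified by these embodiments.

```python
import cv2
import numpy as np

# Assumed positions of four LEDs on the band, in the device frame (meters).
LED_MODEL = np.array([[0.000, 0.000, 0.0],
                      [0.020, 0.000, 0.0],
                      [0.020, 0.015, 0.0],
                      [0.000, 0.015, 0.0]], dtype=np.float32)

def device_translation(led_pixels, camera_matrix, dist_coeffs):
    """Estimate the device's (x, y, z) translation relative to the camera.

    led_pixels: (4, 2) array of detected LED centers in the image, ordered
    to match LED_MODEL. The returned translation can serve as the
    additional three-axis UI input described above; returns None on failure.
    """
    ok, rvec, tvec = cv2.solvePnP(LED_MODEL,
                                  np.asarray(led_pixels, dtype=np.float32),
                                  camera_matrix, dist_coeffs)
    return tvec.ravel() if ok else None
```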
[0073] The combination of the one or more first sensors or biosensors 220, the at least one second sensor(s) or IMU sensor(s) 225, and the one or more outward facing lights 230 enables highly accurate six-degrees-of-freedom ("6DOF") control of the UI 320 displayed within display device 315 of user device 310, by combining micro gesture-based UI control (using the one or more first sensors or biosensors 220), motion detection-based UI control (using the at least one second sensor(s) or IMU sensor(s) 225), and translation detection-based UI control (using the one or more outward facing lights 230 as tracking points for camera(s) 340 of the user device 310 to serve as additional UI control input). In some cases, there is some overlap between two or more of the micro gesture-based UI control, the motion detection-based UI control, and the translation detection-based UI control. For instance, as shown in the non-limiting embodiments 300 and 300' of Figs. 3A and 3B, respectively, 6DOF control may be used to navigate within the UI 320. According to some embodiments, navigating and interacting within the UI 320 may include selecting or highlighting icons, text, graphics, objects, thumbnails, or picture-in-picture ("PIP")-type mini-windows of media content, and/or the like (collectively, "UI objects 330" or the like) (as depicted, e.g., by bounding box 335 being superimposed over app icons in Fig. 3, or the like), moving the selection or highlighting field (e.g., bounding box 335, or the like) from one UI object to another, and/or the like, based at least in part on the 6DOF UI control. Alternatively, or additionally, with or without selection or highlighting fields, navigating and interacting within the UI 320 may include scrolling or paging through menus, pages, windows, and/or other UI display regions, or the like, as well as responding to prompts (with micro gesture-based acceptance or dismissal input commands, or with micro gesture-based "yes" or "no" input commands, or with micro gesture-based "OK" or "cancel" input commands, etc., in response to the prompts), or the like.
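The resulting 6DOF input can be thought of as the union of the three streams; a minimal data-shape sketch follows, with field names assumed for illustration. When the translation stream drops out (as discussed next), the same state simply degrades to the 3DOF case.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SixDofInput:
    rotation: Tuple[float, float, float]               # Rx, Ry, Rz (IMU)
    translation: Optional[Tuple[float, float, float]]  # x, y, z (camera/LEDs)
    gesture: Optional[str]                             # identified micro-gesture

def fuse(imu_rotation, camera_translation, gesture) -> SixDofInput:
    """Combine the three input streams; translation may be None when the
    LEDs are outside the camera's FOV or lighting defeats tracking."""
    return SixDofInput(imu_rotation, camera_translation, gesture)
```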
[0074] In some instances, the combination of the one or more first sensors or biosensors 220, the at least one second sensor(s) or IMU sensor(s) 225, and the one or more outward facing lights 230 may also complement and/or supplement gesture-based inputs or commands, e.g., in the case that conditions or circumstances provide less than ideal tracking or detection of one or more of the corresponding gesture-based commands. For example, under low light or dark conditions (or extremely bright conditions), camera(s) 340 may encounter issues with properly tracking the positions and/or orientations of the one or more outward facing lights 230. This may also occur if the wrist-wearable device 205 (and thus the one or more outward facing lights 230) moves outside the field of view of the camera(s) 340 (which is more likely in the case of HMD embodiments). In these situations, the micro gesture-based control and/or the motion detection-based control (which constitute three-degrees-of-freedom ("3DOF") control) may still enable navigating and interacting within the UI 320. In some cases, multiple wrist-wearable devices 205 may be used (one on each arm of the user), to provide additional 6DOF control of the UI 320, and may be useful for immersive experiences (e.g., game play or AR/VR/MR experiences, or the like), in which one wrist-wearable device may be used for navigating within an environment displayed or presented in the UI 320, while the other wrist-wearable device may be used for navigating and interacting with UI objects or sub-UIs within said environment within the UI 320, or the like.
[0075] In some embodiments, a machine learning system (e.g., AI system 115c or other computing system(s) 115d of Fig. 1, or the like) may be used to further enhance operation of the wrist-wearable device 205 by learning and fine-tuning the detection of the micro-gestures, the motion-based gesture detection, the translation-based gesture detection, and/or instances of intended gesture control by the user as distinguished from regular movements not intended to be gesture control movements, and/or the like, adapting to use by the user over time, or the like.

[0076] These and other functions of the examples 300 and 300' (and their components) are described in greater detail herein with respect to Figs. 1, 2, and 4.
[0077] Fig. 4 is a flow diagram illustrating a method 400 for implementing a wrist- wearable control device for controlling a UI of a HMD or other user device with camera input, in accordance with various embodiments.
[0078] While the techniques and procedures are depicted and/or described in a certain order for purposes of illustration, it should be appreciated that certain procedures may be reordered and/or omitted within the scope of various embodiments. Moreover, while the method 400 illustrated by Fig. 4 can be implemented by or with (and, in some cases, are described below with respect to) the systems, examples, or embodiments 100, 200, 200', 300, and 300' of Figs. 1, 2A-2H, 2I-2P, 3A, and 3B, respectively (or components thereof), such methods may also be implemented using any suitable hardware (or software) implementation. Similarly, while each of the systems, examples, or embodiments 100, 200, 200', 300, and 300' of Figs. 1, 2A-2H, 2I-2P, 3A, and 3B, respectively (or components thereof), can operate according to the method 400 illustrated by Fig. 4 (e.g., by executing instructions embodied on a computer readable medium), the systems, examples, or embodiments 100, 200, 200', 300, and 300' of Figs. 1, 2A-2H, 2I-2P, 3A, and 3B can each also operate according to other modes of operation and/or perform other suitable procedures.
[0079] In the non-limiting embodiment of Fig. 4, method 400, at block 405, may comprise receiving, using a computing system, first sensor data from one or more first sensors disposed on a wrist-wearable control device when the wrist-wearable control device is being worn by a user. The one or more first sensors may be configured to detect gestures of one or more fingers of the user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device. At block 410, method 400 may comprise analyzing, using the computing system, the first sensor data to identify a first gesture corresponding to movement of the one or more fingers of the user.

[0080] In some embodiments, the computing system may include, without limitation, at least one of one or more processors on the wrist-wearable control device, one or more processors on a mobile device that is communicatively coupled with the wrist-wearable control device, a machine learning system, an artificial intelligence ("AI") system, a deep learning system, a neural network, a convolutional neural network ("CNN"), or a fully convolutional network ("FCN"), and/or the like.
[0081] According to some embodiments, the one or more first sensors may each include at least one biosensor, the at least one biosensor may include, without limitation, at least one of one or more photoplethysmography ("PPG") sensors, one or more electromyography ("EMG") sensors, one or more sound transducers, or one or more motion transducers, and/or the like, each configured to detect biological characteristics that correspond to motion of soft tissue in or near a wrist of the user on which the wrist-wearable control device is being worn. In some cases, the soft tissue may include, but is not limited to, at least one of one or more muscles, one or more tendons, or one or more blood vessels, and/or the like, in or near the wrist of the user. In some cases, the one or more PPG sensors may be configured to optically detect blood volume changes in the one or more blood vessels below skin tissue at or near the wrist of the user corresponding to movement of the one or more fingers of the user. In some instances, the one or more EMG sensors may be configured to detect electrical activity of the one or more muscles at or near the wrist of the user corresponding to movement of the one or more fingers of the user. In some cases, the one or more sound transducers may be configured to detect sounds within a body of the user corresponding to movement of the one or more fingers of the user. In some instances, the one or more motion transducers may be configured to detect motion of the one or more muscles, the one or more tendons, and the one or more blood vessels corresponding to movement of the one or more fingers of the user.

[0082] Method 400, at block 415, may comprise receiving, using the computing system, second sensor data from one or more second sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user. The one or more second sensors may be configured to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation. At block 420, method 400 may comprise analyzing, using the computing system, the second sensor data to detect motion of the wrist-wearable control device with respect to the at least one axis of rotation among the three axes of rotation.
[0083] In some embodiments, the one or more second sensors may each include, without limitation, at least one of an inertial measurement unit ("IMU") sensor or a gyroscope-based sensor, and/or the like. In some cases, the wrist-wearable control device may include, but is not limited to, one of a wristwatch-based wearable control device or a wristband-based wearable control device, or the like, each including one or more band portions that may be linked to adjacent band portions by connectors in a manner configured to removably wrap the wrist-wearable control device around a wrist of the user. In some instances, at least one first sensor among the one or more first sensors and at least one second sensor among the one or more second sensors may be disposed within each of the one or more band portions. Alternatively, the one or more first sensors and the one or more second sensors may each be disposed within only one band portion among the one or more band portions, wherein the band portion containing the one or more first sensors may be one of the same band portion containing the one or more second sensors or a separate band portion from the band portion containing the one or more second sensors. Alternatively, at least one first sensor among the one or more first sensors may be disposed within each of the one or more band portions while the one or more second sensors may be disposed within only one band portion among the one or more band portions.
[0084] Method 400 may further comprise, at block 425, determining, using the computing system, whether the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device ("HMD") or other user device with camera input. If so, method 400 may further comprise: generating, using the computing system, first instructions based on the first gesture-based command (block 430); and sending, using the computing system, the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device (block 435).
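A compact illustration of blocks 425-435 follows: a lookup from a (gesture, rotation-axis) pair to a UI command, with the generated instructions dispatched to the HMD. The command table and payload format below are hypothetical; the specification leaves the concrete mapping and the transport (e.g., Bluetooth) to the implementation.

```python
# Hypothetical (gesture, rotation-axis) -> UI instruction table; the entries
# below are illustrative only and are not taken from the specification.
COMMAND_TABLE = {
    ("swipe", None): "scroll",
    ("pinch", None): "select",
    ("clench", None): "confirm",
    ("point", "yaw"): "move_cursor",
    ("tap", None): "highlight",
}

def build_instruction(gesture: str, rotation_axes: list):
    """Blocks 425-435 in miniature: look up the gesture/motion pair and,
    if it maps to a known command, emit an instruction payload."""
    axis = rotation_axes[0] if rotation_axes else None
    command = COMMAND_TABLE.get((gesture, axis))
    if command is None:
        return None   # not a recognized gesture-based command
    return {"type": "ui_command", "command": command, "axis": axis}

def send_to_hmd(instruction: dict) -> None:
    """Stand-in for the wireless link to the HMD or other user device."""
    print("sending:", instruction)

msg = build_instruction("point", ["yaw"])
if msg is not None:
    send_to_hmd(msg)   # sending: {'type': 'ui_command', 'command': 'move_cursor', 'axis': 'yaw'}
```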
[0085] In some instances, the HMD or other user device with camera input may include, without limitation, one of a set of virtual reality ("VR") goggles, a set of augmented reality ("AR") goggles, a set of mixed reality ("MR") goggles, a pair of VR-enabled eyewear, a pair of AR-enabled eyewear, a pair of MR-enabled eyewear, a VR-enabled smartphone mounted in a headset, an AR-enabled smartphone mounted in a headset, an MR-enabled smartphone mounted in a headset, a smart television ("TV") with built-in camera, a smart TV with externally connected camera, a gaming console with built-in camera, a gaming console with externally connected camera, a monitor with built-in camera, a monitor with externally connected camera, or a projector with externally connected camera, and/or the like.
[0086] According to some embodiments, the first gesture-based command may include, but is not limited to, at least one of a swipe-based command, a drag-based command, a tap command, a double tap command, a point command, a pinch-based command, a clench-based command, a rotate command, a roll command, a pitch command, or a yaw command, and/or the like. In some instances, the first instructions may include, without limitation, at least one of a scroll command, a move command, a select command, a confirm command, a highlight command, a return command, a cancel command, a move cursor command, or a navigate command, and/or the like.

[0087] In some embodiments, method 400 may further comprise, at block 440, activating, using the computing system, a plurality of outward-facing lights disposed on one or more band portions of the wrist-wearable control device that are arranged in a predetermined pattern on the one or more band portions and that serve as tracking points for one or more cameras of the HMD or the other user device to track translation movement of the wrist-wearable control device along three axes relative to the HMD or the other user device. In some cases, the plurality of outward-facing lights may include, but is not limited to, at least one of a plurality of infrared ("IR") light emitting diodes ("LEDs") or a plurality of colored LEDs, and/or the like.
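To make the tracking-point scheme of paragraph [0087] concrete, the following sketch (assuming OpenCV 4.x) thresholds a grayscale camera frame to find the band's bright LED blobs, then derives a crude translation estimate: the mean blob centroid tracks motion in the image plane, while the spread between blobs serves as a proxy for distance from the camera. All names, constants, and the calibration shortcut are assumptions of this example.

```python
import cv2
import numpy as np

def track_band_lights(frame_gray: np.ndarray, min_brightness: int = 200):
    """Find bright blobs (the band's IR or colored LEDs) and return a crude
    translation estimate: mean centroid for x/y motion in the image plane,
    and blob spread as a stand-in for distance along z (it shrinks as the
    band moves away from the camera)."""
    _, mask = cv2.threshold(frame_gray, min_brightness, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for contour in contours:
        m = cv2.moments(contour)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    if not centroids:
        return None
    pts = np.array(centroids)
    return {"xy": pts.mean(axis=0).tolist(),   # translation along x and y
            "z_proxy": float(pts.std())}       # proxy for distance along z

# Synthetic 480x640 frame with three bright dots standing in for LEDs.
frame = np.zeros((480, 640), dtype=np.uint8)
for x, y in [(300, 200), (320, 210), (340, 205)]:
    cv2.circle(frame, (x, y), 4, 255, -1)
print(track_band_lights(frame))
```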
[0088] Examples of System and Hardware Implementation
[0089] Fig. 5 is a block diagram illustrating an example of computer or system hardware architecture, in accordance with various embodiments. Fig. 5 provides a schematic illustration of one embodiment of a computer system 500 of the service provider system hardware that can perform the methods provided by various other embodiments, as described herein, and/or can perform the functions of the computer or hardware system (i.e., wrist-wearable devices 105, 205a-205p, and 205, computing system 115, display screen 130, and user devices 150, 310a, and 310b, etc.), as described above. It should be noted that Fig. 5 is meant only to provide a generalized illustration of various components, of which one or more (or none) of each may be utilized as appropriate. Fig. 5, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
[0090] The computer or hardware system 500 - which might represent an embodiment of the computer or hardware system (i.e., wrist-wearable devices 105, 205a-205p, and 205, computing system 115, display screen 130, and user devices 150, 310a, and 310b, etc.), described above with respect to Figs. 1-4 - is shown comprising hardware elements that can be electrically coupled via a bus 505 (or may otherwise be in communication, as appropriate). The hardware elements may include one or more processors 510, including, without limitation, one or more general-purpose processors and/or one or more special-purpose processors (such as microprocessors, digital signal processing chips, graphics acceleration processors, and/or the like); one or more input devices 515, which can include, without limitation, a mouse, a keyboard, and/or the like; and one or more output devices 520, which can include, without limitation, a display device, a printer, and/or the like.
[0091] The computer or hardware system 500 may further include (and/or be in communication with) one or more storage devices 525, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, or a solid-state storage device such as a random access memory ("RAM") and/or a read-only memory ("ROM"), which can be programmable, flash-updateable, and/or the like. Such storage devices may be configured to implement any appropriate data stores, including, without limitation, various file systems, database structures, and/or the like.
[0092] The computer or hardware system 500 might also include a communications subsystem 530, which can include, without limitation, a modem, a network card (wireless or wired), an infra-red communication device, a wireless communication device and/or chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a WWAN device, cellular communication facilities, etc.), and/or the like. The communications subsystem 530 may permit data to be exchanged with a network (such as the network described below, to name one example), with other computer or hardware systems, and/or with any other devices described herein. In many embodiments, the computer or hardware system 500 will further comprise a working memory 535, which can include a RAM or ROM device, as described above.
[0093] The computer or hardware system 500 also may comprise software elements, shown as being currently located within the working memory 535, including an operating system 540, device drivers, executable libraries, and/or other code, such as one or more application programs 545, which may comprise computer programs provided by various embodiments (including, without limitation, hypervisors, VMs, and the like), and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the method(s) discussed above might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.

[0094] A set of these instructions and/or code might be encoded and/or stored on a non-transitory computer readable storage medium, such as the storage device(s) 525 described above. In some cases, the storage medium might be incorporated within a computer system, such as the system 500. In other embodiments, the storage medium might be separate from a computer system (i.e., a removable medium, such as a compact disc, etc.), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon. These instructions might take the form of executable code, which is executable by the computer or hardware system 500 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer or hardware system 500 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.) then takes the form of executable code.
[0095] It will be apparent to those skilled in the art that substantial variations may be made in accordance with particular requirements. For example, customized hardware (such as programmable logic controllers, field-programmable gate arrays, application-specific integrated circuits, and/or the like) might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
[0096] As mentioned above, in one aspect, some embodiments may employ a computer or hardware system (such as the computer or hardware system 500) to perform methods in accordance with various embodiments of the invention. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer or hardware system 500 in response to processor 510 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 540 and/or other code, such as an application program 545) contained in the working memory 535. Such instructions may be read into the working memory 535 from another computer readable medium, such as one or more of the storage device(s) 525. Merely by way of example, execution of the sequences of instructions contained in the working memory 535 might cause the processor(s) 510 to perform one or more procedures of the methods described herein.

[0097] The terms "machine readable medium" and "computer readable medium," as used herein, refer to any medium that participates in providing data that causes a machine to operate in some fashion. In an embodiment implemented using the computer or hardware system 500, various computer readable media might be involved in providing instructions/code to processor(s) 510 for execution and/or might be used to store and/or carry such instructions/code (e.g., as signals). In many implementations, a computer readable medium is a non-transitory, physical, and/or tangible storage medium. In some embodiments, a computer readable medium may take many forms, including, but not limited to, non-volatile media, volatile media, or the like. Non-volatile media includes, for example, optical and/or magnetic disks, such as the storage device(s) 525. Volatile media includes, without limitation, dynamic memory, such as the working memory 535. In some alternative embodiments, a computer readable medium may take the form of transmission media, which includes, without limitation, coaxial cables, copper wire, and fiber optics, including the wires that comprise the bus 505, as well as the various components of the communication subsystem 530 (and/or the media by which the communications subsystem 530 provides communication with other devices). In an alternative set of embodiments, transmission media can also take the form of waves (including without limitation radio, acoustic, and/or light waves, such as those generated during radio-wave and infra-red data communications).

[0098] Common forms of physical and/or tangible computer readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read instructions and/or code.
[0099] Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 510 for execution. Merely by way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer or hardware system 500. These signals, which might be in the form of electromagnetic signals, acoustic signals, optical signals, and/or the like, are all examples of carrier waves on which instructions can be encoded, in accordance with various embodiments of the invention.
[0100] The communications subsystem 530 (and/or components thereof) generally will receive the signals, and the bus 505 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 535, from which the processor(s) 510 retrieves and executes the instructions. The instructions received by the working memory 535 may optionally be stored on a storage device 525 either before or after execution by the processor(s) 510.
[0101] While particular features and aspects have been described with respect to some embodiments, one skilled in the art will recognize that numerous modifications are possible. For example, the methods and processes described herein may be implemented using hardware components, software components, and/or any combination thereof. Further, while various methods and processes described herein may be described with respect to particular structural and/or functional components for ease of description, methods provided by various embodiments are not limited to any particular structural and/or functional architecture but instead can be implemented on any suitable hardware, firmware and/or software configuration. Similarly, while particular functionality is ascribed to particular system components, unless the context dictates otherwise, this functionality need not be limited to such and can be distributed among various other system components in accordance with the several embodiments.
[0102] Moreover, while the procedures of the methods and processes described herein are described in a particular order for ease of description, unless the context dictates otherwise, various procedures may be reordered, added, and/or omitted in accordance with various embodiments. Moreover, the procedures described with respect to one method or process may be incorporated within other described methods or processes; likewise, system components described according to a particular structural architecture and/or with respect to one system may be organized in alternative structural architectures and/or incorporated within other described systems. Hence, while various embodiments are described with — or without — particular features for ease of description and to illustrate some aspects of those embodiments, the various components and/or features described herein with respect to a particular embodiment can be substituted, added and/or subtracted from among other described embodiments, unless the context dictates otherwise. Consequently, although several embodiments are described above, it will be appreciated that the invention is intended to cover all modifications and equivalents within the scope of the following claims.

Claims

WHAT IS CLAIMED IS:
1. A wrist-wearable control device, comprising: one or more first sensors that are configured to detect gestures of one or more fingers of a user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device; one or more second sensors that are configured to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation; at least one processor; and a non-transitory computer readable medium communicatively coupled to the at least one processor, the non-transitory computer readable medium having stored thereon computer software comprising a set of instructions that, when executed by the at least one processor, causes the wrist-wearable control device to: analyze first sensor data to identify a first gesture corresponding to movement of one or more fingers of the user, the first sensor data being received from the one or more first sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; analyze second sensor data to detect motion of the wrist-wearable control device with respect to the at least one axis of rotation among the three axes of rotation, the second sensor data being received from the one or more second sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; and based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device ("HMD") or other user device with camera input, generate first instructions based on the first gesture-based command and send the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device.
2. The wrist-wearable control device of claim 1, wherein the wrist-wearable control device comprises one of a wristwatch-based wearable control device or a wristband-based wearable control device, each comprising one or more band portions that are linked to adjacent band portions by connectors in a manner configured to removably wrap the wrist-wearable control device around a wrist of the user.
3. The wrist-wearable control device of claim 2, wherein at least one first sensor among the one or more first sensors and at least one second sensor among the one or more second sensors are disposed within each of the one or more band portions.
4. The wrist-wearable control device of claim 2, wherein the one or more first sensors and the one or more second sensors are each disposed within only one band portion among the one or more band portions, wherein the band portion containing the one or more first sensors may be one of the same band portion containing the one or more second sensors or a separate band portion from the band portion containing the one or more second sensors.
5. The wrist-wearable control device of claim 2, wherein at least one first sensor among the one or more first sensors is disposed within each of the one or more band portions while the one or more second sensors are disposed within only one band portion among the one or more band portions.
6. The wrist-wearable control device of any of claims 1-5, wherein the HMD or other user device with camera input comprises one of a set of virtual reality ("VR") goggles, a set of augmented reality ("AR") goggles, a set of mixed reality ("MR") goggles, a pair of VR-enabled eyewear, a pair of AR-enabled eyewear, a pair of MR-enabled eyewear, a VR-enabled smartphone mounted in a headset, an AR-enabled smartphone mounted in a headset, an MR-enabled smartphone mounted in a headset, a smart television ("TV") with built-in camera, a smart TV with externally connected camera, a gaming console with built-in camera, a gaming console with externally connected camera, a monitor with built-in camera, a monitor with externally connected camera, or a projector with externally connected camera.
7. The wrist-wearable control device of any of claims 1-6, wherein the one or more first sensors each comprises at least one biosensor, the at least one biosensor comprising at least one of one or more photoplethysmography ("PPG") sensors, one or more electromyography ("EMG") sensors, one or more sound transducers, or one or more motion transducers, each configured to detect biological characteristics that correspond to motion of soft tissue in or near a wrist of the user on which the wrist-wearable control device is being worn, wherein the soft tissue comprises at least one of one or more muscles, one or more tendons, or one or more blood vessels in or near the wrist of the user.
8. The wrist-wearable control device of claim 7, wherein the one or more PPG sensors are configured to optically detect blood volume changes in the one or more blood vessels below skin tissue at or near the wrist of the user corresponding to movement of the one or more fingers of the user, wherein the one or more EMG sensors are configured to detect electrical activity of the one or more muscles at or near the wrist of the user corresponding to movement of the one or more fingers of the user, wherein the one or more sound transducers are configured to detect sounds within a body of the user corresponding to movement of the one or more fingers of the user, wherein the one or more motion transducers are configured to detect motion of the one or more muscles, the one or more tendons, and the one or more blood vessels corresponding to movement of the one or more fingers of the user.
9. The wrist-wearable control device of any of claims 1-8, wherein the one or more second sensors each comprises at least one of an inertial measurement unit ("IMU") sensor or a gyroscope-based sensor.
10. The wrist-wearable control device of any of claims 1-9, further comprising: one or more band portions; and a plurality of outward-facing lights disposed on the one or more band portions, wherein the plurality of outward-facing lights are arranged in a predetermined pattern on the one or more band portions and serve as tracking points for one or more cameras of the HMD or the other user device to track translation movement of the wrist-wearable control device along three axes relative to the HMD or the other user device, wherein the tracking of translation movement of the wrist-wearable control device along the three axes serves as additional input for controlling the UI of the HMD or the other user device.
11. The wrist-wearable control device of claim 10, wherein the plurality of outward-facing lights comprises at least one of a plurality of infrared ("IR") light emitting diodes ("LEDs") or a plurality of colored LEDs.
12. The wrist-wearable control device of any of claims 1-11, wherein the first gesture-based command comprises at least one of a swipe-based command, a drag-based command, a tap command, a double tap command, a point command, a pinch-based command, a clench-based command, a rotate command, a roll command, a pitch command, or a yaw command, wherein the first instructions comprise at least one of a scroll command, a move command, a select command, a confirm command, a highlight command, a return command, a cancel command, a move cursor command, or a navigate command.
13. A method, comprising: analyzing, using a computing system, first sensor data to identify a first gesture corresponding to movement of one or more fingers of a user, the first sensor data being received from one or more first sensors disposed on a wrist-wearable control device when the wrist-wearable control device is being worn by the user, the one or more first sensors being configured to detect gestures of the one or more fingers of the user, when the wrist-wearable control device is being worn by the user, without any fingers of the user touching the wrist-wearable control device; analyzing, using the computing system, second sensor data to detect motion of the wrist-wearable control device with respect to at least one axis of rotation among three axes of rotation, the second sensor data being received from one or more second sensors disposed on the wrist-wearable control device when the wrist-wearable control device is being worn by the user; and based on a determination that the detected motion of the wrist-wearable control device in conjunction with the first gesture corresponds to a first gesture-based command for controlling a user interface ("UI") of a head-mounted device ("HMD") or other user device with camera input, generating, using the computing system, first instructions based on the first gesture-based command and sending, using the computing system, the generated first instructions to the HMD or the other user device, the first instructions serving as input for controlling the UI of the HMD or the other user device.
14. The method of claim 13, wherein the computing system comprises at least one of one or more processors on the wrist-wearable control device, one or more processors on a mobile device that is communicatively coupled with the wrist-wearable control device, a machine learning system, an artificial intelligence ("AI") system, a deep learning system, a neural network, a convolutional neural network ("CNN"), or a fully convolutional network ("FCN").
15. The method of claim 13 or 14, wherein the wrist-wearable control device comprises one of a wristwatch-based wearable control device or a wristband-based wearable control device, each comprising one or more band portions that are linked to adjacent band portions by connectors in a manner configured to removably wrap the wrist-wearable control device around a wrist of the user.
16. The method of any of claims 13-15, wherein the one or more first sensors each comprises at least one biosensor, the at least one biosensor comprising at least one of one or more photoplethysmography ("PPG") sensors, one or more electromyography ("EMG") sensors, one or more sound transducers, or one or more motion transducers, each configured to detect biological characteristics that correspond to motion of soft tissue in or near a wrist of the user on which the wrist-wearable control device is being worn, wherein the soft tissue comprises at least one of one or more muscles, one or more tendons, or one or more blood vessels in or near the wrist of the user.
17. The method of claim 16, wherein the one or more PPG sensors are configured to optically detect blood volume changes in the one or more blood vessels below skin tissue at or near the wrist of the user corresponding to movement of the one or more fingers of the user, wherein the one or more EMG sensors are configured to detect electrical activity of the one or more muscles at or near the wrist of the user corresponding to movement of the one or more fingers of the user, wherein the one or more sound transducers are configured to detect sounds within a body of the user corresponding to movement of the one or more fingers of the user, wherein the one or more motion transducers are configured to detect motion of the one or more muscles, the one or more tendons, and the one or more blood vessels corresponding to movement of the one or more fingers of the user.
18. The method of any of claims 13-17, wherein the one or more second sensors each comprises at least one of an inertial measurement unit ("IMU") sensor or a gyroscope-based sensor.
19. The method of any of claims 13-18, wherein the wrist-wearable control device further comprises one or more band portions and a plurality of outward-facing lights disposed on the one or more band portions, wherein the plurality of outward-facing lights are arranged in a predetermined pattern on the one or more band portions and serve as tracking points for one or more cameras of the HMD or the other user device to track translation movement of the wrist-wearable control device along three axes relative to the HMD or the other user device, wherein the tracking of translation movement of the wrist-wearable control device along the three axes serves as additional input for controlling the UI of the HMD or the other user device.
20. The method of claim 19, wherein the plurality of outward-facing lights comprises at least one of a plurality of infrared ("IR") light emitting diodes ("LEDs") or a plurality of colored LEDs.
PCT/US2022/027315 2022-05-02 2022-05-02 Wrist-wearable controller for head-mounted device (hmd) or other user device with camera input WO2022204614A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2022/027315 WO2022204614A1 (en) 2022-05-02 2022-05-02 Wrist-wearable controller for head-mounted device (hmd) or other user device with camera input

Publications (1)

Publication Number Publication Date
WO2022204614A1 (en)

Family

ID=83396084

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/027315 WO2022204614A1 (en) 2022-05-02 2022-05-02 Wrist-wearable controller for head-mounted device (hmd) or other user device with camera input

Country Status (1)

Country Link
WO (1) WO2022204614A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200241649A1 (en) * 2014-03-14 2020-07-30 Sony Interactive Entertainment Inc. Gaming Device With Rotatably Placed Cameras
US20180120948A1 (en) * 2014-06-19 2018-05-03 Thalmic Labs Inc. Systems, devices, and methods for gesture identification
US20220011855A1 (en) * 2015-10-30 2022-01-13 Ostendo Technologies, Inc. System and Methods for On Body Gestural Interfaces and Projection Displays
US20190033974A1 (en) * 2017-07-27 2019-01-31 Facebook Technologies, Llc Armband for tracking hand motion using electrical impedance measurement
US20200310539A1 (en) * 2019-03-29 2020-10-01 Facebook Technologies, Llc Methods and apparatus for gesture detection and classification

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22776803

Country of ref document: EP

Kind code of ref document: A1