US20160357265A1 - Motion gesture input detected using optical sensors - Google Patents

Motion gesture input detected using optical sensors

Info

Publication number
US20160357265A1
Authority
US
United States
Prior art keywords
light
wearer
skin
optical sensor
wearable device
Prior art date
Legal status
Abandoned
Application number
US15/117,692
Inventor
Ehsan Maani
Current Assignee
Apple Inc
Original Assignee
Apple Inc.
Priority date
Filing date
Publication date
Application filed by Apple Inc. filed Critical Apple Inc.
Publication of US20160357265A1
Status: Abandoned


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 - Constructional details or arrangements
    • G06F1/1613 - Constructional details or arrangements for portable computers
    • G06F1/163 - Wearable computers, e.g. on a belt
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 - Constructional details or arrangements
    • G06F1/1613 - Constructional details or arrangements for portable computers
    • G06F1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 - Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686 - Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675, the I/O peripheral being an integrated camera
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002 - Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005 - Input arrangements through a video camera
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 - Hand-worn input/output arrangements, e.g. data gloves
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 - Detection arrangements using opto-electronic means

Definitions

  • This relates generally to optical sensors and, more specifically, to using optical sensors in a wearable device to recognize gestures of the wearer.
  • Optical sensors have been incorporated into a variety of user devices to provide enhanced functionality and new opportunities for user interaction.
  • Optical sensors for detecting light, detecting proximity, taking photographs, or the like have been incorporated into mobile phones (e.g., smartphones), tablet computers, wearable devices (e.g., watches, glasses, etc.), and other computing devices, allowing software developers to create engaging software applications (“apps”) for entertainment, productivity, health, and the like.
  • optical sensors work in conjunction with a variety of other input mechanisms for interacting with a device (e.g., touchscreens, buttons, microphones for voice commands, etc.).
  • buttons or other interactive elements can be cumbersome or uncomfortable to use in certain positions or in certain operating conditions. For example, it may be cumbersome to interact with a device using both hands (e.g., holding a device in one hand while engaging interface elements with the other).
  • it may be difficult to press small buttons or engage touchscreen functions while a user's hands are otherwise occupied or unavailable (e.g., when wearing gloves, carrying groceries, holding a child's hand, driving, etc.).
  • device interaction can be limited in a variety of other ways.
  • An exemplary method for determining gestures can include causing light to be emitted from a wearable user device, sensing a portion of the light that is reflected by a wearer's skin, and determining a gesture made by the wearer based on changes in the sensed portion of the light.
  • the changes in the sensed portion of the light correspond to a change in a distance between an optical sensor of the wearable user device and the wearer's skin, and sensing the portion of the light that is reflected by the wearer's skin comprises sensing the portion of the light using the optical sensor.
  • the changes in the sensed portion of the light correspond to a change in an intensity of the sensed portion of the light reflected by the wearer's skin.
  • causing the light to be emitted from the wearable user device can include causing the light to be emitted from a first LED at a first wavelength and a second LED at a second wavelength that is different from the first wavelength.
  • causing the light to be emitted from the wearable user device can include causing the light to be emitted at an angle relative to the wearable device, wherein the emitted light is incident on the wearer's skin at a non-perpendicular angle.
  • sensing the portion of the light that is reflected by the wearer's skin can include generating a signal based on the sensed portion of the light using an optical sensor positioned between a first LED and a second LED, wherein causing the light to be emitted from the wearable user device can include causing the light to be emitted from the first LED and the second LED.
  • determining the gesture based on changes in the sensed portion of the light can include identifying a positive peak, a negative peak, and a zero crossing in a derivative of a signal generated by an optical sensor used for sensing the portion of the light that is reflected by the wearer's skin.
  • the gesture comprises a fist clench.
  • An exemplary non-transitory computer-readable storage medium can include computer-executable instructions for performing any of the exemplary methods discussed above.
  • a database can be coupled to the non-transitory computer-readable storage medium, and the database can include gesture recognition data.
  • An exemplary system can include the non-transitory computer-readable storage medium and a processor capable of executing the computer-executable instructions.
  • An exemplary wearable device for determining gestures can include a light source configured to emit light from the device toward a wearer's skin when the wearable device is worn, an optical sensor configured to generate a signal based on sensing a portion of the light reflected by the wearer's skin, a non-transitory computer-readable storage medium comprising computer-executable instructions for determining a gesture made by the wearer based on changes in the signal, and a processor coupled to receive the signal, wherein the processor is capable of executing the computer-executable instructions.
  • the changes in the signal correspond to a change in a distance between the optical sensor and the wearer's skin.
  • the signal generated by the optical sensor changes based on an intensity of the sensed portion of the light reflected by the wearer's skin.
  • the light source can include a first LED configured to emit light at a first wavelength and a second LED configured to emit light at a second wavelength that is different from the first wavelength.
  • the light source can include a first LED and a second LED, and the optical sensor can be positioned between the first LED and the second LED.
  • the light source is angled relative to the wearable device to direct light to be incident on the wearer's skin at a non-perpendicular angle.
  • the computer-executable instructions for determining the gesture include computer-executable instructions for determining the gesture by identifying a positive peak, a negative peak, and a zero crossing in a derivative of the signal.
  • the gesture comprises a fist clench.
  • the optical sensor is configured to sense a wavelength of light that corresponds to a wavelength of light emitted by the light source.
  • An exemplary system for determining gestures can include a light source configured to emit light from a wearable user device toward a wearer's skin when the wearable user device is worn; an optical sensor configured to generate a signal based on sensing a portion of the light reflected by the wearer's skin; a non-transitory computer-readable storage medium including computer-executable instructions for determining a gesture made by the wearer based on changes in the signal; a processor coupled to receive the signal, wherein the processor is capable of executing the computer-executable instructions; and a communication module coupled to the processor, wherein the communication module is configured to communicate with a mobile device.
  • the non-transitory computer-readable storage medium further includes instructions for communicating a command to the mobile device via the communication module in response to determining the gesture.
  • An exemplary computer-implemented method for indirectly determining a hand gesture using a wearable device worn on a wearer's wrist can include causing light to be emitted toward the wearer's wrist from a light source of the wearable device, wherein the light source is positioned proximate to skin at the wearer's wrist; sensing a portion of the light that is reflected by the skin at the wearer's wrist using an optical sensor of the wearable device, wherein the optical sensor is positioned proximate to the skin at the wearer's wrist; indirectly determining a hand gesture made by the wearer based on a change in the sensed portion of the light that is reflected by the skin at the wearer's wrist, wherein the change results from a distance between the optical sensor and the skin at the wearer's wrist increasing or decreasing due to the hand gesture.
  • the distance between the optical sensor and the skin at the wearer's wrist increases or decreases by less than 1/8th of an inch due to the hand gesture.
  • the change in the sensed portion of the light can correspond to an intensity of the sensed portion of the light increasing or decreasing based on the increasing or decreasing distance between the optical sensor and the skin at the wearer's wrist.
  • indirectly determining the hand gesture made by the wearer includes sensing the change in the sensed portion of the light that is reflected by the skin at the wearer's wrist without directly sensing hand movement of the wearer.
  • FIG. 1 illustrates an exemplary wearable device near a user's skin.
  • FIG. 2 illustrates an exemplary wearable device separated a distance from a user's skin.
  • FIG. 3A illustrates the skin-facing or back side of an exemplary wearable device.
  • FIG. 3B illustrates the face or front side of an exemplary wearable device.
  • FIG. 4 illustrates an open hand of a user wearing an exemplary wearable device.
  • FIG. 5 illustrates a clenched fist of a user wearing an exemplary wearable device.
  • FIG. 6 illustrates an exemplary waveform of a sensed signal from an optical sensor of a wearable device.
  • FIG. 7 illustrates an exemplary derivative waveform of a sensed signal from an optical sensor of a wearable device.
  • FIG. 8 illustrates an exemplary derivative waveform with gesture recognition thresholds.
  • FIG. 9 illustrates an exemplary process for determining gestures using an optical sensor of a wearable device.
  • FIG. 10 illustrates an exemplary process for recognizing gestures using an optical sensor and performing functions corresponding to the recognized gestures.
  • FIG. 11 illustrates an exemplary system for detecting gestures.
  • This relates to a wearable device with an optical sensor that can be used to recognize gestures of a user wearing the device.
  • one or more light sources can be positioned on the back or skin-facing side of a wearable device, such as a watch, wristband, armband, leg-band, chest-band, headband, or the like.
  • An optical sensor can be positioned near the one or more light sources on the same side of the wearable device. During operation, light can be emitted from the one or more light sources and sensed using the optical sensor. Changes in the sensed light caused by movements of the user wearing the device can be used to recognize user gestures.
  • light emitted from a light source can reflect off a wearer's skin, and the reflected light can be sensed using the optical sensor.
  • the wearer gestures in some way, such as by clenching the fist on the arm where the device is worn, the reflected light can change perceptibly due to muscle contraction, the device shifting on the wrist, skin stretching, the distance changing between the optical sensor and the wearer's skin, or the like.
  • Various changing features of the reflected and sensed light can be used to recognize a deliberate gesture, such as changes in angle of incidence, intensity, position, wavelength, or the like.
  • incorporating an optical sensor in a wearable device and using it to recognize wearer gestures can provide a convenient way to interact with the wearable device.
  • Such gesture recognition can supplement or even replace other interfaces, such as touchscreens, buttons, dials, or the like, and it can be used to perform the same or similar functions, such as selecting a display element, navigating to a different view, changing the time, changing a display, answering a phone call, or any of a variety of other functions.
  • such gesture recognition can provide one-handed or hands-free device operation, which can provide convenience as well as safety (e.g., when carrying groceries, driving, or the like). It should be understood that many other applications are possible, and gesture recognition as discussed herein can provide a variety of other benefits and enhance a user's experience in many other ways.
  • FIG. 1 illustrates exemplary wearable device 100 near a wearer's skin surface 116 .
  • wearable device 100 can be configured to be worn on a user's wrist (e.g., as a bracelet, wristwatch, wristband, etc.), and skin surface 116 could be anywhere around the user's wrist depending on how the user prefers to wear the device (e.g., with an associated display near the palm or near the back side of the hand).
  • Wearable device 100 can include, among other things, light source 104 , light source 106 , and optical sensor 102 .
  • Light sources 104 and 106 can each include a light-emitting diode (LED) or another light source.
  • the light sources can be angled such that the bulk of the emitted light reflects back toward optical sensor 102 when the device is worn normally (e.g., angled such that emitted light can be incident on the wearer's skin at a non-perpendicular angle). Angling the light sources can also avoid arcing between the light source and the sensor (e.g., emitted light reaching the sensor without first reflecting off a user's skin).
  • light source 104 can be angled sixty degrees away from vertical toward optical sensor 102 , such that the bulk of emitted light 108 can reflect off skin surface 116 toward optical sensor 102 , as illustrated by reflected light 110 .
  • light source 106 can be angled sixty degrees away from vertical toward optical sensor 102 , such that the bulk of emitted light 112 can reflect off skin surface 116 toward optical sensor 102 , as illustrated by reflected light 114 .
  • light sources 104 and 106 can be angled more sharply vertical (e.g., thirty degrees from vertical toward optical sensor 102 ) or more sharply horizontal.
  • Light sources 104 and 106 can be configured in a variety of different ways for robust recognition of changes in reflected light 110 and reflected light 114 .
  • light sources 104 and 106 can include LEDs that emit infrared light in the 900 nm range.
  • light source 104 can be configured to emit light having a different wavelength than that of the light emitted by light source 106 .
  • the light emitted by light source 104 and the light emitted by light source 106 can be selected such that they can be differentiated by optical sensor 102 , and can be visible, ultraviolet, infrared, or other wavelengths.
  • Light sources 104 and 106 can also be operated differently to allow for distinct reflected light recognition, such as by pulsing emitted light at different frequencies, emitting light at alternating intervals, or the like. In some examples, having two separate light sources can aid in distinguishing between localized movements or whole device movements (e.g., certain muscles contracting versus an entire device shifting uniformly). In still other examples, however, wearable device 100 can include a single light source (e.g., either light source 104 or light source 106 ), more than two light sources, more than one optical sensor, or various other combinations, or can include an optical sensor that can detect changes in ambient lighting reflected by skin surface 116 without a dedicated light source.
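  • As a concrete illustration of operating two light sources at alternating intervals, the sketch below time-multiplexes two LEDs so that each reflected-light sample can be attributed to a single source; set_led and read_optical_sensor are hypothetical driver hooks, not interfaces described in this publication.

```python
import time

def set_led(led_id: int, on: bool) -> None:
    """Hypothetical driver hook that switches one LED on or off."""

def read_optical_sensor() -> float:
    """Hypothetical driver hook returning the photodiode reading."""
    return 0.0

def sample_both_sources(settle_s: float = 0.005) -> dict:
    """Light the LEDs at alternating intervals and take one reading per
    source, so changes in each reflected signal can be tracked separately."""
    readings = {}
    for led_id in (1, 2):
        set_led(led_id, True)    # only one emitter active at a time
        time.sleep(settle_s)     # let the photodiode settle
        readings[led_id] = read_optical_sensor()
        set_led(led_id, False)
    return readings
```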
  • Optical sensor 102 can include a photodiode or other photodetector capable of converting light into a current, voltage, or other signal.
  • optical sensor 102 can convert sensed light into a signal (e.g., current, voltage, etc.) with a magnitude or value that corresponds to the amount of light received by the sensor.
  • optical sensor 102 can be configured to sense particular wavelengths of light (e.g., those generated by light sources 104 and 106 ) while not sensing others.
  • distance 118 between skin surface 116 and wearable device 100 can dictate where reflected light 110 and reflected light 114 are incident on the skin-facing side of wearable device 100. While emitted light and reflected light are illustrated in FIG. 1 with simplified straight lines, the light can spread into a larger beam and reflect at different angles.
  • at distance 118, optical sensor 102 might generate a signal representative of a relatively small amount of sensed light, as much of reflected light 110 and reflected light 114 might strike the body of wearable device 100 rather than the sensor.
  • FIG. 2 illustrates wearable device 100 separated from skin surface 116 by distance 220 , which might be larger than distance 118 of FIG. 1 .
  • Light sources 104 and 106 can generate emitted light 222 and emitted light 226 , which can be reflected off skin surface 116 as reflected light 224 and reflected light 228 , respectively.
  • Distance 220 can allow reflected light 224 and reflected light 228 to strike optical sensor 102 near the center of the sensor (as opposed to striking the edges of the sensor, as in FIG. 1 ).
  • the simplified straight light lines in FIG. 2 are illustrated for simplicity, and the light can spread into a larger beam and reflect at different angles. As such, the more centered light incidence illustrated in FIG. 2 can signify that more of the spread of light can be received by the sensor than if the spread of light was incident elsewhere.
  • with more of the spread of light incident near the center, optical sensor 102 might generate a signal representative of a relatively large amount of sensed light, as the bulk of reflected light 224 and reflected light 228 might strike the sensor itself rather than other components (e.g., being incident on the sensor rather than the body of wearable device 100). Optical sensor 102 can thus generate distinct signals that can be representative of the distance between wearable device 100 and skin surface 116.
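  • A toy geometric model (an editor's illustration, not taken from this publication) makes the distance dependence concrete: for a source angled θ from vertical, a specular reflection returns to the device plane offset roughly 2·d·tan(θ) from the emitter, so a small change in separation d shifts the reflected spot relative to the fixed sensor.

```python
import math

def return_offset_mm(distance_mm: float, angle_from_vertical_deg: float = 60.0) -> float:
    """Horizontal offset between the emitter and the point where a
    specularly reflected ray returns to the device plane."""
    theta = math.radians(angle_from_vertical_deg)
    return 2.0 * distance_mm * math.tan(theta)

# A millimeter-scale change in separation moves the reflected spot by
# several millimeters, changing how much light lands on the sensor area.
for d in (1.0, 2.0, 4.0):
    print(f"separation {d:.0f} mm -> return offset {return_offset_mm(d):.1f} mm")
```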
  • a user's hand gesture can account for the difference between distance 118 in FIG. 1 and distance 220 in FIG. 2 .
  • a user can gesture by clenching a fist, moving a finger, bending a wrist, or the like, which can cause the distance between wearable device 100 and skin surface 116 to change (e.g., change from distance 118 in FIG. 1 to distance 220 in FIG. 2 ).
  • a user's gesture can cause at least a slight shift in a wrist strap associated with a wearable device or a deformation in a user's skin, which can cause a change in distance between the device and the user's skin.
  • contracting muscles can cause the wrist to expand, which can also change the distance between a wearable device and a user's skin. It should be understood that many other factors associated with a gesture can cause a change in the distance between a wearable device and a user's skin.
  • the different signals generated by optical sensor 102 based on different distances can be used to recognize when a user gestures deliberately to interact with a wearable device.
  • factors besides the distance between a wearable device and a skin surface can be used to recognize a gesture.
  • skin stretching, user perspiration, and the like can cause perceptible changes in reflected light that optical sensor 102 can detect.
  • light sources 104 and 106 can generate light that can at least partially penetrate the outer skin layer and reflect in different ways based on muscles contracting, tendons shifting, tissue compressing, skin stretching, density changing, pressure changing, or the like.
  • optical sensor 102 can sense the reflected light and generate signals based on how emitted light is modified before being incident on the sensor. Changes in the signals generated by optical sensor 102 can be used to recognize wearer activity.
  • gesture recognition can be used to determine hand gestures indirectly by sensing small changes that occur at a wearer's wrist as opposed to directly sensing movements of a gesturing hand.
  • a deliberate gesture might include clenching and unclenching a fist, and an increasing or decreasing distance (e.g., changing less than 1/8th of an inch) between the wearer's skin and an optical sensor of a wearable device can be sensed and used to indirectly recognize the gesture, rather than directly detecting the motion of the clenching and unclenching fist (e.g., finger and palm movements).
  • Light can be emitted toward the skin at a wearer's wrist, reflect off the skin at the wearer's wrist, and be sensed by an optical sensor positioned proximate to the skin at the wearer's wrist.
  • the intensity of the sensed light can increase or decrease based on an increasing or decreasing distance between the optical sensor and the skin at the wearer's wrist, which changing distance can result from a deliberate hand gesture as discussed above. Accordingly, by monitoring small changes in light emitted and sensed at the wearer's wrist, a hand gesture can be indirectly recognized without directly sensing finger or palm movements.
  • gesture recognition discussed herein can be applied to a variety of other body parts beyond the wrist and beyond wearable devices that are worn on the wrist, and wrist-worn devices discussed herein are provided as non-limiting examples of gesture recognition using optical sensors.
  • gesture recognition according to the examples herein can be performed using a device on an armband, leg-band, chest-band, headband, or the like.
  • Device and skin movements can be detected, and those detected movements can be used to determine a gesture.
  • a wearable device in a headband can recognize a deliberate gesture from a user raising one or both eyebrows, causing the headband device to shift and/or perceptibly altering the skin on the forehead.
  • a wearable device in a leg-band or ankle-band can recognize a deliberate gesture from a user bending a leg, moving a foot, or the like, causing the leg-band or ankle-band device to shift and/or perceptibly altering the skin on the leg, ankle, or foot.
  • any movements that cause device shifting, distance changes, skin movements, or the like can be detected and used for determining a gesture.
  • the various examples discussed herein can be applied to a variety of devices that can be worn near any part of the skin anywhere on the body, and gestures can be determined from changes in optical sensor signals of those various devices.
  • light sources and optical sensors can be a fraction of the illustrated size, and light sources and optical sensors can be positioned adjacent one another in a much smaller space.
  • distances 118 and 220 may be exaggerated, and in some instances, optical sensor 102 can detect differences in distances that may be imperceptible to the human eye.
  • optical sensor 102 can be sensitive enough to recognize a slight difference in distance (e.g., based on a slight change in wrist circumference during a gesture).
  • wearable device 100 can include a variety of other components not illustrated or described here, such as a display, touchscreen, buttons, dials, central processing unit, memory, camera, indicator lights, oscillator, clock, wireless transceivers, batteries, motion sensors, light sensors, touch sensors, other sensors, or the like. It should also be understood that gesture recognition using an optical sensor, as discussed herein, can be implemented in a variety of different wearable devices with a variety of different functions.
  • FIG. 3A illustrates the skin-facing or back side of exemplary wearable device 100 .
  • wearable device 100 includes a wristwatch or other wearable device, which can be attached to wrist strap 330 for securing the device to a user's wrist.
  • the skin-facing or back side of wearable device 100 can include light sources 104 and 106 separated by optical sensor 102 .
  • Light sources 104 and 106 can include LEDs, which can be positioned behind separate glass, plastic, or other transparent windows, or can be positioned behind the same window.
  • optical sensor 102 can also be positioned behind the same or a different glass, plastic, or other transparent window.
  • Optical sensor 102, and particularly a sensing or detecting area of optical sensor 102, can be sized to avoid interference from arm hair and other skin surface variations.
  • optical sensor 102 can be sized several times larger than a typical arm hair such that the sensor might not be easily blocked from sensing reflected light.
  • light sources 104 and 106 and optical sensor 102 can be positioned within wrist strap 330 , and wrist strap 330 can include wiring to source power from and enable communication with wearable device 100 .
  • light sources 104 and 106 and optical sensor 102 can be oriented differently (e.g., in-line with wrist strap 330 ).
  • FIG. 3B illustrates the face or front side of exemplary wearable device 100 secured to wrist strap 330 .
  • wearable device 100 can include a display 332 on the front side, and a user can interact with information shown on display 332 using gestures. For example, images, data, virtual buttons, a clock, applications, music tracks, or the like can be displayed on display 332 , and a user can perform certain gestures to make selections, advance to a new display, enable the display, return to a home screen, navigate forward, navigate backward, open a notification, scroll through information, scroll through applications, scroll through songs, pause playing music, start playing an audio file, answer a phone call, engage a virtual assistant, or perform any of a variety of other functions.
  • Various gestures can be used to engage these functions, as discussed in more detail below with reference to FIG. 4 and FIG. 5 .
  • display 332 can include touchscreen capability that can be used in addition to or instead of gestures to interact with wearable device 100 .
  • wearable device 100 can include a variety of other components not illustrated in FIG. 3A or FIG. 3B or described here, such as buttons, dials, central processing unit, memory, camera, indicator lights, oscillator, clock, wireless transceivers, batteries, motion sensors, light sensors, touch sensors, other sensors, or the like.
  • FIG. 4 illustrates an open hand of a user wearing exemplary wearable device 100 .
  • a user can interact with wearable device 100 using a variety of gestures that can cause perceptible changes in light directed to a user's skin, reflected off the user's skin, and sensed by an optical sensor.
  • a wide variety of movements can cause changes to reflected light that can alter the signal generated by an optical sensor, and such an altered signal can be interpreted as a deliberate gesture.
  • Movements that can be interpreted as a deliberate gesture can include finger movements, hand movements, wrist movements, whole arm movements, or the like.
  • the open hand in FIG. 4 can be clenched into a fist as shown in FIG. 5 and opened or unclenched back to the position shown in FIG. 4 as a deliberate gesture for interacting with wearable device 100.
  • the wearer's wrist in FIG. 5 can be at least slightly altered as compared to the wearer's wrist in FIG. 4 (e.g., circumference expanded, skin stretched, muscles contracted, tendons shifted, tissue compressed, etc.).
  • the at least slight alteration of the wrist from having an open hand as in FIG. 4 to having a clenched fist as in FIG. 5 can cause the distance between wearable device 100 and the wearer's skin to change (e.g., increase or decrease).
  • an optical sensor associated with wearable device 100 can generate a signal that changes when the distance between the device and the wearer's skin changes, such that the motion of clenching the user's fist can be perceived along with the subsequent motion of unclenching the user's fist.
  • the sequential clenching and unclenching of the user's fist can be recognized as a deliberate gesture (e.g., like the equivalent of a mouse cursor click or touchscreen press to select a virtual button, answer a phone call, navigate forward, etc.).
  • other motions can be used as gestures for interacting with wearable device 100 .
  • the wearer can move the thumb inward to touch the palm and subsequently release the thumb back to its natural position (e.g., the position shown in FIG. 4 ).
  • the wearer can contract the fingers (e.g., digits other than the thumb) to touch the palm and subsequently release them back to their natural position.
  • the wearer can contract an individual finger (e.g., the index finger, middle finger, etc.) to touch the palm and subsequently release the finger to its natural position.
  • the wearer can bend the hand downward at the wrist (e.g., flexion), bend the hand upward at the wrist (e.g., extension), twist the hand (e.g., pronation or supination), angle the hand to the left or right (e.g., radial deviation or ulnar deviation), or perform any of a variety of other deliberate motions that can cause at least slight alterations at the wrist that can be perceived by an optical sensor sensing changes in incident light.
  • wearable device 100 can be configured to uniquely recognize different gestures for performing different functions as well as recognize different ways of performing the same gestures as indicating different user intent. For example, a wearer can clench the fist, hold the clenched fist for a certain amount of time, and subsequently open the hand to request a function different than one associated with clenching and unclenching the fist in rapid succession (e.g., as in the difference between a simple mouse cursor click and a double-click). In another example, a wearer can twist the hand in one direction to request one function (e.g., navigate forward) and twist the hand in the other direction to request a different function (e.g., navigate backward).
  • a wearer can touch the thumb to the palm to request one function (e.g., select) and touch the fingers to the palm to request a different function (e.g., return to a home screen).
  • any of a group of different motions can each be perceived as a user request for the same function (e.g., any of fist clenching, wrist bending, or thumb touching can be interpreted as a request for the same function).
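  • One plausible way to wire distinct gestures (and distinct performances of the same gesture) to different functions is a simple dispatch table; the gesture labels and actions below are illustrative placeholders, not mappings specified in this publication.

```python
from typing import Callable, Dict

# Illustrative placeholder actions; a real device would call into its UI layer.
def select_item() -> None: print("select")
def return_home() -> None: print("home screen")
def navigate_forward() -> None: print("forward")
def navigate_backward() -> None: print("backward")

GESTURE_ACTIONS: Dict[str, Callable[[], None]] = {
    "fist_clench_release": select_item,   # quick clench/unclench, like a click
    "fist_clench_hold": return_home,      # held clench signals different intent
    "twist_right": navigate_forward,
    "twist_left": navigate_backward,
}

def dispatch(gesture: str) -> None:
    """Invoke the function mapped to a recognized gesture, if any."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is not None:
        action()
```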
  • wearable device 100 can be configured in some examples to perform learning or training to adapt to a particular wearer.
  • an application associated with wearable device 100 can instruct the wearer to perform certain movements (e.g., instruct the wearer to clench and unclench the fist by displaying the command on the device display, playing the command audibly, or the like).
  • wearable device 100 can monitor how signals generated by the optical sensor change in response to the requested movements. Wearable device 100 can then associate changes in the optical sensor signals with particular movements in order to recognize a deliberate gesture of the specific wearer in the future.
  • wearable device 100 can be trained differently for wearing on different hands or for wearing in different positions on the wrist (e.g., palm side or back side).
  • particular changes in the optical sensor signal and associated gestures can be stored in a database (e.g., in memory on the device, remote from the device, on another device, etc.), and sensed changes in the optical sensor signal can be compared to the stored data to recognize a particular gesture.
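  • A minimal sketch of that store-and-compare approach, assuming an in-memory dictionary stands in for the database and cosine similarity serves as the comparison; neither detail is specified in this publication.

```python
import numpy as np

gesture_db = {}  # stand-in for the gesture recognition database

def _normalize(window: np.ndarray) -> np.ndarray:
    """Remove offset and scale so templates from different sessions compare."""
    centered = window - window.mean()
    norm = np.linalg.norm(centered)
    return centered / norm if norm else centered

def train(gesture: str, window: np.ndarray) -> None:
    """Associate a recorded optical-sensor signal change with a gesture."""
    gesture_db[gesture] = _normalize(window)

def recognize(window: np.ndarray, threshold: float = 0.8):
    """Return the stored gesture whose template best matches, if any."""
    probe = _normalize(window)
    best_name, best_score = None, threshold
    for name, template in gesture_db.items():
        if template.shape == probe.shape:
            score = float(np.dot(template, probe))  # cosine similarity
            if score > best_score:
                best_name, best_score = name, score
    return best_name
```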
  • FIG. 6 illustrates an exemplary waveform of a sensed signal from an optical sensor of a wearable device (e.g., such as from optical sensor 102 of wearable device 100 discussed above).
  • the vertical axis can correspond to a magnitude of a signal generated by an optical sensor over time (e.g., in seconds on the horizontal axis).
  • a signal can correspond to a current, voltage, or similar signal representative of sensed light.
  • the signal can be a current with units of milliamps, which can be quantized along a scale to enable comparison, calculation, and gesture recognition (e.g., can be digitized with 24 bits of precision).
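  • For instance, a sensed current could be mapped onto a 24-bit scale as sketched below; the 5 mA full-scale value is an arbitrary assumption for illustration.

```python
FULL_SCALE_MA = 5.0   # assumed full-scale photocurrent, for illustration only
LEVELS = 2 ** 24      # 24 bits of precision

def quantize_current(current_ma: float) -> int:
    """Map a photocurrent in milliamps onto a 24-bit digital scale."""
    clipped = min(max(current_ma, 0.0), FULL_SCALE_MA)
    return round(clipped / FULL_SCALE_MA * (LEVELS - 1))

print(quantize_current(1.25))  # a quarter of full scale -> 4194304
```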
  • the waveform in FIG. 6 can correspond to a user wearing a wearable device with an optical sensor as discussed herein performing six repeated gestures, each gesture being separated by a gap of a few seconds.
  • each spike or gesture event 640 can correspond to the clenching and unclenching of the wearer's fist in rapid succession.
  • fist clench 642 can correspond to an increasing magnitude, and the subsequent fist release 644 can correspond to a decreasing magnitude.
  • the combination of an increasing magnitude and a decreasing magnitude in rapid succession can be recognized as a deliberate gesture.
  • the wearable device may have shifted relative to the corresponding wearer such that more light was sensed when the fist was clenched than when the fist was unclenched.
  • the waveform can be seen rising in magnitude with fist clench 642 and subsequently falling in magnitude with fist release 644.
  • the signal and the associated waveform can vary for different wearers or for the same wearer performing a different gesture.
  • the magnitude may first fall and then rise in succession for a different user or for a different gesture (e.g., bending the wrist).
  • the peak magnitude of the spike associated with gesture event 640 can be lower or higher for different users or different gestures.
  • the width or the slope of a spike associated with a gesture event can vary for different users or different gestures (e.g., wider when holding the fist clenched to indicate a particular user request). It should be understood that various different gestures and wearer characteristics can produce a variety of different waveforms that can be used to recognize gesture events (e.g., deliberate user gestures indicating particular user intent). Moreover, it should be understood that in some examples a wearable device can learn or be trained to associate unique signals with a particular wearer's gestures.
  • FIG. 7 illustrates an exemplary derivative waveform of a sensed signal from an optical sensor of a wearable device.
  • the derivative waveform of FIG. 7 can correspond to the raw signal waveform of FIG. 6 .
  • positive peaks 750, negative peaks 752, and zero crossings 754 can be used to positively identify a deliberate user gesture for interacting with a wearable device.
  • having an occurrence of positive peak 750 , followed by zero crossing 754 , followed by negative peak 752 can be sufficient to identify recognized gesture event 756 (marked by rectangles).
  • order can vary such that a negative peak can be followed by a zero crossing and a positive peak and also be identified as a recognized gesture event.
  • having an occurrence of a positive peak, a negative peak, and a zero crossing within a certain time window can be sufficient to recognize a gesture.
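  • The peak-and-zero-crossing pattern described for FIG. 7 can be sketched as follows: differentiate the raw signal, find its local extrema, and flag an event when a positive peak is followed within a time window by a negative peak (with a zero crossing necessarily between them). The window length and the use of numpy are assumptions of this sketch, not parameters from this publication.

```python
import numpy as np

def detect_gesture_events(signal: np.ndarray, fs: float,
                          max_window_s: float = 1.0) -> list:
    """Flag indices where the derivative shows a positive peak followed,
    within a time window, by a zero crossing and then a negative peak."""
    d = np.gradient(signal) * fs                 # derivative of the raw signal
    window = int(max_window_s * fs)
    pos_peaks = [i for i in range(1, len(d) - 1)
                 if d[i - 1] < d[i] > d[i + 1] and d[i] > 0]
    neg_peaks = [i for i in range(1, len(d) - 1)
                 if d[i - 1] > d[i] < d[i + 1] and d[i] < 0]
    events = []
    for p in pos_peaks:
        followers = [n for n in neg_peaks if p < n <= p + window]
        # a zero crossing must separate the positive and negative peaks;
        # a negative value in the span confirms it
        if followers and np.any(d[p:followers[0] + 1] < 0):
            events.append(p)
    return events
```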
  • gesture recognition thresholds can be used to filter out noise or other haphazard data and correctly recognize deliberate user gestures for interacting with a wearable device.
  • FIG. 8 illustrates an exemplary derivative waveform with gesture recognition thresholds, including positive threshold 860 and negative threshold 862 .
  • a positive or negative peak in a derivative waveform can be recognized when outside predetermined thresholds, such as being outside positive threshold 860 and negative threshold 862.
  • peaks or other data movements within the thresholds can be ignored. For example, minor vibrations or casual hand motions can lead to significant movements in a raw optical sensor signal as well as in the derivative of the raw optical sensor signal.
  • gesture recognition thresholds can be used to restrict positive gesture recognition to instances of significant movement outside the thresholds.
  • a variety of other techniques and data can be used to positively recognize deliberate user gestures while filtering out noise and other haphazard data. For example, concavity changes, zero crossings, time between peaks, peak slope, peak width, and other data characteristics can be used to limit gesture recognition to particular expected signal behavior. In this manner, a wearable device can accurately recognize and respond to user gestures when appropriate, and avoid recognizing user interaction when none was intended.
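  • A minimal sketch of the threshold gating described for FIG. 8, with the threshold values assumed for illustration: derivative extrema inside the band are ignored, so minor vibrations and casual motions do not register as gestures.

```python
import numpy as np

POS_THRESHOLD = 0.5    # assumed stand-ins for thresholds 860 and 862,
NEG_THRESHOLD = -0.5   # in whatever units the derivative signal uses

def significant_extrema(derivative: np.ndarray) -> list:
    """Keep only derivative peaks outside the thresholds; anything inside
    the band is treated as noise or haphazard movement."""
    kept = []
    for i in range(1, len(derivative) - 1):
        v = derivative[i]
        is_peak = derivative[i - 1] < v > derivative[i + 1] and v > POS_THRESHOLD
        is_trough = derivative[i - 1] > v < derivative[i + 1] and v < NEG_THRESHOLD
        if is_peak or is_trough:
            kept.append((i, float(v)))
    return kept
```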
  • FIG. 9 illustrates exemplary process 970 , which can be used according to various examples discussed herein for determining gestures using an optical sensor in a wearable device.
  • light can be caused to be emitted from a wearable user device.
  • a light source can include a light-emitting diode (LED) that can be angled such that the bulk of the emitted light reflects back toward an optical sensor (e.g., optical sensor 102 ) when the device is worn normally (e.g., angled such that emitted light can be incident on the wearer's skin at a non-perpendicular angle).
  • each light source can be configured to emit light having a different wavelength than that of the other light source.
  • the light emitted by individual light sources can be selected such that they can be differentiated by an optical sensor, and can be visible, ultraviolet, infrared, or other wavelengths.
  • Light sources can also be operated differently to allow for distinct reflected light recognition, such as by pulsing emitted light at different frequencies, emitting light at alternating intervals, or the like.
  • having two separate light sources can aid in distinguishing between localized movements or whole device movements (e.g., certain muscles contracting versus an entire device shifting uniformly).
  • a wearable device can include a single light source, more than two light sources, more than one optical sensor, or various other combinations, or can include an optical sensor that can detect changes in ambient lighting reflected by a skin surface without a dedicated light source.
  • light can be sensed that is reflected by a user's skin.
  • reflected light can be sensed as discussed above with reference to optical sensor 102 .
  • Light can be sensed in a variety of ways, such as being sensed with a photodiode that generates a current, voltage, or other signal representing the relative amount of light incident on the sensor (e.g., the intensity of received light).
  • an optical sensor can be configured to sense primarily or exclusively light with similar characteristics (e.g., wavelength) as light emitted from light sources associated with a wearable device.
  • the distance between a skin surface of a user and a wearable device can dictate where reflected light is incident on the skin-facing side of the wearable device, which can affect the intensity of light received at the sensor.
  • an optical sensor in a first position relative to a wearer's skin, an optical sensor might generate a signal representative of a relatively small amount of sensed light, while in another position relative to a wearer's skin, the optical sensor might generate a signal representative of a relatively large amount of sensed light given a shift in where the light is incident relative to the sensor. Movements of the wearable device and the optical sensor relative to a wearer's skin can thus cause the signal generated by the optical sensor to change as the intensity of light received at the sensor changes.
  • a gesture can be determined based on changes in sensed light.
  • a gesture event can be determined from a raw optical sensor signal (e.g., as discussed with reference to FIG. 6 ), or from the derivative of an optical sensor signal (e.g., as discussed with reference to FIG. 7 or FIG. 8 ).
  • a wide variety of movements can cause changes to reflected light that can alter the signal generated by an optical sensor, and such an altered signal can be interpreted as a deliberate gesture. Movements that can be interpreted as a deliberate gesture can include finger movements, hand movements, wrist movements, whole arm movements, or the like.
  • the wearer's wrist can be at least slightly altered between a clenched fist and an open hand (e.g., circumference expanded, skin stretched, muscles contracted, tendons shifted, tissue compressed, etc.).
  • the at least slight alteration of the wrist from having an open hand to having a clenched fist can cause the distance between the wearable device and the wearer's skin to change (e.g., increase or decrease).
  • an optical sensor associated with the wearable device can generate a signal that changes when the distance between the device and the wearer's skin changes, such that the motion of clenching the user's fist can be perceived along with the subsequent motion of unclenching the user's fist.
  • the sequential clenching and unclenching of the user's fist can be determined to be a deliberate gesture (e.g., like the equivalent of a mouse cursor click or touchscreen press to select a virtual button, answer a phone call, navigate forward, etc.).
  • a derivative of the optical sensor signal can be used for gesture recognition. For example, having an occurrence of a positive peak, followed by a zero crossing, followed by a negative peak in the derivative of the optical sensor signal can be sufficient to determine a deliberate gesture. In another example, order can vary such that a negative peak can be followed by a zero crossing and a positive peak and also be identified as a deliberate gesture. In yet another example, having an occurrence of a positive peak, a negative peak, and a zero crossing within a certain time window can be sufficient to determine a deliberate gesture in a derivative of an optical sensor signal.
  • a variety of other techniques and data can be used to positively determine deliberate user gestures while filtering out noise and other haphazard data. For example, concavity changes, zero crossings, time between peaks, peak slope, peak width, and other data characteristics can be used to limit gesture recognition to particular expected signal behavior. In this manner, a wearable device can accurately determine and respond to user gestures when appropriate, and avoid recognizing user interaction when none was intended.
  • a determined gesture can be associated with a particular user request, and a wearable device can respond appropriately to the user request (e.g., a request to select a virtual button, navigate to a different view, answer a phone call, advance to a different song, or the like).
  • Gesture recognition can thus be used in some examples to enable a user to interact with a wearable device without pushing buttons, touching a touchscreen, or the like.
  • FIG. 10 illustrates exemplary process 1080 , which can be used according to various examples discussed herein for recognizing gestures using an optical sensor and performing functions corresponding to the recognized gestures.
  • gesture recognition can be enabled upon user engagement with a wearable device.
  • light sources and an optical sensor integrated into a wearable device for gesture recognition can be disabled, kept off, or used for functions other than gesture recognition when deliberate gestures may be unlikely (e.g., when a wearer is not engaging the device, when no one is wearing the device, when a wearer is running, etc.).
  • Disabling gesture recognition using the light sources and optical sensor can conserve battery power, avoid false gesture detection, free them to be used for other functions, or the like.
  • gesture recognition can be automatically enabled (e.g., the light sources and optical sensor can be enabled for recognizing gestures).
  • a variety of methods can be used to recognize that a user is engaged with a wearable device. For example, other sensors in the wearable device can be used to detect that a user has raised the wearable device, angled it toward the user's face, and is looking at the display (e.g., using an accelerometer, gyroscope, camera, proximity sensor, light sensor, or the like).
  • a user can press a button, touch a touchscreen, say a command, shake the device, or the like to commence engagement with the wearable device.
  • sensors in the wearable device can be used to detect that a user is both not moving and actively looking at the display before enabling active gesture recognition, as movement could lead to artifacts and false gesture detection.
  • light sources and an optical sensor can be enabled at times when a user is not engaging the device, but signals generated by the optical sensor can be ignored or discarded until user engagement is detected.
  • light can be caused to be emitted from a wearable user device (e.g., as described with reference to block 972 of process 970 ).
  • light can be sensed that is reflected by a user's skin (e.g., as described with reference to block 974 of process 970 ).
  • a determination can be made as to whether changes in sensed light indicate a deliberate user gesture (e.g., a gesture intended to cause a functional response by the wearable device).
  • a deliberate gesture can be recognized from a raw optical sensor signal (e.g., as discussed with reference to FIG. 6 ), or from the derivative of an optical sensor signal (e.g., as discussed with reference to FIG. 7 or FIG. 8 ).
  • polling cycles can be repeated of causing light to be emitted and sensing reflected light to determine whether a user has performed a deliberate gesture. Consecutive polling cycles can be separated by a time interval, or they can be continuous until a gesture is detected or another event breaks the cycle (e.g., device powers off, user engagement terminates, a timeout interval passes, etc.).
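  • The polling described in blocks 1084 through 1088 could be structured as the loop below; emit_light, sense_reflected_light, user_engaged, and is_deliberate_gesture are hypothetical stand-ins for the device facilities discussed above.

```python
import time

def emit_light() -> None: ...                      # hypothetical: drive the LEDs
def sense_reflected_light() -> float: return 0.0   # hypothetical sensor read
def user_engaged() -> bool: return True            # hypothetical engagement check
def is_deliberate_gesture(samples: list) -> bool: return False

def poll_for_gesture(rate_hz: float = 100.0, timeout_s: float = 10.0) -> bool:
    """Repeat emit/sense cycles until a gesture is detected, engagement
    ends, or a timeout interval passes."""
    samples = []
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline and user_engaged():
        emit_light()
        samples.append(sense_reflected_light())
        if is_deliberate_gesture(samples):
            return True            # caller then performs the mapped function
        time.sleep(1.0 / rate_hz)  # consecutive cycles separated by an interval
    return False
```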
  • a function corresponding to the recognized gesture can be performed at block 1090 .
  • a particular recognized gesture can indicate a user's desired intent to answer an incoming call, so the function of answering the incoming call can be performed.
  • recognized gestures in other contexts can correspond to a variety of other functions that can be performed upon recognizing a user's deliberate gesture (e.g., navigate forward, navigate backward, pause music, open a notification, etc.).
  • gesture recognition can continue (e.g., at block 1084 ) to allow a user to continue interacting with a wearable device using gestures.
  • gesture recognition can be disabled upon performance of a function at block 1090 , or when the wearable device is powered off, a user ceases to engage the device, a timeout interval passes, or the like.
  • in addition to interacting with a wearable device itself, a user can effectively generate commands using gestures that a wearable device can communicate to any of a variety of other devices, such as a mobile phone, television, audio system, media player, game console, lighting system, security system, tablet computer, or the like.
  • a wearable device can be in communication with a media player (e.g., via Wi-Fi, Bluetooth, the Internet, etc.), and the wearer can perform a recognizable gesture.
  • the wearable device can transmit a corresponding command to the media player, such as navigate through a menu, pause playback, select content for display, or the like.
  • a wearable device can be in communication with a mobile telephone (e.g., via Bluetooth), and the wearer can perform a recognizable gesture.
  • the wearable device can transmit a corresponding command to the mobile telephone, such as answer a phone call, silence a ringtone, emit a sound to help locate the phone, or the like. It should be understood that gesture recognition can be employed for still many other device-to-device interactions.
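  • A sketch of forwarding a recognized gesture as a command to a paired phone; the CommunicationModule class and the gesture-to-command table are illustrative assumptions, not interfaces defined in this publication.

```python
GESTURE_TO_PHONE_COMMAND = {
    "fist_clench_release": "answer_call",   # illustrative mappings only
    "fist_clench_hold": "silence_ringtone",
    "wrist_twist": "locate_phone",          # e.g., have the phone emit a sound
}

class CommunicationModule:
    """Hypothetical wrapper over a Bluetooth (or Wi-Fi) link to a phone."""
    def send(self, command: str) -> None:
        print(f"-> phone: {command}")        # stand-in for a radio transmit

def on_gesture(gesture: str, radio: CommunicationModule) -> None:
    """Translate a recognized gesture into a command for the paired device."""
    command = GESTURE_TO_PHONE_COMMAND.get(gesture)
    if command is not None:
        radio.send(command)
```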
  • while gesture recognition can be achieved using a light source and an optical sensor alone, various other sensors and elements can be used in conjunction with a light source and an optical sensor to recognize deliberate user gestures for interacting with a wearable device.
  • accelerometers and/or gyroscopes can be used to detect movement and determine whether optical sensor signals represent deliberate gestures instead of artifacts of other random user movements.
  • certain gestures can yield unique optical sensor signals concurrently with unique accelerometer and/or gyroscope signals, and the simultaneous occurrence of a combination of such signals can be used to positively recognize a deliberate gesture.
  • other device elements, sensors, and signal combinations can be used to avoid false gesture detection and correctly recognize deliberate user gestures.
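  • One way to combine such signals, sketched under the assumption of a fixed agreement window: accept an optically detected gesture only when a motion-sensor signature occurs at nearly the same time.

```python
def confirm_gesture(optical_event_t: float,
                    motion_event_times: list,
                    agreement_window_s: float = 0.25) -> bool:
    """Accept an optically detected gesture only if an accelerometer or
    gyroscope signature occurred at (nearly) the same time, reducing
    false detections caused by random movement."""
    return any(abs(optical_event_t - t) <= agreement_window_s
               for t in motion_event_times)
```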
  • System 1100 can include instructions stored in a non-transitory computer readable storage medium, such as memory 1102 or storage device 1101 , and executed by processor 1105 .
  • the instructions can also be stored and/or transported within any non-transitory computer readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
  • a “non-transitory computer readable storage medium” can be any medium that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the non-transitory computer readable storage medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, a portable computer diskette (magnetic), a random access memory (RAM) (magnetic), a read-only memory (ROM) (magnetic), an erasable programmable read-only memory (EPROM) (magnetic), a portable optical disc such as CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW, flash memory such as compact flash cards, secured digital cards, USB memory devices, memory sticks, and the like, or any type of database.
  • the instructions can also be propagated within any transport medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
  • a “transport medium” can be any medium that can communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the transport medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, or infrared wired or wireless propagation medium.
  • System 1100 can also include an input/output (“I/O”) module, such as I/O module 1103 , which can enable processor 1105 to communicate with other components of system 1100 as well as peripherals, devices, servers, databases, and the like.
  • I/O module 1103 can include a transceiver, radio, modem, or the like that can enable processor 1105 to communicate with an external device through wired or wireless communication means, including LAN, WAN, Wi-Fi, Bluetooth, cellular, or the like.
  • System 1100 can further include touch sensitive display 1107 coupled to processor 1105 for detecting touch and displaying information. It is to be understood that the system is not limited to the components and configuration of FIG. 11 , but can include other or additional components in multiple configurations according to various examples. Additionally, the components of system 1100 can be included within a single device, or can be distributed among multiple devices. In some examples, processor 1105 can be located within touch sensitive display 1107 .
  • some examples of the disclosure are directed to a computer-implemented method for determining gestures, the method comprising: causing light to be emitted from a wearable user device; sensing a portion of the light that is reflected by a wearer's skin; and determining a gesture made by the wearer based on changes in the sensed portion of the light.
  • the changes in the sensed portion of the light correspond to a change in a distance between an optical sensor of the wearable user device and the wearer's skin; and sensing the portion of the light that is reflected by the wearer's skin comprises sensing the portion of the light using the optical sensor.
  • causing the light to be emitted from the wearable user device comprises: causing the light to be emitted from a first LED at a first wavelength and a second LED at a second wavelength that is different from the first wavelength.
  • causing the light to be emitted from the wearable user device comprises: causing the light to be emitted at an angle relative to the wearable device, wherein the emitted light is incident on the wearer's skin at a non-perpendicular angle.
  • sensing the portion of the light that is reflected by the wearer's skin comprises: generating a signal based on the sensed portion of the light using an optical sensor positioned between a first LED and a second LED; and causing the light to be emitted from the wearable user device comprises causing the light to be emitted from the first LED and the second LED.
  • determining the gesture based on changes in the sensed portion of the light comprises: identifying a positive peak, a negative peak, and a zero crossing in a derivative of a signal generated by an optical sensor used for sensing the portion of the light that is reflected by the wearer's skin. Additionally or alternatively to one or more of the examples disclosed above, in some examples the gesture comprises a fist clench.
  • non-transitory computer-readable storage medium comprising computer-executable instructions for performing any of the methods described above; and a database can be coupled to the non-transitory computer-readable storage medium, wherein the database comprises gesture recognition data.
  • a wearable device for determining gestures, the device comprising: a light source configured to emit light from the device toward a wearer's skin when the wearable device is worn; an optical sensor configured to generate a signal based on sensing a portion of the light reflected by the wearer's skin; a non-transitory computer-readable storage medium comprising computer-executable instructions for determining a gesture made by the wearer based on changes in the signal; and a processor coupled to receive the signal, wherein the processor is capable of executing the computer-executable instructions.
  • the changes in the signal correspond to a change in a distance between the optical sensor and the wearer's skin. Additionally or alternatively to one or more of the examples disclosed above, in some examples the signal generated by the optical sensor changes based on an intensity of the sensed portion of the light reflected by the wearer's skin. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light source comprises a first LED configured to emit light at a first wavelength and a second LED configured to emit light at a second wavelength that is different from the first wavelength.
  • the light source comprises a first LED and a second LED; and the optical sensor is positioned between the first LED and the second LED. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light source is angled relative to the wearable device to direct light to be incident on the wearer's skin at a non-perpendicular angle. Additionally or alternatively to one or more of the examples disclosed above, in some examples the computer-executable instructions for determining the gesture comprise computer-executable instructions for determining the gesture by identifying a positive peak, a negative peak, and a zero crossing in a derivative of the signal.
  • the gesture comprises a fist clench.
  • the optical sensor is configured to sense a wavelength of light that corresponds to a wavelength of light emitted by the light source.
  • a system for determining gestures comprising: a light source configured to emit light from a wearable user device toward a wearer's skin when the wearable user device is worn; an optical sensor configured to generate a signal based on sensing a portion of the light reflected by the wearer's skin; a non-transitory computer-readable storage medium comprising computer-executable instructions for determining a gesture made by the wearer based on changes in the signal; a processor coupled to receive the signal, wherein the processor is capable of executing the computer-executable instructions; and a communication module coupled to the processor, wherein the communication module is configured to communicate with a mobile device.
  • the non-transitory computer-readable storage medium further comprises instructions for communicating a command to the mobile device via the communication module in response to determining the gesture.
  • other examples of the disclosure are directed to a computer-implemented method for indirectly determining a hand gesture using a wearable device worn on a wearer's wrist, the method comprising: causing light to be emitted toward the wearer's wrist from a light source of the wearable device, wherein the light source is positioned proximate to skin at the wearer's wrist; sensing a portion of the light that is reflected by the skin at the wearer's wrist using an optical sensor of the wearable device, wherein the optical sensor is positioned proximate to the skin at the wearer's wrist; indirectly determining a hand gesture made by the wearer based on a change in the sensed portion of the light that is reflected by the skin at the wearer's wrist, wherein the change results from a distance between the optical sensor and the skin at the wearer's wrist increasing or decreasing due to the hand gesture.
  • the distance between the optical sensor and the skin at the wearer's wrist increases or decreases less than 1/8th of an inch due to the hand gesture.
  • the change in the sensed portion of the light corresponds to an intensity of the sensed portion of the light increasing or decreasing based on the increasing or decreasing distance between the optical sensor and the skin at the wearer's wrist.
  • indirectly determining the hand gesture made by the wearer comprises: sensing the change in the sensed portion of the light that is reflected by the skin at the wearer's wrist without directly sensing hand movement of the wearer.

Abstract

A wearable device with an optical sensor is disclosed that can be used to recognize gestures of a user wearing the device. Light sources can be positioned on the back or skin-facing side of a wearable device, and an optical sensor can be positioned near the light sources. During operation, light can be emitted from the light sources and sensed using the optical sensor. Changes in the sensed light can be used to recognize user gestures. For example, light emitted from a light source can reflect off a wearer's skin, and the reflected light can be sensed using the optical sensor. When the wearer gestures in a particular way, the reflected light can change perceptibly due to muscle contraction, device shifting, skin stretching, or the distance changing between the optical sensor and the wearer's skin. Recognized gestures can be interpreted as commands for interacting with the wearable device.

Description

    FIELD
  • This relates generally to optical sensors and, more specifically, to using optical sensors in a wearable device to recognize gestures of the wearer.
  • BACKGROUND
  • Optical sensors have been incorporated into a variety of user devices to provide enhanced functionality and new opportunities for user interaction. Optical sensors for detecting light, detecting proximity, taking photographs, or the like have been incorporated into mobile phones (e.g., smartphones), tablet computers, wearable devices (e.g., watches, glasses, etc.), and other computing devices, allowing software developers to create engaging software applications (“apps”) for entertainment, productivity, health, and the like. In some instances, optical sensors work in conjunction with a variety of other input mechanisms for interacting with a device (e.g., touchscreens, buttons, microphones for voice commands, etc.).
  • Many devices, however, can have limited device interaction and control capabilities due to device size constraints, display size constraints, operational constraints, and the like. For example, small or thin user devices can have a limited number of physical buttons for receiving user input. Similarly, small user devices can have touchscreens with limited space for providing virtual buttons or other virtual user interface elements. In addition, some devices can have buttons or other interactive elements that are cumbersome or uncomfortable to use in certain positions or in certain operating conditions. For example, it may be cumbersome to interact with a device using both hands (e.g., holding a device in one hand while engaging interface elements with the other). In another example, it may be difficult to press small buttons or engage touchscreen functions while a user's hands are otherwise occupied or unavailable (e.g., when wearing gloves, carrying groceries, holding a child's hand, driving, etc.). In still other examples, device interaction can be limited in a variety of other ways.
  • SUMMARY
  • Wearable user devices and methods for determining gestures are disclosed. An exemplary method for determining gestures can include causing light to be emitted from a wearable user device, sensing a portion of the light that is reflected by a wearer's skin, and determining a gesture made by the wearer based on changes in the sensed portion of the light. In one example, the changes in the sensed portion of the light correspond to a change in a distance between an optical sensor of the wearable user device and the wearer's skin, and sensing the portion of the light that is reflected by the wearer's skin comprises sensing the portion of the light using the optical sensor. In another example, the changes in the sensed portion of the light correspond to a change in an intensity of the sensed portion of the light reflected by the wearer's skin. In still another example, causing the light to be emitted from the wearable user device can include causing the light to be emitted from a first LED at a first wavelength and a second LED at a second wavelength that is different from the first wavelength. In yet another example, causing the light to be emitted from the wearable user device can include causing the light to be emitted at an angle relative to the wearable device, wherein the emitted light is incident on the wearer's skin at a non-perpendicular angle. In another example, sensing the portion of the light that is reflected by the wearer's skin can include generating a signal based on the sensed portion of the light using an optical sensor positioned between a first LED and a second LED, wherein causing the light to be emitted from the wearable user device can include causing the light to be emitted from the first LED and the second LED. In another example, determining the gesture based on changes in the sensed portion of the light can include identifying a positive peak, a negative peak, and a zero crossing in a derivative of a signal generated by an optical sensor used for sensing the portion of the light that is reflected by the wearer's skin. In still another example, the gesture comprises a fist clench.
  • An exemplary non-transitory computer-readable storage medium can include computer-executable instructions for performing any of the exemplary methods discussed above. A database can be coupled to the non-transitory computer-readable storage medium, and the database can include gesture recognition data. An exemplary system can include the non-transitory computer-readable storage medium and a processor capable of executing the computer-executable instructions.
  • An exemplary wearable device for determining gestures can include a light source configured to emit light from the device toward a wearer's skin when the wearable device is worn, an optical sensor configured to generate a signal based on sensing a portion of the light reflected by the wearer's skin, a non-transitory computer-readable storage medium comprising computer-executable instructions for determining a gesture made by the wearer based on changes in the signal, and a processor coupled to receive the signal, wherein the processor is capable of executing the computer-executable instructions. In one example, the changes in the signal correspond to a change in a distance between the optical sensor and the wearer's skin. In another example, the signal generated by the optical sensor changes based on an intensity of the sensed portion of the light reflected by the wearer's skin. In still another example, the light source can include a first LED configured to emit light at a first wavelength and a second LED configured to emit light at a second wavelength that is different from the first wavelength. In yet another example, the light source can include a first LED and a second LED, and the optical sensor can be positioned between the first LED and the second LED. In another example, the light source is angled relative to the wearable device to direct light to be incident on the wearer's skin at a non-perpendicular angle. In another example, the computer-executable instructions for determining the gesture include computer-executable instructions for determining the gesture by identifying a positive peak, a negative peak, and a zero crossing in a derivative of the signal. In still another example, the gesture comprises a fist clench. In yet another example, the optical sensor is configured to sense a wavelength of light that corresponds to a wavelength of light emitted by the light source.
  • An exemplary system for determining gestures can include a light source configured to emit light from a wearable user device toward a wearer's skin when the wearable user device is worn; an optical sensor configured to generate a signal based on sensing a portion of the light reflected by the wearer's skin; a non-transitory computer-readable storage medium including computer-executable instructions for determining a gesture made by the wearer based on changes in the signal; a processor coupled to receive the signal, wherein the processor is capable of executing the computer-executable instructions; and a communication module coupled to the processor, wherein the communication module is configured to communicate with a mobile device. In one example, the non-transitory computer-readable storage medium further includes instructions for communicating a command to the mobile device via the communication module in response to determining the gesture.
  • An exemplary computer-implemented method for indirectly determining a hand gesture using a wearable device worn on a wearer's wrist can include causing light to be emitted toward the wearer's wrist from a light source of the wearable device, wherein the light source is positioned proximate to skin at the wearer's wrist; sensing a portion of the light that is reflected by the skin at the wearer's wrist using an optical sensor of the wearable device, wherein the optical sensor is positioned proximate to the skin at the wearer's wrist; indirectly determining a hand gesture made by the wearer based on a change in the sensed portion of the light that is reflected by the skin at the wearer's wrist, wherein the change results from a distance between the optical sensor and the skin at the wearer's wrist increasing or decreasing due to the hand gesture. In one example, the distance between the optical sensor and the skin at the wearer's wrist increases or decreases less than 1/8th of an inch due to the hand gesture. In another example, the change in the sensed portion of the light can correspond to an intensity of the sensed portion of the light increasing or decreasing based on the increasing or decreasing distance between the optical sensor and the skin at the wearer's wrist. In still another example, indirectly determining the hand gesture made by the wearer includes sensing the change in the sensed portion of the light that is reflected by the skin at the wearer's wrist without directly sensing hand movement of the wearer.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates an exemplary wearable device near a user's skin.
  • FIG. 2 illustrates an exemplary wearable device separated a distance from a user's skin.
  • FIG. 3A illustrates the skin-facing or back side of an exemplary wearable device.
  • FIG. 3B illustrates the face or front side of an exemplary wearable device.
  • FIG. 4 illustrates an open hand of a user wearing an exemplary wearable device.
  • FIG. 5 illustrates a clenched fist of a user wearing an exemplary wearable device.
  • FIG. 6 illustrates an exemplary waveform of a sensed signal from an optical sensor of a wearable device.
  • FIG. 7 illustrates an exemplary derivative waveform of a sensed signal from an optical sensor of a wearable device.
  • FIG. 8 illustrates an exemplary derivative waveform with gesture recognition thresholds.
  • FIG. 9 illustrates an exemplary process for determining gestures using an optical sensor of a wearable device.
  • FIG. 10 illustrates an exemplary process for recognizing gestures using an optical sensor and performing functions corresponding to the recognized gestures.
  • FIG. 11 illustrates an exemplary system for detecting gestures.
  • DETAILED DESCRIPTION
  • In the following description of examples, reference is made to the accompanying drawings in which it is shown by way of illustration specific examples that can be practiced. It is to be understood that other examples can be used and structural changes can be made without departing from the scope of the various examples.
  • This relates to a wearable device with an optical sensor that can be used to recognize gestures of a user wearing the device. In one example, one or more light sources can be positioned on the back or skin-facing side of a wearable device, such as a watch, wristband, armband, leg-band, chest-band, headband, or the like. An optical sensor can be positioned near the one or more light sources on the same side of the wearable device. During operation, light can be emitted from the one or more light sources and sensed using the optical sensor. Changes in the sensed light caused by movements of the user wearing the device can be used to recognize user gestures. For example, light emitted from a light source can reflect off a wearer's skin, and the reflected light can be sensed using the optical sensor. When the wearer gestures in some way, such as by clenching the fist on the arm where the device is worn, the reflected light can change perceptibly due to muscle contraction, the device shifting on the wrist, skin stretching, the distance changing between the optical sensor and the wearer's skin, or the like. Various changing features of the reflected and sensed light can be used to recognize a deliberate gesture, such as changes in angle of incidence, intensity, position, wavelength, or the like.
  • In some examples, incorporating an optical sensor in a wearable device and using it to recognize wearer gestures can provide a convenient way to interact with the wearable device. Such gesture recognition can supplement or even replace other interfaces, such as touchscreens, buttons, dials, or the like, and it can be used to perform the same or similar functions, such as selecting a display element, navigating to a different view, changing the time, changing a display, answering a phone call, or any of a variety of other functions. In some examples, such gesture recognition can provide one-handed or hands-free device operation, which can provide convenience as well as safety (e.g., when carrying groceries, driving, or the like). It should be understood that many other applications are possible, and gesture recognition as discussed herein can provide a variety of other benefits and enhance a user's experience in many other ways.
  • FIG. 1 illustrates exemplary wearable device 100 near a wearer's skin surface 116. In one example, wearable device 100 can be configured to be worn on a user's wrist (e.g., as a bracelet, wristwatch, wristband, etc.), and skin surface 116 could be anywhere around the user's wrist depending on how the user prefers to wear the device (e.g., with an associated display near the palm or near the back side of the hand). Wearable device 100 can include, among other things, light source 104, light source 106, and optical sensor 102.
  • Light sources 104 and 106 can each include a light-emitting diode (LED) or another light source. In some examples, the light sources can be angled such that the bulk of the emitted light reflects back toward optical sensor 102 when the device is worn normally (e.g., angled such that emitted light can be incident on the wearer's skin at a non-perpendicular angle). Angling the light sources can also avoid arcing between the light source and the sensor (e.g., emitted light reaching the sensor without first reflecting off a user's skin). For example, light source 104 can be angled sixty degrees away from vertical toward optical sensor 102, such that the bulk of emitted light 108 can reflect off skin surface 116 toward optical sensor 102, as illustrated by reflected light 110. Similarly, light source 106 can be angled sixty degrees away from vertical toward optical sensor 102, such that the bulk of emitted light 112 can reflect off skin surface 116 toward optical sensor 102, as illustrated by reflected light 114. In other examples, light sources 104 and 106 can be angled more sharply vertical (e.g., thirty degrees from vertical toward optical sensor 102) or more sharply horizontal.
  • Light sources 104 and 106 can be configured in a variety of different ways for robust recognition of changes in reflected light 110 and reflected light 114. In one example, light sources 104 and 106 can include LEDs that emit infrared light in the 900 nm range. In other examples, light source 104 can be configured to emit light having a different wavelength than that of the light emitted by light source 106. In these examples, the light emitted by light source 104 and the light emitted by light source 106 can be selected such that they can be differentiated by optical sensor 102, and can be visible, ultraviolet, infrared, or other wavelengths. Light sources 104 and 106 can also be operated differently to allow for distinct reflected light recognition, such as by pulsing emitted light at different frequencies, emitting light at alternating intervals, or the like. In some examples, having two separate light sources can aid in distinguishing between localized movements and whole device movements (e.g., certain muscles contracting versus an entire device shifting uniformly). In still other examples, however, wearable device 100 can include a single light source (e.g., either light source 104 or light source 106), more than two light sources, more than one optical sensor, or various other combinations, or can include an optical sensor that can detect changes in ambient lighting reflected by skin surface 116 without a dedicated light source.
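  • As a minimal sketch of the alternating-interval operation just described, the Python fragment below pulses each LED in turn so that a single sensor reading can be attributed to one source. It is illustrative only: `set_led` and `read_sensor` are hypothetical stand-ins for device driver facilities, since the disclosure does not specify a driver interface.

```python
# Illustrative sketch of alternating-interval LED operation; `set_led` and
# `read_sensor` are assumed driver callables, not an API from the disclosure.
import time

LED_A = 0  # e.g., an infrared LED emitting near 900 nm
LED_B = 1  # a second LED, possibly at a different wavelength

def sample_reflections(set_led, read_sensor, settle_s=0.0005):
    """Pulse each LED in turn and read the optical sensor while only that
    LED is lit, so each reading can be attributed to a single source."""
    readings = {}
    for led in (LED_A, LED_B):
        set_led(led, on=True)
        time.sleep(settle_s)           # let the photodiode output settle
        readings[led] = read_sensor()  # reflected intensity for this LED
        set_led(led, on=False)
    return readings
```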
  • Optical sensor 102 can include a photodiode or other photodetector capable of converting light into a current, voltage, or other signal. In one example, optical sensor 102 can convert sensed light into a signal (e.g., current, voltage, etc.) with a magnitude or value that corresponds to the amount of light received by the sensor. In some examples, optical sensor 102 can be configured to sense particular wavelengths of light (e.g., those generated by light sources 104 and 106) while not sensing others. As illustrated in FIG. 1, distance 118 between skin surface 116 and wearable device 100 can dictate where reflected light 110 and reflected light 114 are incident on the skin-facing side of wearable device 100. While emitted light and reflected light are illustrated in FIG. 1 as straight lines for simplicity, it should be understood that light can spread into a larger beam and reflect at different angles. In the example of FIG. 1, with distance 118 separating wearable device 100 and skin surface 116, reflected light 110 and reflected light 114 are shown striking the edges of optical sensor 102 (e.g., the bulk of the light represented by straight lines striking the edges). In one example, optical sensor 102 might generate a signal representative of a relatively small amount of sensed light as much of reflected light 110 and reflected light 114 might strike the body of wearable device 100 rather than the sensor.
  • In contrast, FIG. 2 illustrates wearable device 100 separated from skin surface 116 by distance 220, which can be larger than distance 118 of FIG. 1. Light sources 104 and 106 can generate emitted light 222 and emitted light 226, which can be reflected off skin surface 116 as reflected light 224 and reflected light 228, respectively. Distance 220 can allow reflected light 224 and reflected light 228 to strike optical sensor 102 near the center of the sensor (as opposed to striking the edges of the sensor, as in FIG. 1). It should be understood that the straight light lines in FIG. 2 are drawn for simplicity, and the light can spread into a larger beam and reflect at different angles. As such, the more centered light incidence illustrated in FIG. 2 can signify that more of the spread of light can be received by the sensor than if the spread of light were incident elsewhere. With more of the spread of light incident near the center, optical sensor 102 might generate a signal representative of a relatively large amount of sensed light, as the bulk of reflected light 224 and reflected light 228 might strike the sensor itself rather than other components (e.g., being incident on the sensor rather than the body of wearable device 100). Optical sensor 102 can thus generate distinct signals that can be representative of the distance between wearable device 100 and skin surface 116.
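  • A toy geometric model can make this distance dependence concrete. The sketch below assumes idealized specular reflection off a flat skin surface, which is a deliberate simplification of the diffuse beam spread described above; the function name and the 60-degree emitter angle are illustrative values drawn from the example of FIG. 1.

```python
# Toy specular-reflection geometry: where does the center of an angled
# beam land on the device plane as the skin-to-device gap changes?
import math

def reflected_spot_offset(gap_mm, angle_from_vertical_deg):
    """Horizontal distance from the emitter at which a ray emitted at the
    given angle returns to the device plane after one specular bounce.
    The round trip covers 2 * gap * tan(angle) horizontally."""
    theta = math.radians(angle_from_vertical_deg)
    return 2.0 * gap_mm * math.tan(theta)

# With the 60-degree emitters of FIG. 1, a modest change in gap shifts
# the returning beam by millimeters, moving it on or off the sensor:
for gap in (0.5, 1.0, 2.0):
    print(f"gap {gap} mm -> spot at {reflected_spot_offset(gap, 60.0):.2f} mm")
```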
  • In some examples, a user's hand gesture can account for the difference between distance 118 in FIG. 1 and distance 220 in FIG. 2. As discussed in more detail below with reference to FIGS. 4-8, a user can gesture by clenching a fist, moving a finger, bending a wrist, or the like, which can cause the distance between wearable device 100 and skin surface 116 to change (e.g., change from distance 118 in FIG. 1 to distance 220 in FIG. 2). In some examples, a user's gesture can cause at least a slight shift in a wrist strap associated with a wearable device or a deformation in a user's skin, which can cause a change in distance between the device and the user's skin. Similarly, in some instances, contracting muscles can cause the wrist to expand, which can also change the distance between a wearable device and a user's skin. It should be understood that many other factors associated with a gesture can cause a change in the distance between a wearable device and a user's skin. The different signals generated by optical sensor 102 based on different distances can be used to recognize when a user gestures deliberately to interact with a wearable device.
  • In other examples, factors besides the distance between a wearable device and a skin surface can be used to recognize a gesture. For example, skin stretching, user perspiration, and the like can cause perceptible changes in reflected light that optical sensor 102 can detect. In some examples, light sources 104 and 106 can generate light that can at least partially penetrate the outer skin layer and reflect in different ways based on muscles contracting, tendons shifting, tissue compressing, skin stretching, density changing, pressure changing, or the like. In any of the various examples, optical sensor 102 can sense the reflected light and generate signals based on how emitted light is modified before being incident on the sensor. Changes in the signals generated by optical sensor 102 can be used to recognize wearer activity.
  • It should be understood that gesture recognition as discussed herein can be used to determine hand gestures indirectly by sensing small changes that occur at a wearer's wrist as opposed to directly sensing movements of a gesturing hand. For example, while a deliberate gesture might include clenching and unclenching a fist, an increasing or decreasing distance (e.g., changing less than 1/8th of an inch) between an optical sensor of a wearable device and the skin at the wearer's wrist can be sensed and used to indirectly recognize the gesture rather than directly detecting the motion of the clenching and unclenching fist (e.g., finger and palm movements). Light can be emitted toward the skin at a wearer's wrist, reflect off the skin at the wearer's wrist, and be sensed by an optical sensor positioned proximate to the skin at the wearer's wrist. The intensity of the sensed light can increase or decrease based on an increasing or decreasing distance between the optical sensor and the skin at the wearer's wrist, which changing distance can result from a deliberate hand gesture as discussed above. Accordingly, by monitoring small changes in light emitted and sensed at the wearer's wrist, a hand gesture can be indirectly recognized without directly sensing finger or palm movements.
  • It should further be understood that gesture recognition discussed herein can be applied to a variety of other body parts beyond the wrist and beyond wearable devices that are worn on the wrist, and wrist-worn devices discussed herein are provided as non-limiting examples of gesture recognition using optical sensors. For example, gesture recognition according to the examples herein can be performed using a device on an armband, leg-band, chest-band, headband, or the like. Device and skin movements can be detected, and those detected movements can be used to determine a gesture. For example, a wearable device in a headband can recognize a deliberate gesture from a user raising one or both eyebrows, causing the headband device to shift and/or perceptibly altering the skin on the forehead. Similarly, a wearable device in a leg-band or ankle-band can recognize a deliberate gesture from a user bending a leg, moving a foot, or the like, causing the leg-band or ankle-band device to shift and/or perceptibly altering the skin on the leg, ankle, or foot. As with the examples discussed herein, any movements that cause device shifting, distance changes, skin movements, or the like can be detected and used for determining a gesture. Accordingly, the various examples discussed herein can be applied to a variety of devices that can be worn near any part of the skin anywhere on the body, and gestures can be determined from changes in optical sensor signals of those various devices.
  • It should further be understood that the proportions, spacing, or layout of light sources, optical sensors, and wearable devices that can be used to recognize gestures as discussed herein can vary from those examples illustrated in FIG. 1 and FIG. 2. For example, light sources and optical sensors can be a fraction of the illustrated size, and light sources and optical sensors can be positioned adjacent one another in a much smaller space. Moreover, distances 118 and 220 may be exaggerated, and in some instances, optical sensor 102 can detect differences in distances that may be imperceptible to the human eye. For example, although a change in distance between a wearable device and a user's skin may not be apparent to the wearer when gesturing, optical sensor 102 can be sensitive enough to recognize a slight difference in distance (e.g., based on a slight change in wrist circumference during a gesture).
  • It should further be understood that wearable device 100 can include a variety of other components not illustrated or described here, such as a display, touchscreen, buttons, dials, central processing unit, memory, camera, indicator lights, oscillator, clock, wireless transceivers, batteries, motion sensors, light sensors, touch sensors, other sensors, or the like. It should also be understood that gesture recognition using an optical sensor, as discussed herein, can be implemented in a variety of different wearable devices with a variety of different functions.
  • FIG. 3A illustrates the skin-facing or back side of exemplary wearable device 100. In one example, wearable device 100 includes a wristwatch or other wearable device, which can be attached to wrist strap 330 for securing the device to a user's wrist. As illustrated, the skin-facing or back side of wearable device 100 can include light sources 104 and 106 separated by optical sensor 102. Light sources 104 and 106 can include LEDs, which can be positioned behind separate glass, plastic, or other transparent windows, or behind a single shared window. In some examples, optical sensor 102 can also be positioned behind the same or a different glass, plastic, or other transparent window. Optical sensor 102, and particularly a sensing or detecting area of optical sensor 102, can be sized to avoid interference from arm hair and other skin surface variations. For example, optical sensor 102 can be sized several times larger than a typical arm hair such that the sensor might not be easily blocked from sensing reflected light.
  • It should be understood that the proportions, spacing, or layout of light sources and optical sensors that can be used to recognize gestures can vary from the example illustrated in FIG. 3A. For example, light sources 104 and 106 and optical sensor 102 can be positioned within wrist strap 330, and wrist strap 330 can include wiring to source power from and enable communication with wearable device 100. In other examples, light sources 104 and 106 and optical sensor 102 can be oriented differently (e.g., in-line with wrist strap 330).
  • FIG. 3B illustrates the face or front side of exemplary wearable device 100 secured to wrist strap 330. In one example, wearable device 100 can include a display 332 on the front side, and a user can interact with information shown on display 332 using gestures. For example, images, data, virtual buttons, a clock, applications, music tracks, or the like can be displayed on display 332, and a user can perform certain gestures to make selections, advance to a new display, enable the display, return to a home screen, navigate forward, navigate backward, open a notification, scroll through information, scroll through applications, scroll through songs, pause playing music, start playing an audio file, answer a phone call, engage a virtual assistant, or perform any of a variety of other functions. Various gestures can be used to engage these functions, as discussed in more detail below with reference to FIG. 4 and FIG. 5.
  • In some examples, display 332 can include touchscreen capability that can be used in addition to or instead of gestures to interact with wearable device 100. Similarly, wearable device 100 can include a variety of other components not illustrated in FIG. 3A or FIG. 3B or described here, such as buttons, dials, central processing unit, memory, camera, indicator lights, oscillator, clock, wireless transceivers, batteries, motion sensors, light sensors, touch sensors, other sensors, or the like.
  • FIG. 4 illustrates an open hand of a user wearing exemplary wearable device 100. As mentioned above, a user can interact with wearable device 100 using a variety of gestures that can cause perceptible changes in light directed to a user's skin, reflected off the user's skin, and sensed by an optical sensor. A wide variety of movements can cause changes to reflected light that can alter the signal generated by an optical sensor, and such an altered signal can be interpreted as a deliberate gesture. Movements that can be interpreted as a deliberate gesture can include finger movements, hand movements, wrist movements, whole arm movements, or the like. For example, the open hand in FIG. 4 can be clenched into a fist as shown in FIG. 5 and opened or unclenched back to the position shown in FIG. 4 as a deliberate gesture for interacting with wearable device 100.
  • Notably, the wearer's wrist in FIG. 5 can be at least slightly altered as compared to the wearer's wrist in FIG. 4 (e.g., circumference expanded, skin stretched, muscles contracted, tendons shifted, tissue compressed, etc.). The at least slight alteration of the wrist from having an open hand as in FIG. 4 to having a clenched fist as in FIG. 5 can cause the distance between wearable device 100 and the wearer's skin to change (e.g., increase or decrease). As discussed above, an optical sensor associated with wearable device 100 can generate a signal that changes when the distance between the device and the wearer's skin changes, such that the motion of clenching the user's fist can be perceived along with the subsequent motion of unclenching the user's fist. In some examples, the sequential clenching and unclenching of the user's fist can be recognized as a deliberate gesture (e.g., like the equivalent of a mouse cursor click or touchscreen press to select a virtual button, answer a phone call, navigate forward, etc.).
  • In other examples, other motions can be used as gestures for interacting with wearable device 100. For example, the wearer can move the thumb inward to touch the palm and subsequently release the thumb back to its natural position (e.g., the position shown in FIG. 4). In another example, the wearer can contract the fingers (e.g., digits other than the thumb) to touch the palm and subsequently release them back to their natural position. In yet another example, the wearer can contract an individual finger (e.g., the index finger, middle finger, etc.) to touch the palm and subsequently release the finger to its natural position. In still other examples, the wearer can bend the hand downward at the wrist (e.g., flexion), bend the hand upward at the wrist (e.g., extension), twist the hand (e.g., pronation or supination), angle the hand to the left or right (e.g., radial deviation or ulnar deviation), or perform any of a variety of other deliberate motions that can cause at least slight alterations at the wrist that can be perceived by an optical sensor sensing changes in incident light.
  • In some examples, wearable device 100 can be configured to uniquely recognize different gestures for performing different functions as well as recognize different ways of performing the same gestures as indicating different user intent. For example, a wearer can clench the fist, hold the clenched fist for a certain amount of time, and subsequently open the hand to request a function different than one associated with clenching and unclenching the fist in rapid succession (e.g., as in the difference between a simple mouse cursor click and a double-click). In another example, a wearer can twist the hand in one direction to request one function (e.g., navigate forward) and twist the hand in the other direction to request a different function (e.g., navigate backward). In yet another example, a wearer can touch the thumb to the palm to request one function (e.g., select) and touch the fingers to the palm to request a different function (e.g., return to a home screen). It should be understood that a variety of different motions can be perceived and associated with a variety of different functions. In other examples, however, any of a group of different motions can each be perceived as a user request for the same function (e.g., any of fist clenching, wrist bending, or thumb touching can be interpreted as a request for the same function).
  • As wrist shape, size, musculature, and other characteristics vary for different users, wearable device 100 can be configured in some examples to perform learning or training to adapt to a particular wearer. For example, an application associated with wearable device 100 can instruct the wearer to perform certain movements (e.g., instruct the wearer to clench and unclench the fist by displaying the command on the device display, playing the command audibly, or the like). As the wearer performs the movements, wearable device 100 can monitor how signals generated by the optical sensor change in response to the requested movements. Wearable device 100 can then associate changes in the optical sensor signals with particular movements in order to recognize a deliberate gesture of the specific wearer in the future. In some instances, wearable device 100 can be trained differently for wearing on different hands or for wearing in different positions on the wrist (e.g., palm side or back side). In some examples, particular changes in the optical sensor signal and associated gestures can be stored in a database (e.g., in memory on the device, remote from the device, on another device, etc.), and sensed changes in the optical sensor signal can be compared to the stored data to recognize a particular gesture.
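  • One way such a prompt-and-record training loop might look is sketched below. Everything here is an assumption for illustration: `capture_trace` is a hypothetical callable returning one optical-sensor trace (a list of samples) recorded while the wearer performs the prompted movement, the two features are deliberately crude, and a plain dict stands in for the gesture database mentioned above.

```python
# Minimal per-wearer training sketch; `capture_trace`, the features, and
# the dict-as-database are all illustrative assumptions.
from statistics import mean

def extract_features(trace):
    """Crude features of one gesture trace: amplitude and spike width."""
    peak, base = max(trace), min(trace)
    width = sum(1 for v in trace if v > base + 0.5 * (peak - base))
    return peak - base, width

def train_gesture(name, capture_trace, db, repetitions=5):
    """Average features over several repetitions of a prompted movement
    and store them so later traces can be matched against this wearer."""
    feats = [extract_features(capture_trace()) for _ in range(repetitions)]
    db[name] = {
        "amplitude": mean(f[0] for f in feats),
        "width": mean(f[1] for f in feats),
    }
```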
  • FIG. 6 illustrates an exemplary waveform of a sensed signal from an optical sensor of a wearable device (e.g., such as from optical sensor 102 of wearable device 100 discussed above). In one example, the vertical axis can correspond to a magnitude of a signal generated by an optical sensor over time (e.g., in seconds on the horizontal axis). Such a signal can correspond to a current, voltage, or similar signal representative of sensed light. For example, the signal can be a current with units of milliamps, which can be quantized along a scale to enable comparison, calculation, and gesture recognition (e.g., can be digitized with 24 bits of precision).
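  • As a concrete illustration of the digitization mentioned above, the sketch below maps a photodiode current onto a 24-bit integer scale. The 1 mA full-scale figure is an assumed value for illustration, not one given in the disclosure.

```python
def quantize_24bit(current_ma, full_scale_ma=1.0):
    """Map a photodiode current onto a 24-bit integer scale; the 1 mA
    full-scale value is an assumed figure for illustration."""
    code = int((current_ma / full_scale_ma) * ((1 << 24) - 1))
    return max(0, min(code, (1 << 24) - 1))  # clamp to the valid range
```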
  • The waveform in FIG. 6 can correspond to a user wearing a wearable device with an optical sensor as discussed herein performing six repeated gestures, each gesture being separated by a gap of a few seconds. For example, each spike or gesture event 640 can correspond to the clenching and unclenching of the wearer's fist in rapid succession. As indicated, fist clench 642 can correspond to an increasing magnitude, and subsequent fist release 644 can correspond to a decreasing magnitude. In some examples, the combination of an increasing magnitude and a decreasing magnitude in rapid succession can be recognized as a deliberate gesture.
  • In the example illustrated in FIG. 6, the wearable device may have shifted relative to the corresponding wearer such that more light was sensed when the fist was clenched than when the fist was unclenched. As such, the waveform can be seen rising in magnitude with fist clench 642 and subsequently falling in magnitude with fist release 644. In other examples, however, the signal and the associated waveform can vary for different wearers or for the same wearer performing a different gesture. For example, the magnitude may first fall and then rise in succession for a different user or for a different gesture (e.g., bending the wrist). In another example, the peak magnitude of the spike associated with gesture event 640 can be lower or higher for different users or different gestures. Similarly, the width or the slope of a spike associated with a gesture event can vary for different users or different gestures (e.g., wider when holding the fist clenched to indicate a particular user request). It should be understood that various different gestures and wearer characteristics can produce a variety of different waveforms that can be used to recognize gesture events (e.g., deliberate user gestures indicating particular user intent). Moreover, it should be understood that in some examples a wearable device can learn or be trained to associate unique signals with a particular wearer's gestures.
  • In some examples, given how an optical sensor signal can vary for different users and different gestures, a derivative of the optical sensor signal can be used to achieve robust gesture recognition (e.g., while ignoring noise, casual movements, stray light pollution, etc.). FIG. 7 illustrates an exemplary derivative waveform of a sensed signal from an optical sensor of a wearable device. In one example, the derivative waveform of FIG. 7 can correspond to the raw signal waveform of FIG. 6. In some examples, positive peaks 750, negative peaks 752, and zero crossings 754 can be used to positively identify a deliberate user gesture for interacting with a wearable device. For example, having an occurrence of positive peak 750, followed by zero crossing 754, followed by negative peak 752 can be sufficient to identify recognized gesture event 756 (marked by rectangles). In another example, the order can vary such that a negative peak can be followed by a zero crossing and a positive peak and also be identified as a recognized gesture event. In yet another example, having an occurrence of a positive peak, a negative peak, and a zero crossing within a certain time window can be sufficient to recognize a gesture.
  • In some examples, gesture recognition thresholds can be used to filter out noise or other haphazard data and correctly recognize deliberate user gestures for interacting with a wearable device. FIG. 8 illustrates an exemplary derivative waveform with gesture recognition thresholds, including positive threshold 860 and negative threshold 862. In one example, for purposes of recognizing a deliberate gesture, a positive or negative peak in a derivative waveform can be recognized only when it falls outside predetermined thresholds, such as being outside positive threshold 860 and negative threshold 862. In some examples, peaks or other data movements within the thresholds can be ignored. For example, minor vibrations or casual hand motions can lead to significant movements in a raw optical sensor signal as well as in the derivative of the raw optical sensor signal. To avoid errantly recognizing gestures (e.g., false positives) from such movements, gesture recognition thresholds can be used to restrict positive gesture recognition to instances of significant movement outside the thresholds.
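  • The sketch below is one possible Python realization of the approach described with reference to FIG. 7 and FIG. 8: it differentiates the raw signal, keeps only derivative peaks outside the recognition thresholds, and reports a gesture when a positive peak, a zero crossing, and a negative peak (in either polarity order) occur within a time window. The threshold and window values are assumptions, not figures from the disclosure.

```python
# Derivative-and-threshold gesture detector (illustrative sketch).
def detect_gesture(samples, pos_thresh=0.2, neg_thresh=-0.2, window=50):
    deriv = [b - a for a, b in zip(samples, samples[1:])]
    events = []  # (sample index, event kind)
    for i, d in enumerate(deriv):
        if d > pos_thresh:
            events.append((i, "pos"))         # significant positive peak
        elif d < neg_thresh:
            events.append((i, "neg"))         # significant negative peak
        elif i > 0 and deriv[i - 1] * d < 0:
            events.append((i, "zero"))        # derivative sign change
    # Scan for pos -> zero -> neg (or neg -> zero -> pos) within the window.
    for (i0, k0), (_, k1), (i2, k2) in zip(events, events[1:], events[2:]):
        ordered = (k0, k1, k2) in (("pos", "zero", "neg"),
                                   ("neg", "zero", "pos"))
        if ordered and i2 - i0 <= window:
            return True  # pattern consistent with a deliberate gesture
    return False
```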
  • In other examples, a variety of other techniques and data can be used to positively recognize deliberate user gestures while filtering out noise and other haphazard data. For example, concavity changes, zero crossings, time between peaks, peak slope, peak width, and other data characteristics can be used to limit gesture recognition to particular expected signal behavior. In this manner, a wearable device can accurately recognize and respond to user gestures when appropriate, and avoid recognizing user interaction when none was intended.
  • FIG. 9 illustrates exemplary process 970, which can be used according to various examples discussed herein for determining gestures using an optical sensor in a wearable device. At block 972, light can be caused to be emitted from a wearable user device. For example, light can be emitted as discussed above with reference to light source 104 and/or light source 106 of wearable device 100 in FIG. 1. A light source can include a light-emitting diode (LED) that can be angled such that the bulk of the emitted light reflects back toward an optical sensor (e.g., optical sensor 102) when the device is worn normally (e.g., angled such that emitted light can be incident on the wearer's skin at a non-perpendicular angle). In some examples, two light sources can be used, and each light source can be configured to emit light having a different wavelength than that of the other light source. In such examples, the light emitted by individual light sources can be selected such that they can be differentiated by an optical sensor, and can be visible, ultraviolet, infrared, or other wavelengths. Light sources can also be operated differently to allow for distinct reflected light recognition, such as by pulsing emitted light at different frequencies, emitting light at alternating intervals, or the like. In some examples, having two separate light sources can aid in distinguishing between localized movements or whole device movements (e.g., certain muscles contracting versus an entire device shifting uniformly). In still other examples, however, a wearable device can include a single light source, more than two light sources, more than one optical sensor, or various other combinations, or can include an optical sensor that can detect changes in ambient lighting reflected by a skin surface without a dedicated light source.
  • Referring again to FIG. 9, at block 974, light can be sensed that is reflected by a user's skin. For example, reflected light can be sensed as discussed above with reference to optical sensor 102. Light can be sensed in a variety of ways, such as being sensed with a photodiode that generates a current, voltage, or other signal representing the relative amount of light incident on the sensor (e.g., the intensity of received light). In some examples, an optical sensor can be configured to sense primarily or exclusively light with similar characteristics (e.g., wavelength) as light emitted from light sources associated with a wearable device. In one example, the distance between a skin surface of a user and a wearable device can dictate where reflected light is incident on the skin-facing side of the wearable device, which can affect the intensity of light received at the sensor. For example, in a first position relative to a wearer's skin, an optical sensor might generate a signal representative of a relatively small amount of sensed light, while in another position relative to a wearer's skin, the optical sensor might generate a signal representative of a relatively large amount of sensed light given a shift in where the light is incident relative to the sensor. Movements of the wearable device and the optical sensor relative to a wearer's skin can thus cause the signal generated by the optical sensor to change as the intensity of light received at the sensor changes.
  • Referring again to FIG. 9, at block 976, a gesture can be determined based on changes in sensed light. For example, a gesture event can be determined from a raw optical sensor signal (e.g., as discussed with reference to FIG. 6), or from the derivative of an optical sensor signal (e.g., as discussed with reference to FIG. 7 or FIG. 8). A wide variety of movements can cause changes to reflected light that can alter the signal generated by an optical sensor, and such an altered signal can be interpreted as a deliberate gesture. Movements that can be interpreted as a deliberate gesture can include finger movements, hand movements, wrist movements, whole arm movements, or the like. For example, with a wristwatch-type wearable device, the wearer's wrist can be at least slightly altered between a clenched fist and an open hand (e.g., circumference expanded, skin stretched, muscles contracted, tendons shifted, tissue compressed, etc.). The at least slight alteration of the wrist from having an open hand to having a clenched fist can cause the distance between the wearable device and the wearer's skin to change (e.g., increase or decrease). As discussed above, an optical sensor associated with the wearable device can generate a signal that changes when the distance between the device and the wearer's skin changes, such that the motion of clenching the user's fist can be perceived along with the subsequent motion of unclenching the user's fist. In some examples, the sequential clenching and unclenching of the user's fist can be determined to be a deliberate gesture (e.g., like the equivalent of a mouse cursor click or touchscreen press to select a virtual button, answer a phone call, navigate forward, etc.).
  • In some examples, a derivative of the optical sensor signal can be used for gesture recognition. For example, having an occurrence of a positive peak, followed by a zero crossing, followed by a negative peak in the derivative of the optical sensor signal can be sufficient to determine a deliberate gesture. In another example, order can vary such that a negative peak can be followed by a zero crossing and a positive peak and also be identified as a deliberate gesture. In yet another example, having an occurrence of a positive peak, a negative peak, and a zero crossing within a certain time window can be sufficient to determine a deliberate gesture in a derivative of an optical sensor signal.
  • In other examples, a variety of other techniques and data can be used to positively determine deliberate user gestures while filtering out noise and other haphazard data. For example, concavity changes, zero crossings, time between peaks, peak slope, peak width, and other data characteristics can be used to limit gesture recognition to particular expected signal behavior. In this manner, a wearable device can accurately determine and respond to user gestures when appropriate, and avoid recognizing user interaction when none was intended.
  • In some examples, a determined gesture can be associated with a particular user request, and a wearable device can respond appropriately to the user request (e.g., a request to select a virtual button, navigate to a different view, answer a phone call, advance to a different song, or the like). Gesture recognition can thus be used in some examples to enable a user to interact with a wearable device without pushing buttons, touching a touchscreen, or the like.
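  • One simple way to realize the gesture-to-function association described above is a dispatch table. The sketch below is purely illustrative: the gesture names and handler functions are hypothetical placeholders, not identifiers from the disclosure.

```python
# Illustrative dispatch from recognized gestures to device functions; the
# gesture names and handlers are hypothetical placeholders.
def answer_call():
    print("answering incoming call")

def navigate_forward():
    print("navigating forward")

def pause_playback():
    print("pausing playback")

GESTURE_ACTIONS = {
    "fist_clench_release": answer_call,
    "wrist_twist_right": navigate_forward,
    "thumb_to_palm": pause_playback,
}

def handle_gesture(name):
    action = GESTURE_ACTIONS.get(name)
    if action is not None:
        action()  # perform the function associated with the gesture
```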
  • FIG. 10 illustrates exemplary process 1080, which can be used according to various examples discussed herein for recognizing gestures using an optical sensor and performing functions corresponding to the recognized gestures. At block 1082, gesture recognition can be enabled upon user engagement with a wearable device. In some examples, light sources and an optical sensor integrated into a wearable device for gesture recognition can be disabled, kept off, or used for functions other than gesture recognition when deliberate gestures may be unlikely (e.g., when a wearer is not engaging the device, when no one is wearing the device, when a wearer is running, etc.). Disabling gesture recognition using the light sources and optical sensor can conserve battery power, avoid false gesture detection, free those components for other functions, or the like.
  • Upon recognizing that a user is engaged with a device, gesture recognition can be automatically enabled (e.g., the light sources and optical sensor can be enabled for recognizing gestures). A variety of methods can be used to recognize that a user is engaged with a wearable device. For example, other sensors in the wearable device can be used to detect that a user has raised the wearable device, angled it toward the user's face, and is looking at the display (e.g., using an accelerometer, gyroscope, camera, proximity sensor, light sensor, or the like). In another example, a user can press a button, touch a touchscreen, say a command, shake the device, or the like to commence engagement with the wearable device. In yet another example, sensors in the wearable device can be used to detect that a user is both not moving and actively looking at the display before enabling active gesture recognition, as movement could lead to artifacts and false gesture detection. In still other examples, light sources and an optical sensor can be enabled at times when a user is not engaging the device, but signals generated by the optical sensor can be ignored or discarded until user engagement is detected.
  • At block 1084, light can be caused to be emitted from a wearable user device (e.g., as described with reference to block 972 of process 970). At block 1086, light can be sensed that is reflected by a user's skin (e.g., as described with reference to block 974 of process 970). At block 1088, a determination can be made as to whether changes in sensed light indicate a deliberate user gesture (e.g., a gesture intended to cause a functional response by the wearable device). For example, a deliberate gesture can be recognized from a raw optical sensor signal (e.g., as discussed with reference to FIG. 6), or from the derivative of an optical sensor signal (e.g., as discussed with reference to FIG. 7 or FIG. 8).
• If no gesture is indicated by changes in sensed light (e.g., the “no” branch of block 1088), light can again be caused to be emitted at block 1084 and reflected light can again be sensed at block 1086. In some examples, repeated polling cycles of causing light to be emitted and sensing reflected light can be used to determine whether a user has performed a deliberate gesture. Consecutive polling cycles can be separated by a time interval, or they can continue until a gesture is detected or another event breaks the cycle (e.g., the device powers off, user engagement terminates, a timeout interval passes, etc.).
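• Such a polling cycle might be sketched as below, reusing the detect_gesture() sketch above; emit_light() and sense_light() are hypothetical stand-ins for the device's LED and sensor drivers, and the rate and timeout values are assumptions.

    import time

    POLL_INTERVAL_S = 0.02  # assumed interval separating consecutive polling cycles

    def emit_light() -> None:
        pass  # placeholder for driving the device's light sources (block 1084)

    def sense_light() -> float:
        return 0.0  # placeholder for reading the optical sensor (block 1086)

    def poll_for_gesture(timeout_s: float = 10.0) -> bool:
        window: list[float] = []
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:  # a timeout interval breaks the cycle
            emit_light()
            window = (window + [sense_light()])[-64:]  # short sliding window
            if detect_gesture(window):                 # block 1088
                return True
            time.sleep(POLL_INTERVAL_S)
        return False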
• If a gesture is indicated by changes in sensed light (e.g., the “yes” branch of block 1088), a function corresponding to the recognized gesture can be performed at block 1090. For example, a particular recognized gesture can indicate a user's intent to answer an incoming call, so the function of answering the incoming call can be performed. In other examples, recognized gestures in other contexts can correspond to a variety of other functions that can be performed upon recognizing a user's deliberate gesture (e.g., navigate forward, navigate backward, pause music, open a notification, etc.). In some examples, after performing the function corresponding to the recognized gesture at block 1090, gesture recognition can continue (e.g., at block 1084) to allow a user to continue interacting with a wearable device using gestures. In other examples, gesture recognition can be disabled upon performance of a function at block 1090, or when the wearable device is powered off, a user ceases to engage the device, a timeout interval passes, or the like.
• Although various examples herein demonstrate how gesture recognition can be used to interact with a wearable device, it should be understood that gesture recognition can be used to interact with other devices as well. In some examples, a user can effectively generate commands using gestures that a wearable device can communicate to any of a variety of other devices, such as a mobile phone, television, audio system, media player, game console, lighting system, security system, tablet computer, or the like. For example, a wearable device can be in communication with a media player (e.g., via Wi-Fi, Bluetooth, the Internet, etc.), and the wearer can perform a recognizable gesture. In response to recognizing the gesture, the wearable device can transmit a corresponding command to the media player, such as navigate through a menu, pause playback, select content for display, or the like. In another example, a wearable device can be in communication with a mobile telephone (e.g., via Bluetooth), and the wearer can perform a recognizable gesture. In response to recognizing the gesture, the wearable device can transmit a corresponding command to the mobile telephone, such as answer a phone call, silence a ringtone, emit a sound to help locate the phone, or the like. It should be understood that gesture recognition can be employed for many other device-to-device interactions.
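• Purely for illustration, forwarding a recognized gesture as a command to a paired device might look like the sketch below; the plain TCP socket is a stand-in for the Bluetooth or Wi-Fi link contemplated above, and the host, port, and command strings are hypothetical.

    import json
    import socket

    def send_command(host: str, port: int, command: str) -> None:
        """Send a gesture-derived command to a paired device as a JSON message."""
        with socket.create_connection((host, port), timeout=2.0) as conn:
            conn.sendall(json.dumps({"command": command}).encode("utf-8"))

    # e.g., pause playback on a media player after recognizing a gesture:
    # send_command("192.168.1.20", 9000, "pause_playback")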
• In addition, although various examples herein demonstrate how gesture recognition can be achieved using a light source and an optical sensor, it should be understood that various other sensors and elements can be used in conjunction with a light source and an optical sensor to recognize deliberate user gestures for interacting with a wearable device. For example, accelerometers and/or gyroscopes can be used to detect movement and determine whether optical sensor signals represent deliberate gestures instead of artifacts of other random user movements. Similarly, certain gestures can yield unique optical sensor signals concurrently with unique accelerometer and/or gyroscope signals, and the simultaneous occurrence of a combination of such signals can be used to positively recognize a deliberate gesture. In other examples, other device elements, sensors, and signal combinations can be used to avoid false gesture detection and correctly recognize deliberate user gestures.
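• Such signal fusion might be sketched as a simple gate in which an optically detected gesture is accepted only when concurrent accelerometer and gyroscope readings agree; all thresholds below are invented for illustration.

    def fused_gesture(optical_detected: bool,
                      accel_peak_g: float,
                      gyro_peak_dps: float) -> bool:
        """Accept an optical detection only when inertial signals concur."""
        accel_agrees = 0.1 <= accel_peak_g <= 1.5  # plausible clench, not a jolt
        gyro_agrees = gyro_peak_dps < 120.0        # wrist not rotating wildly
        return optical_detected and accel_agrees and gyro_agrees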
• One or more of the functions described above relating to determining gestures can be performed by a system similar or identical to system 1100 shown in FIG. 11. System 1100 can include instructions stored in a non-transitory computer readable storage medium, such as memory 1102 or storage device 1101, and executed by processor 1105. The instructions can also be stored and/or transported within any non-transitory computer readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “non-transitory computer readable storage medium” can be any medium that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device. The non-transitory computer readable storage medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, a portable computer diskette (magnetic), a random access memory (RAM) (magnetic), a read-only memory (ROM) (magnetic), an erasable programmable read-only memory (EPROM) (magnetic), a portable optical disc such as CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW, flash memory such as compact flash cards, Secure Digital (SD) cards, USB memory devices, memory sticks, and the like, or any type of database.
  • The instructions can also be propagated within any transport medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “transport medium” can be any medium that can communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The transport medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, or infrared wired or wireless propagation medium.
  • System 1100 can also include an input/output (“I/O”) module, such as I/O module 1103, which can enable processor 1105 to communicate with other components of system 1100 as well as peripherals, devices, servers, databases, and the like. For example, I/O module 1103 can include a transceiver, radio, modem, or the like that can enable processor 1105 to communicate with an external device through wired or wireless communication means, including LAN, WAN, Wi-Fi, Bluetooth, cellular, or the like.
  • System 1100 can further include touch sensitive display 1107 coupled to processor 1105 for detecting touch and displaying information. It is to be understood that the system is not limited to the components and configuration of FIG. 11, but can include other or additional components in multiple configurations according to various examples. Additionally, the components of system 1100 can be included within a single device, or can be distributed among multiple devices. In some examples, processor 1105 can be located within touch sensitive display 1107.
• Therefore, according to the above, some examples of the disclosure are directed to a computer-implemented method for determining gestures, the method comprising: causing light to be emitted from a wearable user device; sensing a portion of the light that is reflected by a wearer's skin; and determining a gesture made by the wearer based on changes in the sensed portion of the light. Additionally or alternatively to one or more of the examples disclosed above, in some examples the changes in the sensed portion of the light correspond to a change in a distance between an optical sensor of the wearable user device and the wearer's skin; and sensing the portion of the light that is reflected by the wearer's skin comprises sensing the portion of the light using the optical sensor. Additionally or alternatively to one or more of the examples disclosed above, in some examples the changes in the sensed portion of the light correspond to a change in an intensity of the sensed portion of the light reflected by the wearer's skin. Additionally or alternatively to one or more of the examples disclosed above, in some examples causing the light to be emitted from the wearable user device comprises: causing the light to be emitted from a first LED at a first wavelength and a second LED at a second wavelength that is different from the first wavelength. Additionally or alternatively to one or more of the examples disclosed above, in some examples causing the light to be emitted from the wearable user device comprises: causing the light to be emitted at an angle relative to the wearable device, wherein the emitted light is incident on the wearer's skin at a non-perpendicular angle. Additionally or alternatively to one or more of the examples disclosed above, in some examples sensing the portion of the light that is reflected by the wearer's skin comprises: generating a signal based on the sensed portion of the light using an optical sensor positioned between a first LED and a second LED; and causing the light to be emitted from the wearable user device comprises causing the light to be emitted from the first LED and the second LED. Additionally or alternatively to one or more of the examples disclosed above, in some examples determining the gesture based on changes in the sensed portion of the light comprises: identifying a positive peak, a negative peak, and a zero crossing in a derivative of a signal generated by an optical sensor used for sensing the portion of the light that is reflected by the wearer's skin. Additionally or alternatively to one or more of the examples disclosed above, in some examples the gesture comprises a fist clench.
  • According to the above, other examples of the disclosure are directed to a non-transitory computer-readable storage medium comprising computer-executable instructions for performing any of the methods described above; and a database can be coupled to the non-transitory computer-readable storage medium, wherein the database comprises gesture recognition data.
  • According to the above, other examples of the disclosure are directed to a system comprising: the non-transitory computer-readable storage medium discussed above; and a processor capable of executing the computer-executable instructions.
• According to the above, other examples of the disclosure are directed to a wearable device for determining gestures, the device comprising: a light source configured to emit light from the device toward a wearer's skin when the wearable device is worn; an optical sensor configured to generate a signal based on sensing a portion of the light reflected by the wearer's skin; a non-transitory computer-readable storage medium comprising computer-executable instructions for determining a gesture made by the wearer based on changes in the signal; and a processor coupled to receive the signal, wherein the processor is capable of executing the computer-executable instructions. Additionally or alternatively to one or more of the examples disclosed above, in some examples the changes in the signal correspond to a change in a distance between the optical sensor and the wearer's skin. Additionally or alternatively to one or more of the examples disclosed above, in some examples the signal generated by the optical sensor changes based on an intensity of the sensed portion of the light reflected by the wearer's skin. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light source comprises a first LED configured to emit light at a first wavelength and a second LED configured to emit light at a second wavelength that is different from the first wavelength. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light source comprises a first LED and a second LED; and the optical sensor is positioned between the first LED and the second LED. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light source is angled relative to the wearable device to direct light to be incident on the wearer's skin at a non-perpendicular angle. Additionally or alternatively to one or more of the examples disclosed above, in some examples the computer-executable instructions for determining the gesture comprise computer-executable instructions for determining the gesture by identifying a positive peak, a negative peak, and a zero crossing in a derivative of the signal. Additionally or alternatively to one or more of the examples disclosed above, in some examples the gesture comprises a fist clench. Additionally or alternatively to one or more of the examples disclosed above, in some examples the optical sensor is configured to sense a wavelength of light that corresponds to a wavelength of light emitted by the light source.
  • According to the above, other examples of the disclosure are directed to a system for determining gestures, the system comprising: a light source configured to emit light from a wearable user device toward a wearer's skin when the wearable user device is worn; an optical sensor configured to generate a signal based on sensing a portion of the light reflected by the wearer's skin; a non-transitory computer-readable storage medium comprising computer-executable instructions for determining a gesture made by the wearer based on changes in the signal; a processor coupled to receive the signal, wherein the processor is capable of executing the computer-executable instructions; and a communication module coupled to the processor, wherein the communication module is configured to communicate with a mobile device. Additionally or alternatively to one or more of the examples disclosed above, in some examples the non-transitory computer-readable storage medium further comprises instructions for communicating a command to the mobile device via the communication module in response to determining the gesture.
  • According to the above, other examples of the disclosure are directed to a computer-implemented method for indirectly determining a hand gesture using a wearable device worn on a wearer's wrist, the method comprising: causing light to be emitted toward the wearer's wrist from a light source of the wearable device, wherein the light source is positioned proximate to skin at the wearer's wrist; sensing a portion of the light that is reflected by the skin at the wearer's wrist using an optical sensor of the wearable device, wherein the optical sensor is positioned proximate to the skin at the wearer's wrist; indirectly determining a hand gesture made by the wearer based on a change in the sensed portion of the light that is reflected by the skin at the wearer's wrist, wherein the change results from a distance between the optical sensor and the skin at the wearer's wrist increasing or decreasing due to the hand gesture. Additionally or alternatively to one or more of the examples disclosed above, in some examples the distance between the optical sensor and the skin at the wearer's wrist increases or decreases less than 1/8th of an inch due to the hand gesture. Additionally or alternatively to one or more of the examples disclosed above, in some examples the change in the sensed portion of the light corresponds to an intensity of the sensed portion of the light increasing or decreasing based on the increasing or decreasing distance between the optical sensor and the skin at the wearer's wrist. Additionally or alternatively to one or more of the examples disclosed above, in some examples indirectly determining the hand gesture made by the wearer comprises: sensing the change in the sensed portion of the light that is reflected by the skin at the wearer's wrist without directly sensing hand movement of the wearer.
  • Although examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the various examples as defined by the appended claims.

Claims (22)

1. A computer-implemented method for determining gestures, the method comprising:
causing light to be emitted from a wearable user device;
sensing a portion of the light that is reflected by a wearer's skin; and
determining a gesture made by the wearer based on changes in the sensed portion of the light.
2. The method of claim 1, wherein the changes in the sensed portion of the light correspond to a change in a distance between an optical sensor of the wearable user device and the wearer's skin; and
wherein sensing the portion of the light that is reflected by the wearer's skin comprises sensing the portion of the light using the optical sensor.
3. The method of claim 1, wherein the changes in the sensed portion of the light correspond to a change in an intensity of the sensed portion of the light reflected by the wearer's skin.
4. The method of claim 1, wherein causing the light to be emitted from the wearable user device comprises:
causing the light to be emitted from a first LED at a first wavelength and a second LED at a second wavelength that is different from the first wavelength.
5. The method of claim 1, wherein causing the light to be emitted from the wearable user device comprises:
causing the light to be emitted at an angle relative to the wearable device, wherein the emitted light is incident on the wearer's skin at a non-perpendicular angle.
6. The method of claim 1, wherein sensing the portion of the light that is reflected by the wearer's skin comprises:
generating a signal based on the sensed portion of the light using an optical sensor positioned between a first LED and a second LED; and
wherein causing the light to be emitted from the wearable user device comprises causing the light to be emitted from the first LED and the second LED.
7. The method of claim 1, wherein determining the gesture based on changes in the sensed portion of the light comprises:
identifying a positive peak, a negative peak, and a zero crossing in a derivative of a signal generated by an optical sensor used for sensing the portion of the light that is reflected by the wearer's skin.
8. The method of claim 1, wherein the gesture comprises a fist clench.
9-10. (canceled)
11. A wearable device for determining gestures, the device comprising:
a light source configured to emit light from the device toward a wearer's skin when the wearable device is worn;
an optical sensor configured to generate a signal based on sensing a portion of the light reflected by the wearer's skin;
a non-transitory computer-readable storage medium comprising computer-executable instructions for determining a gesture made by the wearer based on changes in the signal; and
a processor coupled to receive the signal, wherein the processor is capable of executing the computer-executable instructions.
12. The wearable device of claim 11, wherein the changes in the signal correspond to a change in a distance between the optical sensor and the wearer's skin.
13. The wearable device of claim 11, wherein the signal generated by the optical sensor changes based on an intensity of the sensed portion of the light reflected by the wearer's skin.
14. The wearable device of claim 11, wherein the light source comprises a first LED configured to emit light at a first wavelength and a second LED configured to emit light at a second wavelength that is different from the first wavelength.
15. The wearable device of claim 11, wherein the light source comprises a first LED and a second LED; and
wherein the optical sensor is positioned between the first LED and the second LED.
16. The wearable device of claim 11, wherein the light source is angled relative to the wearable device to direct light to be incident on the wearer's skin at a non-perpendicular angle.
17. The wearable device of claim 11, wherein the computer-executable instructions for determining the gesture comprise computer-executable instructions for determining the gesture by identifying a positive peak, a negative peak, and a zero crossing in a derivative of the signal.
18. The wearable device of claim 11, wherein the gesture comprises a fist clench.
19. The wearable device of claim 11, wherein the optical sensor is configured to sense a wavelength of light that corresponds to a wavelength of light emitted by the light source.
20. A system for determining gestures, the system comprising:
a light source configured to emit light from a wearable user device toward a wearer's skin when the wearable user device is worn;
an optical sensor configured to generate a signal based on sensing a portion of the light reflected by the wearer's skin;
a non-transitory computer-readable storage medium comprising computer-executable instructions for determining a gesture made by the wearer based on changes in the signal;
a processor coupled to receive the signal, wherein the processor is capable of executing the computer-executable instructions; and
a communication module coupled to the processor, wherein the communication module is configured to communicate with a mobile device.
21. The system of claim 20, wherein the non-transitory computer-readable storage medium further comprises instructions for communicating a command to the mobile device via the communication module in response to determining the gesture.
22. A computer-implemented method for indirectly determining a hand gesture using a wearable device worn on a wearer's wrist, the method comprising:
causing light to be emitted toward the wearer's wrist from a light source of the wearable device, wherein the light source is positioned proximate to skin at the wearer's wrist;
sensing a portion of the light that is reflected by the skin at the wearer's wrist using an optical sensor of the wearable device, wherein the optical sensor is positioned proximate to the skin at the wearer's wrist;
indirectly determining a hand gesture made by the wearer based on a change in the sensed portion of the light that is reflected by the skin at the wearer's wrist, wherein the change results from a distance between the optical sensor and the skin at the wearer's wrist increasing or decreasing due to the hand gesture.
23-25. (canceled)
US15/117,692 2014-02-10 2014-02-10 Motion gesture input detected using optical sensors Abandoned US20160357265A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2014/015647 WO2015119637A1 (en) 2014-02-10 2014-02-10 Motion gesture input detected using optical sensors

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/015647 A-371-Of-International WO2015119637A1 (en) 2014-02-10 2014-02-10 Motion gesture input detected using optical sensors

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/169,048 Continuation US11422635B2 (en) 2014-02-10 2021-02-05 Optical sensing device

Publications (1)

Publication Number Publication Date
US20160357265A1 true US20160357265A1 (en) 2016-12-08

Family

ID=50185052

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/117,692 Abandoned US20160357265A1 (en) 2014-02-10 2014-02-10 Motion gesture input detected using optical sensors
US17/169,048 Active US11422635B2 (en) 2014-02-10 2021-02-05 Optical sensing device

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/169,048 Active US11422635B2 (en) 2014-02-10 2021-02-05 Optical sensing device

Country Status (7)

Country Link
US (2) US20160357265A1 (en)
EP (1) EP3100134B1 (en)
JP (1) JP6427598B2 (en)
KR (4) KR20200119912A (en)
CN (2) CN110045824B (en)
AU (3) AU2014381638B2 (en)
WO (1) WO2015119637A1 (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160162038A1 (en) * 2014-05-20 2016-06-09 Huawei Technologies Co., Ltd. Method For Performing Operation On Intelligent Wearing Device By Using Gesture, And Intelligent Wearing Device
US20170031453A1 (en) * 2014-02-12 2017-02-02 Koninklijke Philips N.V. Movement detection apparatus for detecting a hand movement
US20180084385A1 (en) * 2015-04-15 2018-03-22 Huawei Technologies Co., Ltd. Message sending method in local area network, local area network gateway, and wearable device
US10094661B2 (en) * 2014-09-24 2018-10-09 Pixart Imaging Inc. Optical sensor and optical sensor system
US20180372871A1 (en) * 2017-06-25 2018-12-27 Pixart Imaging Inc. Object state determining apparatus and object state determining method
US20190109905A1 (en) * 2016-09-21 2019-04-11 Semiconductor Components Industries, Llc Low power sensor communication using two or fewer wires
CN112136162A (en) * 2018-05-15 2020-12-25 皇家飞利浦有限公司 Wrist-worn medical alert device for communicating emergency messages to care providers
WO2021126223A1 (en) * 2019-12-19 2021-06-24 Google Llc Direct manipulation of display device using wearable computing device
US20210303081A1 (en) * 2015-09-30 2021-09-30 Apple Inc. Systems and apparatus for object detection
US20210303068A1 (en) * 2020-03-31 2021-09-30 Apple Inc. Skin-to-skin contact detection
US11397468B2 (en) 2020-03-31 2022-07-26 Apple Inc. Skin-to-skin contact detection
US11422635B2 (en) 2014-02-10 2022-08-23 Apple Inc. Optical sensing device
US11422692B2 (en) 2018-09-28 2022-08-23 Apple Inc. System and method of controlling devices using motion gestures
US11460919B1 (en) * 2021-03-16 2022-10-04 Zeit Technologies Corporation Wearable gesture detector
US11790698B2 (en) 2019-11-01 2023-10-17 Samsung Electronics Co., Ltd. Electronic device for recognizing gesture of user using a plurality of sensor signals

Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015102588A1 (en) 2013-12-30 2015-07-09 Apple Inc. User identification system based on plethysmography
US10488936B2 (en) 2014-09-30 2019-11-26 Apple Inc. Motion and gesture input from a wearable device
CN105162979B (en) * 2015-08-26 2019-02-05 Oppo广东移动通信有限公司 A kind of incoming call mute control method and smartwatch
US9939899B2 (en) 2015-09-25 2018-04-10 Apple Inc. Motion and gesture input from a wearable device
CN106020446A (en) * 2016-05-05 2016-10-12 广东小天才科技有限公司 An operation control method and system for an intelligent wearable apparatus
CN106020444A (en) * 2016-05-05 2016-10-12 广东小天才科技有限公司 An operation control method and system for intelligent wearable apparatuses
TWI585711B (en) * 2016-05-24 2017-06-01 泰金寶電通股份有限公司 Method for obtaining care information, method for sharing care information, and electronic apparatus therefor
GB2552219A (en) * 2016-07-15 2018-01-17 Sony Interactive Entertainment Inc Wearable input device
US10478099B2 (en) 2016-09-22 2019-11-19 Apple Inc. Systems and methods for determining axial orientation and location of a user's wrist
WO2018058462A1 (en) * 2016-09-29 2018-04-05 深圳市柔宇科技有限公司 Control method, control device and smart wearable apparatus
US10120455B2 (en) * 2016-12-28 2018-11-06 Industrial Technology Research Institute Control device and control method
US11185279B2 (en) * 2017-01-18 2021-11-30 Koninklijke Philips N.V. Detecting erythema caused by wearable devices
US10558278B2 (en) 2017-07-11 2020-02-11 Apple Inc. Interacting with an electronic device through physical movement
CN109791439A (en) * 2017-07-24 2019-05-21 深圳市柔宇科技有限公司 Gesture identification method, head wearable device and gesture identifying device
EP3477452B1 (en) * 2017-10-27 2022-07-06 Vestel Elektronik Sanayi ve Ticaret A.S. Electronic device and method of operating an electronic device
KR20190059443A (en) 2017-11-23 2019-05-31 주식회사 라온즈 Optical sensing module for wearable smart device and portable smart device
WO2019173011A1 (en) * 2018-03-09 2019-09-12 Apple Inc. Electronic device including contactless palm biometric sensor and related methods
CN109240490A (en) * 2018-08-14 2019-01-18 歌尔科技有限公司 Intelligent wearable device and the interaction control method based on it, system
CN109271026B (en) * 2018-09-14 2022-03-15 歌尔科技有限公司 Wearable device, method, system, device and storage medium for realizing mouse function
CN109692471A (en) * 2018-12-29 2019-04-30 歌尔科技有限公司 A kind of wearable device and exchange method
US11537217B2 (en) * 2019-01-28 2022-12-27 Ams Sensors Singapore Pte. Ltd. Device including an optoelectronic module operable to respond to a user's finger movements for controlling the device
US20220244170A1 (en) * 2019-06-28 2022-08-04 3M Innovative Properties Company State detection of material surfaces of wearable objects using color sensing
CN112462932A (en) * 2019-09-06 2021-03-09 苹果公司 Gesture input system with wearable or handheld device based on self-mixing interferometry
CN110658889A (en) * 2019-09-23 2020-01-07 上海闻泰信息技术有限公司 Wearable device control method, wearable device control device, wearable device control equipment and storage medium
CN110703923A (en) * 2019-10-25 2020-01-17 胡团伟 Instruction output method and device based on wrist or hand skin three-dimensional shape change
US11016535B1 (en) * 2020-01-21 2021-05-25 Pixart Imaging Inc. Electronic device capable of detecting wearing state or touching state
IT202000011221A1 (en) * 2020-05-15 2021-11-15 St Microelectronics Srl SYSTEM AND METHOD OF DETECTING THE LIFTING AND LOWERING OF A USER'S FOOT FOR THE PURPOSE OF ENABLING A FUNCTIONALITY OF A USER'S DEVICE, AND THE USER'S DEVICE
CN112416130B (en) * 2020-11-24 2023-03-21 谢昊 Intelligent motion detection device and method
US20220173088A1 (en) * 2020-12-02 2022-06-02 Asti Global Inc., Taiwan Method of manufacturing a display module and full screen image display device including a display module prepared in accordance with the aforementioned method
CN113370272B (en) * 2021-05-27 2022-12-13 西安交通大学 Pose monitoring system and method of multi-segment continuum robot
CN113672093B (en) * 2021-08-27 2024-02-02 歌尔科技有限公司 Gesture recognition method and device, wearing equipment main body and wearing equipment
CN114722968A (en) * 2022-04-29 2022-07-08 中国科学院深圳先进技术研究院 Method for identifying limb movement intention and electronic equipment

Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4819197A (en) * 1987-10-01 1989-04-04 Canadian Patents And Development Limited-Societe Canadienne Des Brevets Et D'exploitation Limitee Peak detector and imaging system
US6747632B2 (en) 1997-03-06 2004-06-08 Harmonic Research, Inc. Wireless control device
JP2001133300A (en) 1999-11-04 2001-05-18 Sony Corp Apparatus and method for recognizing action, and apparatus of presenting inner force sensor, and its control method
JP2002360530A (en) * 2001-06-11 2002-12-17 Waatekkusu:Kk Pulse wave sensor and pulse rate detector
US7123351B1 (en) 2002-08-20 2006-10-17 Schaefer Philip R Method and apparatus for measuring distances using light
ATE533092T1 (en) * 2003-12-17 2011-11-15 Eta Sa Mft Horlogere Suisse PORTABLE ELECTRONIC DEVICE EQUIPPED WITH A PRESSURE SENSOR
JP4379214B2 (en) 2004-06-10 2009-12-09 日本電気株式会社 Mobile terminal device
US9854975B2 (en) * 2006-06-12 2018-01-02 Koninklijke Philips N.V. Skin monitoring device, method of monitoring the skin, monitoring device, method of irradiating the skin, and use of an OLED
JP4736052B2 (en) * 2006-07-21 2011-07-27 ブラザー工業株式会社 Operation signal output device and operation system
US20080256494A1 (en) * 2007-04-16 2008-10-16 Greenfield Mfg Co Inc Touchless hand gesture device controller
KR20090034642A (en) * 2007-10-04 2009-04-08 박가람 Bracelet keyboard and key-in method therby
CN101655739B (en) * 2008-08-22 2012-07-04 原创奈米科技股份有限公司 Device for three-dimensional virtual input and simulation
US8289162B2 (en) * 2008-12-22 2012-10-16 Wimm Labs, Inc. Gesture-based user interface for a wearable portable device
CN102301314B (en) * 2009-02-05 2015-07-01 株式会社eRCC Input device, wearable computer, and input method
US8294105B2 (en) 2009-05-22 2012-10-23 Motorola Mobility Llc Electronic device with sensing assembly and method for interpreting offset gestures
JP4548542B1 (en) * 2009-06-30 2010-09-22 ソニー株式会社 Information processing apparatus, information processing method, and program
KR20110022520A (en) * 2009-08-27 2011-03-07 한국전자통신연구원 Apparatus and method for detecting motion of finger
CN201548915U (en) * 2009-10-21 2010-08-11 昆盈企业股份有限公司 Wearable input device
US9241635B2 (en) * 2010-09-30 2016-01-26 Fitbit, Inc. Portable monitoring devices for processing applications and processing analysis of physiological conditions of a user associated with the portable monitoring device
US9188460B2 (en) * 2010-09-30 2015-11-17 Fitbit, Inc. Methods, systems and devices for generating real-time activity data updates to display devices
JP2012181639A (en) 2011-03-01 2012-09-20 Stanley Electric Co Ltd Operation input device
US8930300B2 (en) * 2011-03-31 2015-01-06 Qualcomm Incorporated Systems, methods, and apparatuses for classifying user activity using temporal combining in a mobile device
KR101765771B1 (en) * 2011-05-05 2017-08-07 맥심 인터그래이티드 프로덕츠 인코포레이티드 Method for detecting gestures using a multi-segment photodiode and one or fewer illumination sources
US20120280900A1 (en) * 2011-05-06 2012-11-08 Nokia Corporation Gesture recognition using plural sensors
CN102801409B (en) * 2011-05-23 2014-11-26 烟台瑞和控电信息系统有限公司 Gesture-recognition-based intelligent switch
US20130033418A1 (en) * 2011-08-05 2013-02-07 Qualcomm Incorporated Gesture detection using proximity or light sensors
CN203241934U (en) * 2012-05-15 2013-10-16 意法半导体有限公司 System for identifying hand gestures, user input device and processor
GB2502087A (en) * 2012-05-16 2013-11-20 St Microelectronics Res & Dev Gesture recognition
US9348462B2 (en) * 2012-06-13 2016-05-24 Maxim Integrated Products, Inc. Gesture detection and recognition based upon measurement and tracking of light intensity ratios within an array of photodetectors
EP2834048B1 (en) 2012-09-21 2017-11-01 iRobot Corporation Proximity sensing on mobile robots
CN104027103A (en) 2013-03-06 2014-09-10 精工爱普生株式会社 BIOLOGICAL INFORMATION DETECTING DEVICE and HEART RATE METER
US20160293143A1 (en) 2013-11-22 2016-10-06 Sharp Kabushiki Kaisha Communication terminal, method of controlling communication terminal and control program
US9582034B2 (en) * 2013-11-29 2017-02-28 Motiv, Inc. Wearable computing device
CN110045824B (en) 2014-02-10 2022-06-17 苹果公司 Motion gesture input detected using optical sensors

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020033803A1 (en) * 2000-08-07 2002-03-21 The Regents Of The University Of California Wireless, relative-motion computer input device
US20090174578A1 (en) * 2006-07-21 2009-07-09 Brother Kogyo Kabushiki Kaisha Operating apparatus and operating system
US20110054360A1 (en) * 2009-08-27 2011-03-03 Electronics And Telecommunications Research Institute Finger motion detecting apparatus and method
US20110148568A1 (en) * 2009-12-18 2011-06-23 Electronics And Telecommunications Research Institute Apparatus and method for controlling contents player
US20150051473A1 (en) * 2013-08-15 2015-02-19 Covidien Lp Systems and methods for photoacoustic spectroscopy
CN103703495A (en) * 2013-08-23 2014-04-02 华为技术有限公司 Remote control device, information processing method and system
US20150185837A1 (en) * 2013-12-27 2015-07-02 Kofi C. Whitney Gesture-based waking and control system for wearable devices

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11422635B2 (en) 2014-02-10 2022-08-23 Apple Inc. Optical sensing device
US20170031453A1 (en) * 2014-02-12 2017-02-02 Koninklijke Philips N.V. Movement detection apparatus for detecting a hand movement
US9910508B2 (en) * 2014-02-12 2018-03-06 Koninklijke Philips, N.V. Movement detection apparatus for detecting a hand movement
US20180129297A1 (en) * 2014-02-12 2018-05-10 Koninklijke Philips N.V. Movement detection apparatus for detecting a hand movement
US20160162038A1 (en) * 2014-05-20 2016-06-09 Huawei Technologies Co., Ltd. Method For Performing Operation On Intelligent Wearing Device By Using Gesture, And Intelligent Wearing Device
US10094661B2 (en) * 2014-09-24 2018-10-09 Pixart Imaging Inc. Optical sensor and optical sensor system
US20180084385A1 (en) * 2015-04-15 2018-03-22 Huawei Technologies Co., Ltd. Message sending method in local area network, local area network gateway, and wearable device
US10863309B2 (en) * 2015-04-15 2020-12-08 Huawei Technologies Co., Ltd. Message sending method in local area network, local area network gateway, and wearable device
US20210303081A1 (en) * 2015-09-30 2021-09-30 Apple Inc. Systems and apparatus for object detection
US20190109905A1 (en) * 2016-09-21 2019-04-11 Semiconductor Components Industries, Llc Low power sensor communication using two or fewer wires
US10637927B2 (en) * 2016-09-21 2020-04-28 Semiconductor Components Industries, Llc Low power sensor communication using two or fewer wires
US20180372871A1 (en) * 2017-06-25 2018-12-27 Pixart Imaging Inc. Object state determining apparatus and object state determining method
US10598786B2 (en) * 2017-06-25 2020-03-24 Pixart Imaging Inc. Object state determining apparatus and object state determining method
CN112136162A (en) * 2018-05-15 2020-12-25 皇家飞利浦有限公司 Wrist-worn medical alert device for communicating emergency messages to care providers
US11422692B2 (en) 2018-09-28 2022-08-23 Apple Inc. System and method of controlling devices using motion gestures
US11790698B2 (en) 2019-11-01 2023-10-17 Samsung Electronics Co., Ltd. Electronic device for recognizing gesture of user using a plurality of sensor signals
WO2021126223A1 (en) * 2019-12-19 2021-06-24 Google Llc Direct manipulation of display device using wearable computing device
US11301040B2 (en) 2019-12-19 2022-04-12 Google Llc Direct manipulation of display device using wearable computing device
US20210303068A1 (en) * 2020-03-31 2021-09-30 Apple Inc. Skin-to-skin contact detection
US11397466B2 (en) * 2020-03-31 2022-07-26 Apple Inc. Skin-to-skin contact detection
US11397468B2 (en) 2020-03-31 2022-07-26 Apple Inc. Skin-to-skin contact detection
US11625098B2 (en) 2020-03-31 2023-04-11 Apple Inc. Skin-to-skin contact detection
US11941175B2 (en) 2020-03-31 2024-03-26 Apple Inc. Skin-to-skin contact detection
US11460919B1 (en) * 2021-03-16 2022-10-04 Zeit Technologies Corporation Wearable gesture detector

Also Published As

Publication number Publication date
EP3100134A1 (en) 2016-12-07
US20210157414A1 (en) 2021-05-27
US11422635B2 (en) 2022-08-23
AU2018204039B2 (en) 2020-07-23
AU2020257155A1 (en) 2020-11-19
KR20200119912A (en) 2020-10-20
KR20180093113A (en) 2018-08-20
CN110045824A (en) 2019-07-23
CN106062666A (en) 2016-10-26
KR102310776B1 (en) 2021-10-07
JP2017510912A (en) 2017-04-13
AU2020257155B2 (en) 2022-06-02
KR20190105664A (en) 2019-09-17
KR20160117479A (en) 2016-10-10
EP3100134B1 (en) 2019-10-16
CN106062666B (en) 2019-03-15
KR102019969B1 (en) 2019-09-09
WO2015119637A1 (en) 2015-08-13
JP6427598B2 (en) 2018-11-21
AU2014381638B2 (en) 2018-03-08
CN110045824B (en) 2022-06-17
AU2018204039A1 (en) 2018-06-21
AU2014381638A1 (en) 2016-09-22

Similar Documents

Publication Publication Date Title
US11422635B2 (en) Optical sensing device
US11301048B2 (en) Wearable device for detecting light reflected from a user
US10241755B2 (en) Method and apparatus for physical exercise assistance
US20150309536A1 (en) Systems and methods for a wearable touch-sensitive device
US20150323998A1 (en) Enhanced user interface for a wearable electronic device
Kikuchi et al. EarTouch: turning the ear into an input surface
US20110133934A1 (en) Sensing Mechanical Energy to Appropriate the Body for Data Input
CN109144181A (en) Gestures detection is lifted in equipment
KR20170003662A (en) Hand-worn device for surface gesture input
US20150297140A1 (en) User stress detection and mitigation
Zhang et al. FinDroidHR: Smartwatch gesture input with optical heartrate monitor
Boldu et al. Thumb-In-Motion: Evaluating Thumb-to-Ring Microgestures for Athletic Activity
KR20170064364A (en) Device and method for using friction sound
Zhang et al. FingOrbits: interaction with wearables using synchronized thumb movements
JP6725913B2 (en) Motion gesture input detected using optical sensor
CN113467647A (en) Skin-to-skin contact detection
WO2022228212A1 (en) Gesture interaction method, system, and apparatus
US20230214064A1 (en) Adaptive user interfaces for wearable devices
Vega Gálvez μJawstures: jaw-teeth microgestures for discreet hands-and-eyes-free mobile device interaction

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION