WO2014066703A2 - Smart contextual display for a wearable device - Google Patents

Smart contextual display for a wearable device

Info

Publication number
WO2014066703A2
Authority
WO
WIPO (PCT)
Prior art keywords
device
display
context
motion data
modifying
Application number
PCT/US2013/066716
Other languages
French (fr)
Other versions
WO2014066703A3 (en)
Inventor
Christopher Verplaetse
Steven Patrick SZABADOS
Marco Kenneth DELLA TORRE
Original Assignee
Basis Science, Inc.
Priority to US201261717642P
Priority to US61/717,642
Priority to US201261727074P
Priority to US61/727,074
Application filed by Basis Science, Inc.
Priority claimed from US14/438,207 (US20150277572A1)
Publication of WO2014066703A2
Publication of WO2014066703A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G04 HOROLOGY
    • G04C ELECTROMECHANICAL CLOCKS OR WATCHES
    • G04C3/00 Electromechanical clocks or watches independent of other time-pieces and in which the movement is maintained by electric means
    • G04C3/001 Electromechanical switches for setting or display
    • G04C3/002 Position, e.g. inclination dependent switches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 – G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G04 HOROLOGY
    • G04G ELECTRONIC TIME-PIECES
    • G04G9/00 Visual time or date indication means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163 Indexing scheme relating to constructional details of the computer
    • G06F2200/1637 Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of a handheld computer

Abstract

A system and a method are disclosed for controlling a display on a device based on a context of the device. Motion data along two axes associated with the device is received, and the motion data indicates a context for the device, such as a position of a body part on which the device is worn. Additionally, time points associated with the motion data are received. The display is modified if the motion data along the two axes has values within predetermined ranges.

Description

SMART CONTEXTUAL DISPLAY FOR A WEARABLE DEVICE

INVENTORS:

CHRISTOPHER VERPLAETSE STEVEN PATRICK SZABADOS MARCO KENNETH DELLA TORRE

CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of U.S. Provisional Patent Applications Nos. 61/717,642 filed October 24, 2012 and 61/727,074 filed November 15, 2012 under 35 USC § 119(e), the contents of both of which are herein incorporated by reference.

BACKGROUND

1. FIELD OF ART

[0002] The disclosure generally relates to the field of modifying and controlling a display for a wearable device based on a context of the device.

2. DESCRIPTION OF THE RELATED ART

[0003] Wearable devices such as watches, music players and the like enable users to interact with technology in a convenient and continuous manner, since they can be present on the body in the context of all lifestyle activities. Devices now offer more and more functions. While additional functionality is supposed to add utility, it also means the user must interact with the device more to activate the various functions. Additionally, wearable devices are preferably small, and thus any controls are also small. These aspects result in devices that have less than the desired utility.

BRIEF DESCRIPTION OF DRAWINGS

[0004] The disclosed embodiments have other advantages and features which will be more readily apparent from the detailed description, the appended claims, and the accompanying figures (or drawings). A brief introduction of the figures is below.

[0005] Figure (FIG.) 1 illustrates one embodiment of a wearable device with a display.

[0006] FIG. 2 illustrates another view of an embodiment of a wearable device.

[0007] FIG. 3 illustrates a flow chart for activating a display according to one embodiment.

[0008] FIG. 4 illustrates a wearable device and axes around which motion is determined according to one embodiment.

DETAILED DESCRIPTION

[0009] The Figures (FIGS.) and the following description relate to preferred embodiments by way of illustration only. It should be noted that from the following discussion, alternative embodiments of the structures and methods disclosed herein will be readily recognized as viable alternatives that may be employed without departing from the principles of what is claimed.

[0010] Reference will now be made in detail to several embodiments, examples of which are illustrated in the accompanying figures. It is noted that wherever practicable similar or like reference numbers may be used in the figures and may indicate similar or like functionality. The figures depict embodiments of the disclosed system (or method) for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.

CONFIGURATION OVERVIEW

[0011] One embodiment of the disclosed device and method includes a motion detection system to identify the context of a device. The context of the device is a function of its motion and relative position as well as time. The context of the device can be a position of the device relative to the user or an activity in which the user is engaged. In one embodiment, the operation of a screen (or display) of the device is modified in response to the identified context. Modifying operation of the display includes, but is not limited to, activating or deactivating the display, increasing or decreasing a brightness of the display, turning on a backlight on the display, and providing information on the display. In one embodiment, the identified context is the device facing the user and in response to that context, a screen (or display) of the device is activated. Activating the display includes turning on the display or turning on a backlight to make the display more visible. In other embodiments, the context for the device results in specified content being provided on the display of the device.

EXAMPLE DEVICE CONFIGURATION

[0012] Referring now to FIG. 1, it illustrates an embodiment of a wearable device 100. The exemplary device 100 is worn on the wrist attached through a fastening system 101. The fastening system 101 may be removable, exchangeable or customizable. The device 100 includes a display 102 and user interaction points 103.

[0013] FIG. 2 illustrates another view of an embodiment of the wearable device 100. The view includes the fastening system 101 and the display 102. Additionally, the motion system 207 and processor 205 are shown. Example processors 205 include the TI MSP430 from TEXAS INSTRUMENTS and ARM Cortex-M class microcontrollers. The processor 205 receives data from the motion system 207 and determines when to activate the display 102. The motion system 207 is any kind of motion sensor. Example motion sensors include an accelerometer, a gyroscope, a pressure sensor, a compass and a magnetometer.

EXAMPLE PROCESSING CONFIGURATION

[0014] An example of the disclosed device and method is described in reference to FIGS. 3 and 4. In this example, the context identified is the device 100 being positioned to face the user, and in response the display 102 is activated or deactivated. Referring to FIG. 3, as the device 100 is worn, the processor 205 receives 309 data from the motion system 207 in three axes. The motion system 207 in this embodiment is an accelerometer. FIG. 4 illustrates another view of an embodiment of the device 100 and shows one example of the axes relative to which motion data and position data are determined by the motion system 207. The processor 205 compares 311 the data for each axis and determines when each axis is within a predetermined range. The predetermined ranges are indicative that the device has been turned to face the user. Responsive to the data indicating that the device has been turned to face the user, the processor activates 313 the display 102. The processor 205 determines the device has been turned to face the user based on data from one or more axes being within its predetermined range for a threshold period of time. In one embodiment, predetermined ranges for motion data are within an X-Y-Z Cartesian coordinate system for a user wearing a device 100 while in an upright position, for example:

X axis: +0.8g to 1g

Y axis: -0.2g to 0.2g

Z axis: -0.2g to 0.2g

In some embodiments, the processor activates the display when the data along one or more axes falls within its predetermined range for more than 300 milliseconds (ms). In other embodiments, the data along the one or more axes must remain in range for 250 or 500 ms. For various uses of the disclosed system, the optimal time period can be determined such that the display does not activate inadvertently and still turns on quickly enough to be responsive to the user's expectation.
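
The following is a minimal sketch, in C, of the dwell-time check described above. It is illustrative only: the sample structure, the exact range constants, the 300 ms dwell time and the millisecond tick source are assumptions made for the sketch rather than details taken from the disclosure.

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical accelerometer sample in units of g. */
    typedef struct {
        float x, y, z;
    } accel_sample_t;

    /* Predetermined ranges from the example above (user upright, device facing user). */
    #define X_MIN  0.8f
    #define X_MAX  1.0f
    #define Y_MIN -0.2f
    #define Y_MAX  0.2f
    #define Z_MIN -0.2f
    #define Z_MAX  0.2f

    #define DWELL_MS 300u   /* time all axes must stay in range before activating */

    static uint32_t in_range_since_ms;
    static bool     was_in_range;

    /* Called once per sample; returns true when the display should be activated. */
    bool should_activate_display(const accel_sample_t *s, uint32_t now_ms)
    {
        bool in_range = s->x >= X_MIN && s->x <= X_MAX &&
                        s->y >= Y_MIN && s->y <= Y_MAX &&
                        s->z >= Z_MIN && s->z <= Z_MAX;

        if (in_range && !was_in_range)
            in_range_since_ms = now_ms;   /* entered the range: start timing */
        was_in_range = in_range;

        return in_range && (now_ms - in_range_since_ms) >= DWELL_MS;
    }

In a firmware build, should_activate_display() would be called from the accelerometer's sample-ready handler, with now_ms supplied by a system tick counter; the deactivation check of paragraph [0015] below can reuse the same structure, timing how long the data stays outside the range.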

[0015] The processor 205 continues to monitor motion and relative position data and deactivates the display 102 when the motion and relative position data fall outside the range that triggered activation. In some embodiments, the processor 205 deactivates the display after data from the motion system 207 for just one of the axes is no longer in the threshold range. The processor 205 can deactivate the display 102 instantaneously or after the data for the one axis remains outside the threshold for 500 ms.

[0016] In another embodiment, processor 205 continuously monitors the motion system 207 and activates the display 102 when there is a rotational motion of the device 100 with a principal rotational axis coincident with that of the user's wrist. If the user is upright, the wrist is along the Y-axis (referring to FIG. 4), and rotation in the negative Y direction yielding an increasing X acceleration due to increased coincidence with gravity indicates that display 102 is being rotated toward the user. For example, a change in the positive X acceleration of +0.25g over the course of a predetermined time period, e.g., 1 second, may indicate such a rotation. When this condition is met within a certain predefined timeframe, and when rotation stops with the device oriented with the display 102 facing the user (defined by the X/Y plane as in the example above), the processor 205 activates display 102. The processor 205 applies timing windows and orientation range limits to protect against activation of the backlight during similar gesticulations such as running or showering.
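
As a rough illustration of the rotation check just described, the sketch below watches for a +0.25g rise in X-axis acceleration within a one-second window. The 50 Hz sampling rate, the ring-buffer approach and the function name are assumptions made for the sketch.

    #include <stdbool.h>
    #include <stdint.h>

    #define SAMPLE_HZ  50          /* assumed accelerometer sampling rate */
    #define WINDOW_N   SAMPLE_HZ   /* one second of samples */
    #define DELTA_G    0.25f       /* X-axis rise suggesting rotation toward the user */

    static float   x_hist[WINDOW_N];
    static uint8_t head;
    static uint8_t filled;

    /* Push one X-axis sample (in g); returns true if X acceleration has risen
     * by at least DELTA_G somewhere within the last second. */
    bool wrist_rotation_detected(float x_now)
    {
        x_hist[head] = x_now;
        head = (head + 1u) % WINDOW_N;
        if (filled < WINDOW_N) {   /* wait until the window is full */
            filled++;
            return false;
        }

        float x_min = x_now;
        for (int i = 0; i < WINDOW_N; i++)
            if (x_hist[i] < x_min)
                x_min = x_hist[i];

        return (x_now - x_min) >= DELTA_G;
    }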

[0017] In another embodiment, the processor 205 activates the display 102 when there is complex rotational motion of the device 100 with multiple axes of rotation. The first rotational component corresponds to rotation of the user's forearm about their elbow (primarily observed as a rotation about the Z axis), indicating that the user's forearm is being swung or lifted up toward the user's face. A second rotational component corresponds to rotation of the user's wrist (see the rotation about the Y-axis, as in the example above and in FIG. 4) in the negative Y direction, indicating that display 102 is being rotated toward the user. For example, in order to capture a user bringing the device 100 from their side into viewing position, the following operations may be detected:

1. An initial position determined by the gravity vector being coincident with the negative Y axis.

2. As the hand is brought up from the user's side, a rotation in the negative Z axis is detected as the gravity vector is observed to move from the negative Y direction to the negative X direction.

3. As the user rotates the forearm to view the display 102, a complex rotation is observed with a negative rotation in the Y axis.

4. The device 100 is then observed to remain in the final orientation observed at the end of the previous step for a period of time, indicating that the user is viewing the device 100 and the display 102 should be activated. This period of time could be 300ms, as in the above example. Similarly, the device could disable the display 102 with a change in orientation, a rotation out of this position or a timeout. This timeout could also be 500ms, as in the example above.
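
One way to realize the sequence above is a small state machine driven by the gravity vector in device coordinates. The sketch below is a simplified, hypothetical rendering of operations 1-4, in which the ±0.8g thresholds and the 300 ms hold time are assumed values.

    #include <stdbool.h>
    #include <stdint.h>

    /* States of the hypothetical lift-and-rotate detector. */
    typedef enum { AT_SIDE, LIFTING, ROTATED, VIEWING } lift_state_t;

    #define HOLD_MS 300u   /* final orientation must be held this long */

    static lift_state_t state = AT_SIDE;
    static uint32_t     hold_since_ms;

    /* gx/gy/gz are gravity-vector components in device coordinates (units of g).
     * Returns true once the gesture completes and the display should turn on. */
    bool lift_gesture_step(float gx, float gy, float gz, uint32_t now_ms)
    {
        (void)gz;   /* unused in this simplified sketch */

        switch (state) {
        case AT_SIDE:   /* 1. arm at the side: gravity along negative Y */
            if (gy < -0.8f) state = LIFTING;
            break;
        case LIFTING:   /* 2. forearm swings up: gravity moves to negative X */
            if (gx < -0.8f) state = ROTATED;
            break;
        case ROTATED:   /* 3. wrist rotates until the display faces the user */
            if (gx > 0.8f) { state = VIEWING; hold_since_ms = now_ms; }
            break;
        case VIEWING:   /* 4. orientation held long enough: activate */
            if (gx > 0.8f && (now_ms - hold_since_ms) >= HOLD_MS)
                return true;
            if (gx <= 0.8f) state = AT_SIDE;   /* orientation lost: reset */
            break;
        }
        return false;
    }

A timeout per state (not shown) would implement the predefined timeframe mentioned above, resetting the machine to AT_SIDE if a transition takes too long.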

[0018] In yet another embodiment, the processor 205 applies an algorithm to the orientation of the gravity vector, relative to the device's reference frame. The output of the algorithm identifies complex rotations that indicate a change in user context. For example, (referring to FIG. 4) when device 100 rotation causes the gravity vector to traverse the X/Y plane and indicate rotation in the negative Y direction, with rotation stopping at a device orientation indicating user viewing, and when these conditions are met within a certain predefined timeframe, the processor 205 will activate the display 102. In this embodiment, a window of rotational components may also be defined, whereby the timeframe to execute the rotations is 1 second. The algorithm uses timing windows and orientation range limits to protect against the activation of the display 102 during similar gesticulations such as running or showering.

[0019] In some embodiments the display 102 cannot be activated for a predetermined amount of time after it has been deactivated. Optionally, reactivation is blocked only after a threshold number of activations and deactivations within a given time period. This further protects against inadvertent activation of the display 102. For example, if the display has been activated and deactivated a number of times in a minute, it is likely that the activation is in error and thus it is beneficial to prevent the display 102 from reactivating for a period of time such as 5 seconds or 10 seconds.
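
A minimal sketch of such a reactivation lockout follows. The six-toggle limit, one-minute window and ten-second lockout are example values consistent with the paragraph above; the function name and ring-buffer bookkeeping are illustrative.

    #include <stdbool.h>
    #include <stdint.h>

    #define TOGGLE_LIMIT  6        /* activations per window treated as erroneous */
    #define WINDOW_MS     60000u   /* one minute */
    #define LOCKOUT_MS    10000u   /* block reactivation for 10 seconds */

    static uint32_t toggle_times_ms[TOGGLE_LIMIT];
    static uint8_t  toggle_idx;
    static uint32_t lockout_until_ms;

    /* Call on each activation attempt; false means the activation is blocked. */
    bool activation_allowed(uint32_t now_ms)
    {
        if (now_ms < lockout_until_ms)
            return false;                        /* still locked out */

        uint32_t oldest = toggle_times_ms[toggle_idx];
        toggle_times_ms[toggle_idx] = now_ms;    /* ring buffer of recent toggles */
        toggle_idx = (toggle_idx + 1u) % TOGGLE_LIMIT;

        /* TOGGLE_LIMIT activations within a minute: likely spurious, lock out. */
        if (oldest != 0u && (now_ms - oldest) < WINDOW_MS) {
            lockout_until_ms = now_ms + LOCKOUT_MS;
            return false;
        }
        return true;
    }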

ADDITIONAL EXEMPLARY PROCESS CONFIGURATIONS

[0020] In addition to activating the display in response to the device 100 being positioned to face the user, the processor 205 can provide specified content for display on display 102 based on an identified context of the device 100. The identification of context includes learning from the user's interactions with the device 100. For example, if a user accesses data from the device 100 around the same time every morning, the processor 205 can display the usual data in response to the device 100 facing the user at that time. If the device 100 is turned to face the user at another time of day, the processor 205 merely activates the display 102 but does not provide any particular content.
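
A simple, hypothetical way to learn such a habit is to count the user's views per hour of day and surface the habitual content only once the count for the current hour crosses a threshold; the hour-bucket granularity and the threshold below are assumptions.

    #include <stdbool.h>
    #include <stdint.h>

    #define HABIT_THRESHOLD 5   /* views in the same hour before it is a habit */

    static uint16_t views_per_hour[24];   /* per-hour access counts */

    /* Record that the user viewed their usual data at this hour of day. */
    void record_view(uint8_t hour)
    {
        if (hour < 24u)
            views_per_hour[hour]++;
    }

    /* When the device turns to face the user, decide whether to show the
     * habitual content for this hour or merely activate the display. */
    bool show_habitual_content(uint8_t hour)
    {
        return hour < 24u && views_per_hour[hour] >= HABIT_THRESHOLD;
    }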

[0021] Aside from activating a backlight, the device 100 may use context information to modify other display parameters, including the contrast, brightness, orientation or displayed content. The device may also use other feedback, such as an audio cue or physical feedback such as vibration.

[0022] A context may also be used to trigger input. For example, rather than (or in addition to) activating a backlight, the device 100 may activate other functions such as a microphone to enable voice recording, a speaker to activate sound, a telephony or voice-activated feature, or the connection to a wireless network to synchronize its data or new display content. Example wireless networks include a LAN, MAN, WAN, mobile network, and a telecommunication network.

[0023] In yet another embodiment, the device 100 communicates via the wireless network to modify operation of a display on a remote device in the same way that operation of a display on the device 100 is modified. Operation of the remote display may be modified in addition to or in place of modifying operation of the display 102 on the device 100.

[0024] A combination of contexts may be detected in sequence to create additional contextual information. For example, if the motion system 207 detects that a user is stepping (e.g., walking, running), and the device 100 is facing the user, a recent step count may be displayed as well as the backlight being activated. This is an example of two detected contexts providing additional opportunity for customization of the user experience based on more than one detected context in parallel or in sequence. Another example would be using a detection of sleep to disable the automatic backlight. For example, if the motion system 207 detects a period of low motion, the device 100 may require a period of high motion before the automatic backlight would again be activated when the device is positioned to face the user. This is advantageous because the device 100 may be turned to face the user during sleep; if the display 102 were activated, it could wake the user.

[0025] An additional identified context may represent social gestures such as a handshake, "fist bump", or "high five". Since such an action has a particular orientation and acceleration pattern associated with it, the context of this social gesture may be detected and used to modify the nature of the data displayed or stored around the event. For example, a precise timestamp, location, data signature or other information may be saved as a result of this event. The information could also be compared to other users' saved data as a means of identifying social gestures between users. For example, if another user from a similar location had engaged in a similar gesture at the same time, these two users could be linked on a network, share some information or have the event recorded on the device 100 for subsequent processing.
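
A minimal sketch of the sleep gating described in paragraph [0024] above follows; the motion thresholds and the 20-minute quiet period are illustrative assumptions rather than values from the disclosure.

    #include <stdbool.h>
    #include <stdint.h>

    #define LOW_MOTION_G   0.05f                 /* activity level treated as sleep  */
    #define HIGH_MOTION_G  0.30f                 /* activity level treated as waking */
    #define SLEEP_MS       (20u * 60u * 1000u)   /* 20 min of low motion => asleep   */

    static bool     asleep;
    static bool     low_active;
    static uint32_t low_since_ms;

    /* Feed an activity magnitude (g, gravity removed); returns true while the
     * automatic backlight should be suppressed because the user is asleep. */
    bool backlight_suppressed(float activity_g, uint32_t now_ms)
    {
        if (activity_g < LOW_MOTION_G) {
            if (!low_active) { low_active = true; low_since_ms = now_ms; }
            if (now_ms - low_since_ms >= SLEEP_MS)
                asleep = true;                   /* long quiet period: sleeping */
        } else {
            low_active = false;
            if (activity_g > HIGH_MOTION_G)
                asleep = false;                  /* burst of motion: awake again */
        }
        return asleep;
    }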

[0026] Another example of multiple contexts being used to generate behavior or user information would be the automatic detection of which hand the device 100 is being worn on. By using the accelerometer signals to detect step events, as well as orientation, the device 100 can infer where on the body or in which orientation the device 100 is being worn. For example, if the device 100 is being worn on the wrist, a first context of slowly stepping and a second context of the orientation of the device 100 could be used to determine which wrist the device 100 is being worn on.

[0027] The device 100 may incorporate other sensors and use them to improve the mechanisms described above. These may provide additional contextual information, or provide information for display based on contextual conditions. Examples of each of these sensor types are summarized in Table 1 below.

Table 1: Example sensors

Non-invasive, internal physiological parameter sensing: techniques such as optical, ultrasound, laser, conductance, capacitance.

Thermal: skin temperature, ambient temperature, core temperature.

Skin surface sensors: galvanic skin response, electrodermal activity, perspiration, sweat constituent analysis (cortisol, alcohol, adrenalin, glucose, urea, ammonia, lactate).

Environmental: ultraviolet light, visible light, moisture/humidity, air content (pollen, dust, allergens), air quality.

MOTION

[0028] Motion sensing can provide additional contextual information via the detection and recognition of signature motion environments, actions, and/or contexts such as: in-car, in-airplane, walking, running, swimming, exercising, brushing teeth, washing a car, eating, drinking, shaking hands, arm wrestling. The algorithms outlined above, whereby an accelerometer is used to detect the gesture corresponding to the wearer looking at the device 100, are another example. Other gestures such as a fist-bump, handshake, wave, or signature are further examples. Motion and relative position analytics may also be calculated for display, such as step count, activity type, context-specific analysis of recent activity, and summaries for the day, week, month or other time period to date.

[0029] In some embodiments, multiple motion sensors can be used. For example, both an accelerometer and a gyroscope could be used. Additional types of motion sensors provide more detailed inputs, allowing for determination of additional contexts of the device 100.

[0030] In some embodiments, a detected context is an activity in which a user is engaged. An example activity is exercise. Exercises that involve repeated motion are detected by the processor 205 by identifying, from motion data received from the motion system 207, repeated motions within a predetermined time period. Lifting weights and jumping jacks are examples of exercises that involve repeating the same motion.
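
As a rough sketch of such repeated-motion detection, the code below counts debounced peaks in acceleration magnitude and declares an exercise context once enough repetitions occur within a predetermined period; every threshold here is an assumed value.

    #include <stdbool.h>
    #include <stdint.h>

    #define REP_THRESHOLD_G  1.3f     /* magnitude peak counted as one repetition */
    #define MIN_GAP_MS       400u     /* debounce between repetitions */
    #define REPS_REQUIRED    5u       /* repetitions before "exercise" is declared */
    #define WINDOW_MS        10000u   /* ... within this predetermined period */

    static uint32_t last_peak_ms, first_peak_ms;
    static uint8_t  rep_count;

    /* Feed the acceleration magnitude (in g); returns true once repeated motion
     * such as lifting weights or jumping jacks is detected. */
    bool repeated_motion_detected(float magnitude_g, uint32_t now_ms)
    {
        if (magnitude_g > REP_THRESHOLD_G && (now_ms - last_peak_ms) > MIN_GAP_MS) {
            if (rep_count == 0u || (now_ms - first_peak_ms) > WINDOW_MS) {
                rep_count = 0u;              /* window expired: start over */
                first_peak_ms = now_ms;
            }
            rep_count++;
            last_peak_ms = now_ms;
        }
        return rep_count >= REPS_REQUIRED;
    }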

NON-INVASIVE, INTERNAL PHYSIOLOGICAL PARAMETER SENSING

[0031] The device 100 may contain non-invasive, internal physiological parameter sensing such as the detection of blood flow, respiration or other parameters. These could be used to detect context, or as inputs to the display triggered by a context detected. For example, an accelerometer could detect the context of running, followed by the context of looking at the device 100, and the device 100 could display both a detected distance for the run and the maximum heart rate. Similarly, the device 100 could detect the blood alcohol level of the wearer and, if above a threshold, this context could trigger a visual alert to the user.

THERMAL

[0032] Thermal sensors could be used to detect body, skin and ambient temperature for the purpose of context detection or display parameters. For example, a sensor able to detect skin temperature could use this information to infer the context of the user exerting himself physically and change the display to reflect parameters relevant to physical effort. Thermal information may also be relevant to display based on other contexts. For example, if the wearer is sweating due to physical exertion, the difference between environmental temperature and skin temperature could provide a parameter to inform the time to recovery from the physical exertion. In this case the context could be detected by an accelerometer, but the displayed metric would be derived (at least in part) from thermal sensors.

[0033] In another embodiment a change in temperature identifies the context that the wearer of the device has changed locations. Depending on the weather, there is a temperature difference between the inside and outside of buildings, vehicles, etc. For example, a drop in ambient temperature from 90 degrees F to 75 degrees F indicates the user has come in from outside into a building.
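
A hypothetical detector for such a transition could simply watch for a large swing in smoothed ambient temperature, as sketched below; the 10-degree threshold and the baseline handling are assumptions.

    #include <stdbool.h>

    #define TRANSITION_DELTA_F 10.0f   /* ambient swing suggesting a location change */

    static float last_stable_f;
    static bool  initialized;

    /* Feed a smoothed ambient temperature (degrees F); returns true when a swing
     * large enough to indicate an indoor/outdoor transition is observed. */
    bool location_change_detected(float ambient_f)
    {
        if (!initialized) {
            last_stable_f = ambient_f;     /* first reading seeds the baseline */
            initialized = true;
            return false;
        }
        if (ambient_f > last_stable_f + TRANSITION_DELTA_F ||
            ambient_f < last_stable_f - TRANSITION_DELTA_F) {
            last_stable_f = ambient_f;     /* adopt the new environment's baseline */
            return true;
        }
        return false;
    }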

SKIN SURFACE SENSORS

[0034] Skin surface sensors detect parameters by measuring non-invasively from the skin surface. For example, a galvanic skin response or electrodermal activity sensor may be used to detect small changes in skin surface conductivity and thereby detect events such as emotional arousal. This can be used as context for an adaptive display modification, such as triggering the backlight or vibrating the device 100 to alert the user to the stress event and help them take action to mitigate it. For example, the device 100 could vibrate more strongly if a stress event was more pronounced. A perspiration sensor could also be used to generate a display of workout parameters, such as intensity, if the context of physical exertion was detected. This context could be detected by the perspiration sensor itself, or by another sensor such as the accelerometer.
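
The sketch below gives one hedged reading of this idea: track a slow baseline of skin conductance and map the rise above that baseline to a vibration intensity, so a more pronounced stress event vibrates more strongly. The filter constant, stress threshold and output scaling are all assumptions.

    #include <stdint.h>

    #define BASELINE_ALPHA 0.01f   /* slow EMA tracking resting skin conductance */
    #define STRESS_DELTA   0.15f   /* rise above baseline treated as a stress event */

    static float gsr_baseline;

    /* Feed a skin-conductance sample; returns a vibration intensity 0-255,
     * stronger for more pronounced stress events, 0 when there is no event. */
    uint8_t stress_vibration_level(float gsr)
    {
        if (gsr_baseline == 0.0f)
            gsr_baseline = gsr;                       /* first sample seeds baseline */
        gsr_baseline += BASELINE_ALPHA * (gsr - gsr_baseline);

        float delta = gsr - gsr_baseline;
        if (delta < STRESS_DELTA)
            return 0u;                                /* below threshold: no event */

        float level = (delta - STRESS_DELTA) * 1000.0f;   /* scaling is illustrative */
        if (level > 255.0f)
            level = 255.0f;
        return (uint8_t)level;
    }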

ENVIRONMENTAL SENSORS

[0035] Sensors that detect signals pertaining to the environment around the wearer provide both a unique reference point for signals sourced from the wearer himself, as well as additional context to inform adaptive processing and display. For example, a device 100 that included an ultraviolet light sensor could modify the display when the user had received the recommended maximum exposure to ultraviolet light for the day. A user whose context has been determined to be sweating via a skin surface perspiration sensor may be exposed to a display of the humidity in the air around them as a result of this contextual awareness.

[0036] Environmental sensors can also be used to identify a change from an indoor to an outdoor context (or vice versa). An ambient light sensor can sense the change from generally lower light indoors to generally brighter light outdoors. Depending on the weather, a humidity sensor can also identify this context. Heating and cooling a building often results in lower humidity than outdoors. Thus an increase or decrease in humidity is indicative of a change in context from indoors to outdoors or the reverse.

ADDITIONAL CONSIDERATIONS

[0037] The disclosed embodiments beneficially allow for making a device more intuitive and therefore more useful. The more a device provides desired information without being specifically requested to do so, the more useful it is. Activating the display only when it is needed, and without explicit instruction from the user, also provides additional security, as the display may be showing health-related information such as stress levels. Additionally, power is saved by activating the display only when it is needed.

[0038] Some portions of the above description describe the embodiments in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.

[0039] As used herein any reference to "one embodiment" or "an embodiment" means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.

[0040] Some embodiments may be described using the expression "coupled" and "connected" along with their derivatives. It should be understood that these terms are not intended as synonyms for each other. For example, some embodiments may be described using the term "connected" to indicate that two or more elements are in direct physical or electrical contact with each other. In another example, some embodiments may be described using the term "coupled" to indicate that two or more elements are in direct physical or electrical contact. The term "coupled," however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The embodiments are not limited in this context.

[0041] As used herein, the terms "comprises," "comprising," "includes," "including," "has," "having" or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, "or" refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).

[0042] In addition, use of the "a" or "an" are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the invention. This description should be read to include one or at least one and the singular also includes the plural unless it is obvious that it is meant otherwise.

[0043] Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs for a system and a process for modifying operation of a device in response to an identified context through the disclosed principles herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A method for controlling a display on a device, the method comprising:
receiving motion data along two axes associated with the device, the data associated with a plurality of time points;
determining a context of the device based on the motion data and the plurality of time points; and
modifying operation of the display in response to the determined context.
2. The method of claim 1 further comprising receiving motion data along a third axis associated with the device and associated with a plurality of time points and wherein determining a context of the device is based on the motion data along each of the three axes and associated pluralities of time points.
3. The method of any previous claim wherein the context comprises a position of a body part on which the device is worn.
4. The method of any one of claims 1 and 2 wherein the context comprises a motion of the body part on which the device is worn.
5. The method of any previous claim, wherein modifying operation of the display comprises activating the display, and further comprising:
receiving a second set of motion data corresponding to each of the two axes
associated with the device;
determining a second context for the device based on the second set of motion data; and deactivating the display in response to one axis of the second set of motion data having a value outside a second predetermined range of values.
6. The method of any previous claim wherein modifying the display comprises turning on a backlight.
7. The method of any one of claims 1-6 wherein modifying the display comprises
modifying a brightness of the display.
8. The method of any one of claims 1 and 2 wherein the context is an activity of the user wearing the device.
9. The method of claim 8 wherein modifying the display comprises providing
information relevant to the activity.
10. The method of any one of claims 1-7 wherein modifying the display comprises
providing information for display.
11. A system for controlling a display on a device, the system comprising a processor configured to:
receive motion data along two axes associated with the device, the data associated with a plurality of time points;
determine a context of the device based on the motion data and the plurality of time points; and
modify operation of the display in response to the determined context.
12. The system of claim 11 wherein the processor is further configured to receive motion data along a third axis associated with the device and associated with a plurality of time points and wherein determining a context of the device is based on the motion data along each of the three axes and associated pluralities of time points.
13. The system of claim 11 or claim 12 wherein the context is a position of a body part on which the device is worn.
14. The system of claim 11 or claim 12 wherein the context comprises a motion of the body part on which the device is worn.
15. The system of any one of claims 11-14 wherein modifying operation of the display comprises activating the display and the processor is further configured to:
receive a second set of motion data corresponding to each of the two axes associated with the device;
determine a second context for the device based on the second set of motion data; and deactivate the display in response to one axis of the second set of motion data having a value outside a second predetermined range of values.
16. The system of any one of claims 11-15 wherein modifying the display comprises turning on a backlight.
17. The system of any one of claims 11-16 wherein modifying the display comprises modifying a brightness of the display.
18. The system of claim 11 or claim 12 wherein the context is an activity of the user wearing the device.
19. The system of claim 18 wherein modifying the display comprises providing
information relevant to the activity.
20. The system of any one of claims 11-17 wherein modifying the display comprises providing information for display.
21. A computer readable medium configured to store instructions, the instructions when executed by a processor cause the processor to: receive motion data along two axes associated with the device, the data associated with a plurality of time points;
determine a context of the device based on the motion data and the plurality of time points; and
modify operation of the display in response to the determined context.
22. The computer readable medium of claim 21 further comprising instructions that cause the processor to receive motion data along a third axis associated with the device and associated with a plurality of time points and wherein determining a context of the device is based on the motion data along each of the three axes and associated pluralities of time points.
23. The computer readable medium of claim 21 or claim 22 wherein the context is a position of a body part on which the device is worn.
24. The computer readable medium of claim 21 or claim 22 wherein the context
comprises a motion of the body part on which the device is worn.
25. The computer readable medium of any one of claims 21-24 wherein modifying
operation of the display comprises activating the display and further comprising instructions that cause the processor to:
receive a second set of motion data corresponding to each of the two axes associated with the device;
determine a second context for the device based on the second set of motion data; and deactivate the display in response to one axis of the second set of motion data having a value outside a second predetermined range of values.
26. The computer readable medium of any one of claims 21-25 wherein modifying the display comprises turning on a backlight.
27. The computer readable medium of any one of claims 21-26 wherein modifying the display comprises modifying a brightness of the display.
28. The computer readable medium of claim 21 or claim 22 wherein the context is an activity of the user wearing the device.
29. The computer readable medium of claim 28 wherein modifying the display comprises providing information relevant to the activity.
30. The computer readable medium of any one of claims 21-27 wherein modifying the display comprises providing information for display.

Priority Applications (4)

Application Number Priority Date Filing Date Title
US201261717642P 2012-10-24 2012-10-24
US61/717,642 2012-10-24
US201261727074P 2012-11-15 2012-11-15
US61/727,074 2012-11-15

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US14/438,207 US20150277572A1 (en) 2012-10-24 2013-10-24 Smart contextual display for a wearable device

Publications (2)

Publication Number Publication Date
WO2014066703A2 true WO2014066703A2 (en) 2014-05-01
WO2014066703A3 WO2014066703A3 (en) 2014-06-19

Family

ID=50545490

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/066716 WO2014066703A2 (en) 2012-10-24 2013-10-24 Smart contextual display for a wearable device

Country Status (1)

Country Link
WO (1) WO2014066703A2 (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100328201A1 (en) * 2004-03-23 2010-12-30 Fujitsu Limited Gesture Based User Interface Supporting Preexisting Symbols
US20100214216A1 (en) * 2007-01-05 2010-08-26 Invensense, Inc. Motion sensing and processing on mobile devices
US20110205156A1 (en) * 2008-09-25 2011-08-25 Movea S.A Command by gesture interface
US20110224564A1 (en) * 2010-03-10 2011-09-15 Sotera Wireless, Inc. Body-worn vital sign monitor
US20110304531A1 (en) * 2010-06-10 2011-12-15 Peter Brooks Method and system for interfacing and interaction with location-aware devices

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016097381A1 (en) * 2014-12-19 2016-06-23 Koninklijke Philips N.V. Dynamic wearable device behavior based on schedule detection
US20180000414A1 (en) * 2014-12-19 2018-01-04 Koninklijke Philips N.V. Dynamic wearable device behavior based on schedule detection
US10088866B2 (en) 2015-03-18 2018-10-02 Motorola Mobility Llc Controlling the orientation of a device display based on usage context
WO2018111614A1 (en) * 2016-12-15 2018-06-21 Holor, Llc Wearable multi-functional personal security device

Also Published As

Publication number Publication date
WO2014066703A3 (en) 2014-06-19


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13848646

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 14438207

Country of ref document: US


122 Ep: pct application non-entry in european phase

Ref document number: 13848646

Country of ref document: EP

Kind code of ref document: A2