WO2015134128A1 - Detecting imminent use of a device - Google Patents
- Publication number
- WO2015134128A1 (application PCT/US2015/013116)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- inference
- imminent use
- imminent
- logic configured
- period
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J1/00—Photometry, e.g. photographic exposure meter
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P15/00—Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration
- G01P15/14—Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration by making use of gyroscopes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72454—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Definitions
- the present disclosure relates to the field of wireless communications, human-computer interaction and mobile user experience design.
- the present disclosure relates to apparatuses and methods of detecting imminent use of a device.
- a device can be configured to consume sensor data, such as accelerometer data, or other available information obtained from low power sources. From the sensor data or other available information, the device can be configured to determine an inference of imminent use. Based on the inference of imminent use, the device can be configured to provide information for power management applications or situation aware applications, and/or other applications, according to some implementations of the disclosure.
- a method of detecting imminent use of a device may comprise receiving sensor data by one or more sensors of the device, and determining an inference of imminent use of the device based at least in part on the sensor data. Receiving sensor data may comprise receiving measurements collected by one or more accelerometers over a period of time in one or more axes.
- the method of determining the inference of imminent use may comprise detecting one or more reference motions associated with the inference of imminent use, where one or more reference motions associated with the inference of imminent use comprise at least one of a first motion that indicates the device being picked up from a supporting surface, a second motion that indicates the device being pulled out of a holder, or a third motion that indicates the device being picked up from an idle state.
- the method of determining the inference of imminent use may comprise detecting one or more user-specific actions associated with the inference of imminent use, where the one or more user-specific actions associated with the inference of imminent use comprise at least one of a first action that indicates a user is left-handed, or a second action that indicates the user is right-handed.
- the method of determining the inference of imminent use may comprise detecting one or more contextual triggers associated with the inference of imminent use, where the one or more contextual triggers associated with the inference of imminent use comprise at least one of a first trigger that causes the device to vibrate, a second trigger that causes the device to ring, a third trigger that causes the device to flash a light emitting diode, or a fourth trigger that causes the device to generate an alert message.
- the method of determining the inference of imminent use may comprise collecting contextual data related to a history of use of the device, and determining the inference of imminent use based at least in part on the contextual data.
- a device may comprise one or more sensors configured to receive sensor data, a non-transitory memory configured to store the sensor data, and a controller including one or more processors and an imminent use detector, where the one or more processors and the imminent use detector comprise logic configured to determine an inference of imminent use of the device based at least in part on the sensor data.
- a computer program product may comprise non-transitory medium storing instructions for execution by one or more computer systems.
- the instructions may comprise instructions for receiving sensor data by one or more sensors of the device, and instructions for determining an inference of imminent use of the device based at least in part on the sensor data.
- an apparatus may comprise means for receiving sensor data, and means for determining an inference of imminent use of the device based at least in part on the sensor data.
- FIG. 1 illustrates an exemplary flow chart of detecting imminent use of a device according to some aspects of the present disclosure.
- FIG. 2A illustrates a group of exemplary sensor observations for pick-up detection of a mobile device according to some aspects of the present disclosure.
- FIG. 2B illustrates another group of exemplary sensor observations for pick-up detection of a mobile device according to some aspects of the present disclosure.
- FIG. 2C illustrates yet another group of exemplary sensor observations for pick-up detection of a mobile device according to some aspects of the present disclosure.
- FIG. 2D illustrates yet another group of exemplary sensor observations for pick-up detection of a mobile device according to some aspects of the present disclosure.
- FIG. 3 illustrates an exemplary method of detecting face position of a mobile device according to some aspects of the present disclosure.
- FIG. 4A illustrates a group of exemplary sensor observations for stabilization detection of a mobile device according to some aspects of the present disclosure.
- FIG. 4B illustrates another group of exemplary sensor observations for angle stabilization detection of a mobile device according to some aspects of the present disclosure.
- FIG. 5A illustrates an exemplary embodiment of three axes angle stabilization of a mobile device according to some aspects of the present disclosure.
- FIG. 5B illustrates another exemplary embodiment of three axes angle stabilization of a mobile device according to some aspects of the present disclosure.
- FIG. 6 illustrates an exemplary block diagram of a mobile device according to some aspects of the present disclosure.
- FIG. 7A illustrates an exemplary application environment of an imminent use detector according to some aspects of the present disclosure.
- FIG. 7B illustrates another exemplary application environment of an imminent use detector according to some aspects of the present disclosure.
- FIG. 7C illustrates an exemplary application upon a determination of an inference of imminent use of a device according to some aspects of the present disclosure.
- FIG. 8A illustrates an exemplary flow chart of detecting imminent use of a device according to some aspects of the present disclosure.
- FIG. 8B illustrates an exemplary implementation of receiving sensor data of FIG. 8A according to some aspects of the present disclosure.
- FIG. 8C illustrates exemplary implementations of determining an inference of imminent use of the device of FIG. 8A according to some aspects of the present disclosure.
- FIG. 1 illustrates an exemplary flow chart of detecting imminent use of a device according to some aspects of the present disclosure.
- the method of detecting imminent use of a device as performed by an imminent use detector may include the functions performed in blocks 101 to 107 of FIG. 1.
- the method receives sensor data from one or more sensors.
- the sensor data may be collected by one or more accelerometers, one or more proximity sensors, one or more ambient light sensors, or other types of sensors.
- the method may determine an initial location of a mobile device 102.
- Examples of the initial location of the mobile device 102 include on a desk (face up or face down), in a pocket, in a bag, being held in a hand, or other possible initial locations. If the initial location of the mobile device 102 is on a desk, the method moves to block 103; if the initial location is in a pocket or a bag, the method moves to block 105; and if the initial location is neither on a desk nor in a pocket or a bag, the method stays in block 101.
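The branching among blocks 101, 103, and 105 described above can be expressed as a simple dispatch. The following is a minimal Python sketch; the location labels and the function name are illustrative, not taken from the disclosure:

```python
# Hypothetical sketch of the FIG. 1 dispatch: map an inferred initial
# location of the device to the next block of the flow chart.
# Location labels are illustrative, not defined by the disclosure.
def next_block(initial_location: str) -> int:
    if initial_location == "on_desk":
        return 103  # check for pick-up from a supporting surface
    if initial_location in ("in_pocket", "in_bag"):
        return 105  # check for pick-up from a holder
    return 101  # unknown or held in hand: keep monitoring sensor data
```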
- one approach to determine whether the initial location of a mobile device 102 is on a desk and facing up is to examine the angle 109 between the accelerometer z-axis vector 111 and the gravity vector 113.
- the mobile device 102 may be considered to be placed on a desk (and face up) if the angle is smaller than a predetermined value (e.g. 5 degrees) for at least a predetermined period of time, such as at least 4 seconds.
- the mobile device 102 may be considered to be on a desk (and face down), in a pocket, or in a bag if proximity has been detected for a predetermined period of time, for example for 4 seconds.
- accelerometer information may be used to disambiguate between whether the mobile device 102 may be placed face down on a desk, placed in a pocket, or placed in a bag.
- the angle 109 between the accelerometer z-axis vector 111 of the mobile device 102 and gravity vector 113 can be computed. This angle may be about 180 degrees (e.g. with a tolerance of 5 degrees or less) if the mobile device 102 is being placed face down on a desk. On the other hand, this angle may be fluctuating or may not meet the above condition if the mobile device 102 is being placed in a pocket or being placed in a bag.
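The face-up/face-down test above can be computed directly from a raw accelerometer sample. The following is a minimal Python sketch, assuming the common convention that a face-up device reads roughly +9.81 m/s² on its z-axis (sign conventions vary by platform), with the 5-degree tolerance and 4-second dwell taken from the example values in the text:

```python
import math

def angle_to_gravity_deg(accel):
    """Angle in degrees between the device z-axis and the measured
    gravity direction, from one accelerometer sample (x, y, z).
    Near 0 degrees suggests face up on a flat surface; near 180
    degrees suggests face down."""
    x, y, z = accel
    norm = math.sqrt(x * x + y * y + z * z)
    cos_theta = max(-1.0, min(1.0, z / norm))  # clamp for float safety
    return math.degrees(math.acos(cos_theta))

def on_desk_face_up(window, tol_deg=5.0):
    """True if every sample in the window (covering the dwell time,
    e.g. 4 seconds of samples) stays within tol_deg of 0 degrees."""
    return bool(window) and all(
        angle_to_gravity_deg(s) < tol_deg for s in window)
```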
- the method may determine whether the mobile device 102 has been picked up from a supporting surface (e.g. a desk).
- the method of pick-up detection may take into consideration a combination of accelerometer and proximity sensor data to predict a pick-up action of the mobile device 102 using pre-trained statistical models. This approach is further described in the following sections in association with the description of FIG. 2A-2D. If the mobile device 102 has not been picked up from the supporting surface, the method returns to block 101. Alternatively, if the mobile device 102 has been picked up from the supporting surface, the method moves to block 107.
- the method may begin to perform application synchronization. In another exemplary implementation, if it is determined that the mobile device 102 has been picked up from the supporting surface, the method may turn on the display of the mobile device 102.
- the method may determine whether the mobile device 102 has been picked up from a holder (such as a pocket or a bag). If the mobile device 102 has not been picked up from the holder, the method returns to block 101. Alternatively, if the mobile device 102 has been picked up from the holder, the method moves to block 107. Similarly, if it has been determined that the mobile device 102 has been picked up from the holder, the method may begin to perform application synchronization.
- the method may determine whether a face position of the mobile device 102 has been detected.
- a face position refers to a position in which a display of the mobile device 102 is held facing the user. The user may be in an upright position, such as a sitting or standing position. If the face position of the mobile device 102 has not been detected, the method returns to block 101. Alternatively, if the face position of the mobile device 102 has been detected, the method may turn on the screen of the mobile device 102 automatically without user input.
- the mobile device 102 may be configured to display notifications, predicted applications to be used, and/or status information in response to a determination of an inference of imminent use.
- the face position detection performed in block 107 may further comprise a detection of angle stabilization and face up angle estimation.
- the detection of angle stabilization and face up angle estimation are further described in the following sections.
- the imminent use detector may generate various outputs to be used by other applications and components of the mobile device 102.
- the imminent use detector may produce an output to indicate the current position of the mobile device 102, i.e., whether it is on a supporting surface (e.g. desk), in a holder (e.g. bag or pocket), being held in the hand of a user, or the location of the mobile device 102 may be unknown.
- the imminent use detector may produce an output to indicate whether the mobile device 102 has been picked up, has not been picked up, or it has not yet been determined (unknown).
- the imminent use detector may produce an output to indicate whether the face position of the mobile device 102 has been detected, has not been detected, or it has not yet been determined (unknown).
- the pick-up detection performed in block 103 and block 105 of FIG. 1 may further comprise a detection of initial signal being triggered.
- the initial signal may be triggered when the mobile device 102 is detected to be moved from a static state position, for example being moved from a stationary position.
- a standard deviation of the accelerometer vector in a predetermined time window, for example within a range of 0.2 second, may be computed.
- the standard deviation of the accelerometer vector may be compared to a predetermined threshold. If the standard deviation of the accelerometer vector exceeds the predetermined threshold, then the initial signal may be deemed to be triggered. Alternatively, if the standard deviation of the accelerometer vector does not exceed the predetermined threshold, then the initial signal may be deemed to be not triggered.
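The trigger test above reduces to a few lines. The following is a minimal Python sketch; the threshold value is an assumption, since the disclosure does not specify one:

```python
import statistics

def initial_signal_triggered(magnitudes, threshold=0.5):
    """Compare the standard deviation of accelerometer-vector
    magnitudes over a short window (e.g. the samples from a 0.2 s
    span) against a threshold. Exceeding it suggests the device has
    been moved from a static state."""
    if len(magnitudes) < 2:
        return False  # not enough samples to estimate a deviation
    return statistics.stdev(magnitudes) > threshold
```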
- a pick-up classification based on logistic regression of measured features may be configured to identify the validity of one or more pickup motions, and may further be configured to classify such pick-up motions.
- the features may include statistics of sensor data collected by the accelerometer within a time window, for example 0.15 second around the initial signal. According to aspects of the present disclosure, other window durations, such as 0.1 second, 0.3 second, 0.5 second, etc. may be used.
- various features may be selected to be observed, including but not limited to: 1) raw accelerometer vector over the time window; 2) adjusted accelerometer vector (defined as raw accelerometer vector minus the estimated gravity vector (relative to phone coordinates)); 3) standard deviation of the raw or adjusted accelerometer vector over the time window; 4) variance of the raw or adjusted accelerometer vector in an individual axis (e.g., x, y, or z axis) over the time window, 5) sum of variances of the raw or adjusted accelerometer vector in three axes (e.g., x, y, and z axes); 6) different time window durations; 7) different time window offsets with respect to the initial signal being triggered; 8) derivative of the raw or adjusted accelerometer vector prior to computing its variance over the time window; and/or 9) derivative of the raw or adjusted accelerometer vector prior to computing its standard deviation over the time window.
- the above features may be used in combination for performing logistic regression to determine pick-up detection and pick-up classification.
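A few of the listed features can be computed as in the following Python sketch; only features 2 through 5 are shown, and the function shape and names are illustrative rather than taken from the patent:

```python
def extract_features(raw_window, gravity):
    """Compute features 2-5 of the list from a window of raw
    accelerometer samples (x, y, z) and an estimated gravity vector
    in phone coordinates."""
    # Feature 2: adjusted vector = raw minus estimated gravity.
    adjusted = [(x - gravity[0], y - gravity[1], z - gravity[2])
                for x, y, z in raw_window]

    def variance(values):
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / len(values)

    # Feature 4: per-axis variance of the adjusted vector.
    per_axis = [variance([s[i] for s in adjusted]) for i in range(3)]
    # Feature 3: standard deviation of the adjusted vector magnitude.
    mags = [(x * x + y * y + z * z) ** 0.5 for x, y, z in adjusted]
    return {
        "adjusted_std": variance(mags) ** 0.5,
        "var_x": per_axis[0],
        "var_y": per_axis[1],
        "var_z": per_axis[2],
        "var_sum": sum(per_axis),  # feature 5: sum over three axes
    }
```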
- FIG. 2A-2D further illustrate methods of detecting whether the initial signal has been triggered, whether a mobile device has been picked up, as well as determining an inference of imminent use of the mobile device.
- FIG. 2A illustrates a group of exemplary sensor observations for pick-up detection of a mobile device according to some aspects of the present disclosure.
- each unit of 100 represents 1 second in the horizontal axis of an observation window.
- a mobile device may be detected to be taken out from a jacket pocket, which may be represented by the path of block 101 and block 105 of FIG. 1.
- Window 202 may show exemplary measurements of one or more accelerometers over time; window 204 may show exemplary measurements of one or more ambient light sensors over time; and window 206 may show exemplary measurements of one or more proximity sensors over time.
- using such observations from one or more of windows 202, 204, or 206, a mobile device can be configured to perform pick-up detection, and the results from the pick-up detection can be used to determine an inference of imminent use of the mobile device.
- FIG. 2B illustrates another group of exemplary sensor observations for pick-up detection of a mobile device according to some aspects of the present disclosure. Similar to the figure shown in FIG. 2A, at approximately the 30-second point, a mobile device may be detected to be taken out from a pant pocket, which may be represented by the path of block 101 and block 105 of FIG. 1.
- Window 212 may show exemplary measurements of one or more accelerometers over time; window 214 may show exemplary measurements of one or more ambient light sensors over time; and window 216 may show exemplary measurements of one or more proximity sensors over time.
- Each of windows 212 and 214 shows different sensor data characteristics from those of windows 202 and 204.
- a mobile device can be configured to perform pick-up detection, and the results from the pick-up detection can be used to determine an inference of imminent use of the mobile device.
- FIG. 2C illustrates yet another group of exemplary sensor observations for pick-up detection of a mobile device according to some aspects of the present disclosure. Similar to the figure shown in FIG. 2A, at approximately the 30-second point, a mobile device may be detected to be taken out from a jacket pocket with the one or more ambient light sensors turned off. This action may be represented by the path of block 101 and block 105 of FIG. 1.
- Window 222 may show exemplary measurements of one or more accelerometers over time; window 224 may show no measurements by the one or more ambient light sensors; and window 226 may show exemplary measurements of one or more proximity sensors over time.
- the window 222 may show different sensor data characteristics than that of window 202. According to aspects of the present disclosure, using such observations from one or more of windows 222 and/or 226, a mobile device can be configured to perform pick-up detection, and the results from the pick-up detection can be used to determine an inference of imminent use of the mobile device.
- FIG. 2D illustrates yet another group of exemplary sensor observations for pick-up detection of a mobile device according to some aspects of the present disclosure. Similar to the figure shown in FIG. 2A, at approximately the 30-second point, a mobile device may be detected to be taken out from a backpack with the one or more ambient light sensors turned off. This action may be represented by the path of block 101 and block 105 of FIG. 1.
- Window 232 may show exemplary measurements of one or more accelerometers over time; window 234 may show no measurements by the one or more ambient light sensors; and window 236 may show exemplary measurements of one or more proximity sensors over time. Each of windows 232 and 236 may show different sensor data characteristics from those of windows 202 and 206. According to aspects of the present disclosure, using such observations from one or more of windows 232 and/or 236, a mobile device can be configured to perform pick-up detection, and the results from the pick-up detection can be used to determine an inference of imminent use of the mobile device.
- logistic regression may be used to predict the outcome of an inference of imminent use of a device based on measurements obtained by one or more sensors (also referred to as predictor variables). For example, logistic regression may be used in estimating empirical values of the parameters in a qualitative statistical model. The possible outcomes of trials may be modeled, as a function of the measurements made by the one or more sensors. In addition, logistic regression may be employed to measure the relationship between the inference of imminent use of the device and one or more independent variables, which may be obtained by the one or more sensors as well as reference motions and behaviors previously obtained. By using probability scores as the predicted values, the inference of imminent use of the device may be determined.
- the inference of imminent use when determined, may be high, corresponding to a high probability of imminent use. Alternatively, the inference of imminent use may be low, corresponding to a low probability of imminent use. As explained further below, in other implementations, the inference of imminent use can be a yes-or-no result.
- logistic regression can be binomial, where the binomial logistic regression may be configured to handle situations where two possible outcomes may be expected, such as treating the inference of imminent use of the device as the outcome of a Bernoulli trial.
- logistic regression can be multinomial, where the multinomial logistic regression may be configured to handle situations where multiple outcomes may be expected.
- Logistic regression may be used to predict the probability of a particular outcome using the values of the sensor measurements (e.g., values of the predictor variables), which in turn may be translated into a probability value for the inference of imminent use of the device. In some applications, all that may be needed is the inference of imminent use of a device that simply represents a probability of imminent use of the device.
- the inference of imminent use of the device may be a specific yes-or-no prediction regarding the imminent use of the device.
- This categorical prediction can be based on the probability of a prediction, with the predicted probability being compared to certain threshold value; and the outcome of the comparison may be translated into an inference of imminent use of the device.
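Putting the two paragraphs together, the probability-to-decision step might look like the following Python sketch; the weights, bias, and threshold are placeholders that would come from offline training, not values given in the disclosure:

```python
import math

def imminent_use_probability(features, weights, bias):
    """Logistic-regression score: sigmoid of a weighted sum of the
    predictor variables (sensor-derived feature values)."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

def imminent_use_decision(features, weights, bias, threshold=0.5):
    """Categorical yes-or-no inference: compare the predicted
    probability against a threshold."""
    return imminent_use_probability(features, weights, bias) >= threshold
```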
- successful pick-up detection of a mobile device may trigger operations of face position detection.
- one exemplary approach is to check whether the angle subtended by the gravity vector relative to an axis (for example the z-axis) of the mobile device has stabilized in a range indicative of face position.
- a sliding window may be selected, which may have a time period of 0.3 second and which may stop 5 seconds after proximity closure has been indicated by one or more proximity sensors.
- a face-up angle may be considered to be stabilized if the angle does not change substantially within the given window.
- One way to determine that the angle does not change substantially within the given window is to compute the difference between a maximum of face-up angles in the window and a minimum of face-up angles in the window. If the difference is less than a predetermined threshold, the face-up angle may be deemed to be stabilized.
- This approach may be employed in face position detection as described in association with FIG. 3, FIG. 4A-4B and FIG. 5A-5B in the following sections.
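The max-minus-min stabilization test described above reduces to a few lines. The following is a minimal Python sketch, where the 5-degree spread is an assumed threshold value:

```python
def angle_stabilized(face_up_angles, max_delta_deg=5.0):
    """True when the spread of face-up angles within a sliding window
    (e.g. the samples from a 0.3 s span) stays below a small delta,
    i.e. the angle does not change substantially."""
    if not face_up_angles:
        return False
    return max(face_up_angles) - min(face_up_angles) < max_delta_deg
```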
- FIG. 3 illustrates an exemplary method of detecting face position of a mobile device according to some aspects of the present disclosure.
- a mobile device may be detected to be taken out from a pant pocket.
- Window 302 may show exemplary measurements of one or more accelerometers over time;
- window 304 may show exemplary measurements of one or more ambient light sensors over time.
- Time segment 306 (bracketed in dotted rectangle) is enlarged on the right hand side of FIG. 3.
- the mobile device can be configured to determine a time the mobile device has been taken out as indicated by line 308.
- the mobile device may be further configured to determine a time period where a proximity open detector may be triggered, indicated by dotted timeline 310 and timeline 312.
- the mobile device may be configured to perform angle stabilization detection.
- the angle may be approximately 46.87 degrees.
- the mobile device can be configured to perform face position detection, and the results from the face position detection can be used to determine an inference of imminent use of the mobile device.
- the mobile device may be further configured to determine an inference of imminent use of the mobile device, and predict a lead time when the screen may be turned on. The predicted lead time may be indicated by the time period between timeline 312 and timeline 314. At timeline 314, the screen of the mobile device may be determined to be on.
- FIG. 4A illustrates a group of exemplary sensor observations for angle stabilization detection of a mobile device according to some aspects of the present disclosure.
- Window 402 may show exemplary measurements of one or more accelerometers over time; window 404 may show exemplary measurements of one or more ambient light sensors over time.
- a time period 406 (shaded in gray) may indicate a period where a proximity open detector may be triggered.
- the mobile device may be configured to perform angle stabilization detection with the z-axis.
- This particular embodiment may show an application scenario in which a user may be sitting and the mobile device may be in a shirt pocket. The mobile device may then be transitioned from the shirt pocket to being held at a low angle. In this example the angle may be approximately 22.29 degrees.
- FIG. 4B illustrates another group of exemplary sensor observations for angle stabilization detection of a mobile device according to some aspects of the present disclosure.
- Window 412 may show exemplary measurements of one or more accelerometers over time;
- window 414 may show exemplary measurements of one or more ambient light sensors over time.
- the mobile device may be configured to perform angle stabilization detection with 3 axes (e.g., x, y, and z axes) as well as face position detection.
- This particular embodiment may show an application scenario in which a user may be sitting and the mobile device may be in a purse. The mobile device may then be transitioned from the purse to being held face down while the user walks away. The results obtained may in turn be used to determine an inference of imminent use of the mobile device.
- FIG. 5A illustrates an exemplary embodiment of three axes angle stabilization of a mobile device according to some aspects of the present disclosure.
- window 502 may show exemplary measurements of one or more accelerometers over time;
- window 504 may show exemplary measurements of one or more ambient light sensors over time.
- the mobile device may be configured to perform angle stabilization detection with three axes (e.g., x, y, and z axes) as well as face position detection.
- This particular embodiment may show an application scenario in which a user may be sitting and the mobile device may be on a desk.
- the mobile device may be transitioned from the desk to being held tilted face-up, at a high pick-up position, for example near the ear of the user.
- the results obtained may in turn be used to determine an inference of imminent use of the mobile device.
- FIG. 5B illustrates another exemplary embodiment of three axes angle stabilization of a mobile device according to some aspects of the present disclosure.
- window 512 may show exemplary measurements of one or more accelerometers over time;
- window 514 may show exemplary measurements of one or more ambient light sensors over time.
- the mobile device may be configured to perform angle stabilization detection with three axes (e.g., x, y, and z axes) as well as face position detection.
- This particular embodiment may show an application scenario in which a user may be sitting and the mobile device may be in a backpack.
- the mobile device may be transitioned from the backpack to being held not in a face-up position where the user is viewing the display, but at a high pick-up position, for example, near the user's ear while the user speaks during a telephone call.
- the results obtained may in turn be used to determine an inference of imminent use of the mobile device.
- FIG. 6 illustrates an exemplary block diagram of a mobile device according to some aspects of the present disclosure.
- mobile device 600 includes a transceiver 106 configured to communicate with other computing devices, including but not limited to servers and other mobile devices, and a camera 108 configured to function as an image sensor to generate images, which may be either individual photos or frames of video.
- the mobile device 600 may also include sensors 116, which may be used to provide sensor data with which the mobile device 600 can determine inferences of imminent use.
- Examples of sensors include, but are not limited to, accelerometers, ambient light sensors, proximity sensors, quartz sensors, gyroscopes, micro-electromechanical system (MEMS) sensors used as linear accelerometers, and magnetometers.
- the mobile device 600 may also include a user interface 110 that includes display 112 for displaying images.
- the user interface 110 may also include a keypad 114 or other input device through which the user can input information into the mobile device 600. If desired, the keypad 114 may be obviated by integrating a virtual keypad into the display 112 with a touch sensor.
- the user interface 110 may also include a microphone 117 and one or more speakers 118, for example, if the mobile platform is a cellular telephone.
- mobile device 600 may include other components unrelated to the present disclosure.
- the mobile device 600 further includes a control unit 120 that is connected to and communicates with transceiver 106, camera 108 and sensors 116, as well as user interface 110, along with any other desired features.
- the control unit 120 (also referred to as controller) may be provided by one or more processors 122 and associated memory/storage 124.
- the control unit 120 may also include software 126, as well as hardware 128, and firmware 130.
- the control unit 120 may include imminent use detector module 132 configured to detect inferences of imminent use of the mobile device 600.
- the imminent use detector module 132 may further include pick up detection module 134 configured to determine whether mobile device 600 has been picked up, and face position detection module 136 configured to determine face position of mobile device 600 after it has been picked up.
- the imminent use detector module 132 is illustrated separately from processor 122 and/or hardware 128 for clarity, but may be combined and/or implemented in the processor 122 and/or hardware 128.
- control unit 120 can be configured to implement methods of imminent use detection.
- the control unit 120 can be configured to implement functions of the mobile device 600 described in FIGs. 1-5 and FIGs. 7-8.
- the disclosed methods and apparatuses can be applied to enable power savings in mobile devices, and simultaneously deliver better user experience with an "always-on", low-power inference engine on the mobile device that can accurately predict its imminent use in the next few seconds, for example between 1 and 60 seconds.
- the imminent use detector may be configured to be "always-on" to receive sensor data, for example from an accelerometer, and to perform the functions described herein continuously.
- an imminent use detector can be configured to consume accelerometer data, along with other pieces of information made available from low power sources on the mobile device (e.g., grip sensors, time of the day, day of the week, ambient light sensor, etc.) to produce the desired inference of imminent use.
- information relating to incoming and outgoing phone calls and text messages, various notification methods (e.g., ringer, flashing LED, etc.), charging status, and information from Bluetooth scans may also be used to produce the desired inference of imminent use.
- the imminent use detector can be configured to reside as part of a low-power sensors subsystem.
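The combination of accelerometer evidence with other low-power cues described above can be sketched as a simple weighted sum. The feature names and weights below are illustrative assumptions; an actual detector would learn its combination from labeled device-usage traces.

```python
def imminent_use_score(pickup_detected, ringing=False,
                       grip_detected=False, typical_use_hour=False):
    """Combine low-power cues into an inference score in [0, 1].

    Weights are illustrative only: a pick-up gesture dominates, while
    a ringing notification, a grip-sensor reading, and a habitual
    time-of-day each contribute smaller amounts of evidence.
    """
    weights = {"pickup": 0.6, "ringing": 0.2, "grip": 0.1, "hour": 0.1}
    score = 0.0
    if pickup_detected:
        score += weights["pickup"]
    if ringing:
        score += weights["ringing"]
    if grip_detected:
        score += weights["grip"]
    if typical_use_hour:
        score += weights["hour"]
    return score
```

A downstream consumer could compare this score against a threshold to decide whether imminent use should be inferred.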
- FIG. 7A illustrates an exemplary application environment of an imminent use detector according to some aspects of the present disclosure.
- an imminent use detector 702 may be configured to send control and/or configuration information to an Any-motion detector (AMD) 704.
- the any-motion detector 704 may be configured to set the configurations of inertial sensor(s) 706.
- the inertial sensor(s) 706 may then collect and send sensor data to the Any-motion detector 704, which in turn generates an AMD motion indicator to the imminent use detector 702.
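The role of the any-motion detector (AMD) in this data flow can be sketched as follows. The gravity constant, deviation threshold, and function name are assumptions for illustration, not the actual AMD logic: the idea is simply to flag motion when accelerometer magnitudes depart from rest.

```python
def any_motion(magnitudes, g=9.81, threshold=0.5, min_count=3):
    """Hypothetical any-motion detector (AMD) sketch.

    Flags motion when at least `min_count` accelerometer magnitude
    readings (m/s^2) deviate from gravity by more than `threshold`.
    A stationary device reads close to g on the combined axes.
    """
    deviations = sum(1 for m in magnitudes if abs(m - g) > threshold)
    return deviations >= min_count
```

In the figure's terms, a True result would correspond to the AMD motion indicator sent up to the imminent use detector 702.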
- situation aware applications 710 may be configured to send register/deregister and data synchronization events to battery services (application/module) 712.
- the battery services 712 may be configured to send control and/or configuration information to the imminent use detector 702, which may be used to configure the imminent use detector 702, the any motion detector 704 and the inertial sensor(s) 706.
- the imminent use detector 702 may be configured to receive information, such as events, status updates, and other relevant information.
- the imminent use detector 702 may then use the received information, including sensor information from the AMD 704, control and/or configuration information from battery services 712, events and status updates to predict an inference of imminent use, which may also be referred to as an imminent use prediction.
- the imminent use detector 702 may send this information to configure the battery services 712 for controlling the power to be consumed by the mobile device.
- the battery services 712 may then use the imminent use prediction to inform the situation aware applications 710 to start/stop data synchronization, in some exemplary applications.
- the functions of the imminent use detector 702, AMD 704, inertial sensor(s) 706, situation aware applications 710, and battery services 712 may be performed by various blocks of the mobile device 600 as described in association with FIG. 6.
- FIG. 7B illustrates another exemplary application environment of an imminent use detector according to some aspects of the present disclosure.
- imminent use detector 702 may be configured to communicate with one or more sensor(s) 720 and one or more application(s) 732.
- the imminent use detector 702 may include logic configured to perform common imminent use scenarios detection 716 as well as logic configured to perform user-specific imminent use scenarios detection 718.
- an event that may influence the prediction of an inference of imminent use of a device may comprise two components.
- the first component can be a supervised component that may be trained to recognize some universal gestures/scenarios associated with an act of actively using the device (e.g. a phone), such as pulling the device out of a pocket, picking up the device off a table/desk, or the ringing/vibrating that typically results in the user picking up the device.
- the second component may be a user-specific component, wherein imminent phone usage traits specific to the device's owner (or, the most frequent user) can be used to fine-tune the above supervised component. For instance, if the user is left-handed, such details may be collected from the user in a one-time fashion during registration, or be detected on-the-fly. In other situations, for example, a user may almost always ignore calls from certain phone numbers, in which case, it may be likely that there would not be an imminent usage of the device even though the device may be ringing.
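The user-specific fine-tuning described above, such as suppressing the inference for habitually ignored callers, can be sketched as a post-processing step over the supervised component's score. The function and parameter names are illustrative assumptions:

```python
def adjusted_inference(base_score, caller_id=None,
                       ignored_callers=frozenset()):
    """User-specific fine-tuning sketch (illustrative only).

    `base_score` comes from the supervised gesture/scenario component;
    a call from a number the user habitually ignores suppresses the
    inference of imminent use even while the device is ringing.
    """
    if caller_id is not None and caller_id in ignored_callers:
        return 0.0
    return base_score
```

Other user-specific traits (handedness, typical carry position) could similarly re-weight the base score rather than zeroing it.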
- the one or more sensor(s) 720 may include, but are not limited to, one or more accelerometer(s) 722, one or more ambient light sensor(s) 724, one or more proximity sensor(s) 726, one or more touch sensor(s) 728, one or more gyroscope(s) 730, etc.
- the one or more application(s) 732 may include, but are not limited to, one or more situation aware application(s) 734, one or more power management application(s) 736, etc.
- the functions of the imminent use detector 702, one or more sensor(s) 720, and one or more application(s) 732 may be performed by various blocks of the mobile device 600 as described in association with FIG. 6.
- the one or more application(s) 732 may be configured to send control and/or configuration information to the imminent use detector 702.
- the imminent use detector 702 may be configured to generate and send sensor configuration information to the one or more sensor(s) 720.
- the imminent use detector 702 may be configured to determine inferences of imminent use, also referred to as imminent use prediction(s), from sensor data received from the one or more sensor(s) 720.
- the imminent use prediction(s) may be used to assist the situation aware applications(s) 734 as well as the power management application(s) 736, in some exemplary implementations.
- the inference of imminent use may be applied to assist intelligent data synchronization.
- Applications (e.g., email, Facebook, Twitter, Photos) typically send periodic data synchronization requests in the background regardless of whether the user is going to check for this new data in the near future.
- applications may subscribe to the determination of inference of imminent use from a low-power engine, and send data synchronization requests only when this low-power engine signals imminent device use.
- the imminent use detector trigger can be used in place of a screen-on trigger.
- some applications may turn off Wi-Fi in power-crunched scenarios and may attempt connecting to an available access point upon observing a "screen on" event. This may be less desirable as it can increase the latency associated with data delivery to the user, thereby degrading user experience. For example, it may not be desirable for a user to wait for a "spinning wheel", or wait for the data loading icon for a number of seconds. Using the imminent use detector trigger, such waiting time may be reduced.
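The subscribe-and-trigger pattern described above can be sketched as a minimal publish/subscribe interface. The class and method names are assumptions for illustration, not an actual platform API: applications register a callback and begin data synchronization only when the imminent-use signal fires.

```python
class ImminentUseSignal:
    """Hypothetical publish/subscribe interface for the imminent-use
    inference; subscribed apps sync data only when the signal fires."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        """Register a callback invoked with the predicted lead time."""
        self._subscribers.append(callback)

    def fire(self, lead_time_s):
        """Notify all subscribers of predicted use in `lead_time_s` seconds."""
        for cb in self._subscribers:
            cb(lead_time_s)

# An app starts its sync when signaled, instead of polling periodically:
synced = []
signal = ImminentUseSignal()
signal.subscribe(lambda lead: synced.append(f"sync started, ~{lead}s before use"))
signal.fire(3)
```

Because the sync begins a few seconds before the screen turns on, fresh data can already be present when the user looks at the device.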
- FIG. 7C illustrates an exemplary application upon a determination of an inference of imminent use of a device according to some aspects of the present disclosure.
- the mobile screen may be turned on automatically without waiting for the user to press the on/off button, and the mobile screen may be configured to present relevant information to the user.
- This can be a desirable user interface feature implemented in a lock screen widget. Using this user interface feature, the user may be able to glance at the mobile screen and be informed of status information, notifications of communication activities, and predicted applications to be used, which the user has previously programmed the lock screen widget to display.
- the mobile screen 732 may be configured to display information, including but not limited to: 1) percentage of battery life and a predicted battery life of the mobile device in terms of hours and minutes 734; 2) weather conditions at current location 736; 3) current time 738; 4) next alarm time 740.
- the mobile screen 732 may be configured to display notifications in chronological order with the number of minutes since arrival.
- the notifications may include, but are not limited to: 1) the next calendar appointment within the next two hours 742; 2) one or more missed calls 744; and 3) one or more email messages 746.
- the user may use the down arrow 748 to access additional notifications; may tap on a notification to open the notification in the corresponding application (e.g. calendar, phone, or email); may dismiss an individual notification by swiping the corresponding lozenge; and may dismiss all notifications using the delete symbol (shown as "X") 750.
- the notifications may be shown in semi-transparent lozenges 752. In the event when there are no notifications, the screen area covered by the semi-transparent lozenges 752 may be blank, and a message such as "you have no new notifications" may be displayed.
- the mobile screen 732 may be configured to display a number of predicted applications 754 the user may use as well as a person the user may contact via phone, text messages (SMS), email, etc.
- the user may tap on an application or a contact to open the application (e.g. Facebook, Skype, email, etc.) to initiate the communication.
- the applications may be overlaid with the number of messages pending. For example, there may be twelve Facebook messages and five Email messages pending in the example shown in FIG. 7C.
- the mobile screen 732 may display a link to settings 756 to enable the user to change the settings based on the information received through a glance of the mobile screen 732. Tapping the home button 758 or back button 760 may dismiss the glance feature and transition the mobile screen 732 to display other predetermined user interface settings.
- the mobile screen may be turned off based on the inference of imminent use, for example when the inference of imminent use is low, without the user pressing the on/off button. This can be a beneficial power saving feature. For example, a user may sometimes leave the device on the desk with screen on.
- the imminent use detector may be configured to determine that there may be no imminent usage of the device, and turn off the display, which may be a heavy battery draining component.
- the inference of imminent use may also be used to trigger other higher power always-on context use cases, such as voice-based device wake-up (e.g., user may say "Hey Snapdragon" to start interacting with mobile device) and camera-based mobile user authentication, such as face recognition algorithm to authenticate user of mobile device.
- FIG. 8A illustrates an exemplary flow chart of detecting imminent use of a device according to some aspects of the present disclosure.
- the method receives sensor data by one or more sensors of the device.
- the method determines an inference of imminent use of the device based at least in part on the sensor data.
- the inference of imminent use may indicate the device may be used within a period of 1 to 60 seconds.
- the method may further perform data synchronization in accordance with the inference of imminent use, provide an application interface for one or more applications to use the inference of imminent use, or provide one or more commands to control an operation of the device based at least in part on the inference of imminent use.
- the method may further generate one or more commands to control an application in accordance with the inference of imminent use, turn on a screen in response to the inference of imminent use being above a first predetermined threshold value prior to receiving a user's command to use the device, or turn off the screen in response to the inference of imminent use being below a second predetermined threshold value prior to receiving the user's command to stop using the device.
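The two-threshold screen control described above can be sketched as a hysteresis rule. The threshold values below are illustrative assumptions: keeping the first threshold above the second prevents the screen from flickering when the inference hovers near a single cut-off.

```python
def screen_state(current_on, inference, on_threshold=0.7, off_threshold=0.2):
    """Two-threshold (hysteresis) screen control sketch.

    Turns the screen on when the inference of imminent use exceeds
    `on_threshold`, off when it falls below `off_threshold`, and
    otherwise keeps the current state.
    """
    if inference > on_threshold:
        return True
    if inference < off_threshold:
        return False
    return current_on
```

In the patent's terms, the first predetermined threshold value maps to `on_threshold` and the second to `off_threshold`.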
- FIG. 8B illustrates an exemplary implementation of receiving sensor data of FIG. 8A according to some aspects of the present disclosure.
- a method may receive measurements of acceleration of the device over a period of time in one or more axes collected by one or more accelerometers, receive measurements of ambient light detected by the device over the period of time collected by one or more ambient light sensors, receive measurements of proximity of the device to other objects over the period of time collected by one or more proximity sensors, or receive measurements of the device being touched over the period of time collected by one or more touch sensors.
- FIG. 8C illustrates exemplary implementations of determining an inference of imminent use of the device of FIG. 8A according to some aspects of the present disclosure.
- a method of determining an inference of imminent use may detect one or more reference motions associated with the inference of imminent use, detect one or more user-specific actions associated with the inference of imminent use, detect one or more contextual triggers associated with the inference of imminent use, or detect one or more situations associated with the inference of imminent use based at least in part on a history of use of the device, as shown in block 810.
- the method of determining an inference of imminent use may detect one or more reference motions associated with the inference of imminent use, where the one or more reference motions associated with the inference of imminent use comprise at least one of a first motion that indicates the device being picked up from a supporting surface, a second motion that indicates the device being pulled out of a holder, or a third motion that indicates the device being picked up from an idle state, as shown in block 812.
- the method of determining an inference of imminent use may detect one or more user-specific actions associated with the inference of imminent use, where the one or more user-specific actions associated with the inference of imminent use comprise at least one of a first action that indicates a user is left-handed, or a second action that indicates the user is right-handed, as shown in block 814.
- the method of determining an inference of imminent use may detect one or more contextual triggers associated with the inference of imminent use, where the one or more contextual triggers associated with the inference of imminent use comprise at least one of a first trigger that causes the device to vibrate, a second trigger that causes the device to ring, a third trigger that causes the device to flash a light emitting diode, or a fourth trigger that causes the device to generate an alert message, as shown in block 816.
- the method of determining an inference of imminent use may collect contextual data related to a history of use of the device, and determine the inference of imminent use based at least in part on the contextual data, as shown in block 818.
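The history-of-use contextual inference in block 818 can be sketched as a simple hour-of-day prior. The representation below (a list of past screen-on hours) is an assumption for illustration; a real detector would use richer contextual data.

```python
from collections import Counter

def usage_prior(usage_hours, hour):
    """Contextual prior from a history of device use.

    `usage_hours` is a list of hours (0-23) at which the device was
    previously used; the prior is the fraction of past use events
    that fell in the queried `hour`.
    """
    if not usage_hours:
        return 0.0
    counts = Counter(usage_hours)
    return counts[hour] / len(usage_hours)
```

Such a prior could be multiplied into, or added alongside, the motion-based inference so that an ambiguous gesture at a habitual usage hour is more readily treated as imminent use.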
- Note that various paragraphs herein, FIG. 1, FIG. 6, FIG. 7A - FIG. 7B,
- FIG. 8A - FIG. 8C and their corresponding descriptions provide means for receiving sensor data of the device; means for determining an inference of imminent use of the device based at least in part on the sensor data; means for receiving measurements collected by one or more accelerometers over a period of time in one or more axes; means for receiving measurements collected by one or more ambient light sensors over the period of time; means for receiving measurements collected by one or more proximity sensors over the period of time; means for receiving measurements collected by one or more touch sensors over the period of time; means for collecting contextual data related to a history of use of the device; and means for determining the inference of imminent use based at least in part on the contextual data.
- a processing unit may be implemented within one or more application specific integrated circuits ("ASICs"), digital signal processors ("DSPs"), digital signal processing devices ("DSPDs"), programmable logic devices ("PLDs"), field programmable gate arrays ("FPGAs"), processors, controllers, micro-controllers, microprocessors, electronic devices, other device units designed to perform the functions described herein, or combinations thereof.
- a special purpose computer or a similar special purpose electronic computing device is capable of manipulating or transforming signals, typically represented as physical electronic or magnetic quantities within memories, registers, or other information storage devices, transmission devices, or display devices of the special purpose computer or similar special purpose electronic computing device.
- Wireless communication techniques described herein may be in connection with various wireless communications networks such as a wireless wide area network ("WWAN"), a wireless local area network ("WLAN"), a wireless personal area network ("WPAN"), and so on.
- the terms "network" and "system" may be used interchangeably herein.
- a WWAN may be a Code Division Multiple Access ("CDMA") network, a Time Division Multiple Access ("TDMA") network, a Frequency Division Multiple Access ("FDMA") network, an Orthogonal Frequency Division Multiple Access ("OFDMA") network, a Single-Carrier Frequency Division Multiple Access ("SC-FDMA") network, and so on.
- a CDMA network may implement one or more radio access technologies (“RATs”) such as cdma2000, Wideband-CDMA (“W-CDMA”), to name just a few radio technologies.
- cdma2000 may include technologies implemented according to IS-95, IS-2000, and IS-856 standards.
- a TDMA network may implement Global System for Mobile Communications ("GSM"), Digital Advanced Mobile Phone System ("D-AMPS"), or some other RAT.
- GSM and W-CDMA are described in documents from a consortium named “3rd Generation Partnership Project” ("3GPP”).
- Cdma2000 is described in documents from a consortium named “3rd Generation Partnership Project 2" (“3GPP2”).
- 3GPP and 3GPP2 documents are publicly available.
- 4G Long Term Evolution (“LTE") communications networks may also be implemented in accordance with claimed subject matter, in an aspect.
- a WLAN may comprise an IEEE 802.11x network, for example.
- a WPAN may comprise a Bluetooth network or an IEEE 802.15x network, for example.
- Wireless communication implementations described herein may also be used in connection with any combination of WWAN, WLAN or WPAN.
- a wireless transmitter or access point may comprise a femtocell, utilized to extend cellular telephone service into a business or home.
- one or more mobile devices may communicate with a femtocell via a code division multiple access ("CDMA") cellular communication protocol, for example, and the femtocell may provide the mobile device access to a larger cellular telecommunication network by way of another broadband network such as the Internet.
- Terrestrial transmitters may, for example, include ground-based transmitters that broadcast a PN code or other ranging code (e.g., similar to a GPS or CDMA cellular signal). Such a transmitter may be assigned a unique PN code so as to permit identification by a remote receiver. Terrestrial transmitters may be useful, for example, to augment an SPS in situations where SPS signals from an orbiting SV might be unavailable, such as in tunnels, mines, buildings, urban canyons or other enclosed areas. Another implementation of pseudolites is known as radio-beacons.
- the term "SV", as used herein, is intended to include terrestrial transmitters acting as pseudolites, equivalents of pseudolites, and possibly others.
- the terms "SPS signals" and/or "SV signals", as used herein, are intended to include SPS-like signals from terrestrial transmitters, including terrestrial transmitters acting as pseudolites or equivalents of pseudolites.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- Environmental & Geological Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
- Power Sources (AREA)
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP15704158.3A EP3114543A1 (en) | 2014-03-07 | 2015-01-27 | Detecting imminent use of a device |
| CN201580009426.6A CN106062670A (zh) | 2014-03-07 | 2015-01-27 | 检测装置的即将使用 |
| KR1020167027425A KR20160130271A (ko) | 2014-03-07 | 2015-01-27 | 디바이스의 임박한 사용의 검출 |
| JP2016555688A JP6427594B2 (ja) | 2014-03-07 | 2015-01-27 | デバイスの差し迫った使用の検出 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/201,576 US20150253351A1 (en) | 2014-03-07 | 2014-03-07 | Detecting Imminent Use of a Device |
| US14/201,576 | 2014-03-07 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015134128A1 true WO2015134128A1 (en) | 2015-09-11 |
Family
ID=52469333
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2015/013116 Ceased WO2015134128A1 (en) | 2014-03-07 | 2015-01-27 | Detecting imminent use of a device |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US20150253351A1 (en) |
| EP (1) | EP3114543A1 (en) |
| JP (1) | JP6427594B2 (en) |
| KR (1) | KR20160130271A (en) |
| CN (1) | CN106062670A (en) |
| WO (1) | WO2015134128A1 (en) |
Families Citing this family (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8661403B2 (en) | 2011-06-30 | 2014-02-25 | Truecar, Inc. | System, method and computer program product for predicting item preference using revenue-weighted collaborative filter |
| US20160364783A1 (en) * | 2014-06-13 | 2016-12-15 | Truecar, Inc. | Systems and methods for vehicle purchase recommendations |
| US9788277B2 (en) * | 2015-01-15 | 2017-10-10 | Mediatek Inc. | Power saving mechanism for in-pocket detection |
| WO2017163637A1 (ja) * | 2016-03-25 | 2017-09-28 | シャープ株式会社 | 情報処理装置、電子機器、情報処理装置の制御方法および制御プログラム |
| GB2550854B (en) | 2016-05-25 | 2019-06-26 | Ge Aviat Systems Ltd | Aircraft time synchronization system |
| KR101969395B1 (ko) * | 2017-04-26 | 2019-07-23 | 아주대학교산학협력단 | 어플리케이션의 전력 과소비 원인 분석 장치 및 방법 |
| KR20190017280A (ko) * | 2017-08-10 | 2019-02-20 | 엘지전자 주식회사 | 이동 단말기 및 이동 단말기의 제어 방법 |
| WO2019056160A1 (en) * | 2017-09-19 | 2019-03-28 | Qualcomm Incorporated | ARTIFICIAL INTELLIGENCE AGENT FOR INTELLIGENT TELEPHONE DISPLAY AND BRIGHTNESS CONTROL |
| KR102348693B1 (ko) * | 2017-10-24 | 2022-01-10 | 삼성전자주식회사 | 어플리케이션 프로그램을 제어하는 전자 장치 및 그 제어 방법 |
| CN112351482B (zh) * | 2020-10-29 | 2024-06-04 | 深圳Tcl新技术有限公司 | 自动控制终端休眠的方法及装置、计算机可读存储介质 |
| CN116088665A (zh) * | 2022-12-05 | 2023-05-09 | 展讯通信(天津)有限公司 | 休眠方法、芯片、电子设备及存储介质 |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP1662358A1 (en) * | 2004-11-24 | 2006-05-31 | Research In Motion Limited | System and Method for Selectively Activating a Communication Device |
| US20070075965A1 (en) * | 2005-09-30 | 2007-04-05 | Brian Huppi | Automated response to and sensing of user activity in portable devices |
| US20120280917A1 (en) * | 2011-05-03 | 2012-11-08 | Toksvig Michael John Mckenzie | Adjusting Mobile Device State Based on User Intentions and/or Identity |
Family Cites Families (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| NZ332525A (en) * | 1996-05-22 | 2000-01-28 | Geovector Corp | Method and apparatus for controlling electrical devices in response to sensed conditions |
| JP2002198894A (ja) * | 2000-12-27 | 2002-07-12 | Toshiba Corp | 無線通信端末 |
| JP2004046386A (ja) * | 2002-07-10 | 2004-02-12 | Hitachi Ltd | 携帯型情報端末および携帯型情報端末の省電力方法、記録媒体 |
| US7827000B2 (en) * | 2006-03-03 | 2010-11-02 | Garmin Switzerland Gmbh | Method and apparatus for estimating a motion parameter |
| JP2009296171A (ja) * | 2008-06-03 | 2009-12-17 | Panasonic Corp | 携帯通信端末 |
| US20100245289A1 (en) * | 2009-03-31 | 2010-09-30 | Miroslav Svajda | Apparatus and method for optical proximity sensing and touch input control |
| KR101531561B1 (ko) * | 2009-05-04 | 2015-06-25 | 삼성전자주식회사 | 휴대용 단말기에서 사용자 자세에 따른 자동 호 수발신을 위한 장치 및 방법 |
| JP5606205B2 (ja) * | 2010-07-28 | 2014-10-15 | 京セラ株式会社 | 携帯端末装置 |
| EP2659319A4 (en) * | 2010-11-19 | 2017-07-26 | Google, Inc. | Flexible functionality partitioning within intelligent-thermostat-controlled hvac systems |
| US8644884B2 (en) * | 2011-08-04 | 2014-02-04 | Qualcomm Incorporated | Sensor-based user interface control |
| US9075451B2 (en) * | 2012-02-24 | 2015-07-07 | Blackberry Limited | Handheld device with notification message viewing |
- 2014
- 2014-03-07 US US14/201,576 patent/US20150253351A1/en not_active Abandoned
- 2015
- 2015-01-27 JP JP2016555688A patent/JP6427594B2/ja not_active Expired - Fee Related
- 2015-01-27 KR KR1020167027425A patent/KR20160130271A/ko not_active Withdrawn
- 2015-01-27 WO PCT/US2015/013116 patent/WO2015134128A1/en not_active Ceased
- 2015-01-27 EP EP15704158.3A patent/EP3114543A1/en not_active Withdrawn
- 2015-01-27 CN CN201580009426.6A patent/CN106062670A/zh active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| JP6427594B2 (ja) | 2018-11-21 |
| US20150253351A1 (en) | 2015-09-10 |
| JP2017520813A (ja) | 2017-07-27 |
| CN106062670A (zh) | 2016-10-26 |
| EP3114543A1 (en) | 2017-01-11 |
| KR20160130271A (ko) | 2016-11-10 |
Similar Documents
| Publication | Title |
|---|---|
| US20150253351A1 (en) | Detecting Imminent Use of a Device |
| US10728442B2 (en) | Initializing camera subsystem for face detection based on sensor inputs |
| US10620685B2 (en) | Adjusting mobile device state based on user intentions and/or identity |
| CN106020782B (zh) | Fingerprint unlock control method and mobile terminal |
| EP3200483B1 (en) | Method and device for acquiring location information |
| CN104657057A (zh) | Terminal wake-up method and apparatus |
| US20140189538A1 (en) | Recommendations for Applications Based on Device Context |
| CN108376086A (zh) | Display control method and apparatus, terminal, and computer-readable storage medium |
| CN108391002A (zh) | Display control method and apparatus, terminal, and computer-readable storage medium |
| EP3160112B1 (en) | Reminding method and device |
| CN106231137A (zh) | Battery-level-based processing method and apparatus |
| CN106231072B (zh) | Message reminder control method, apparatus, and terminal device |
| CN113875161A (zh) | Wearable device location system architecture |
| EP2996316B1 (en) | Methods and systems for communication management between an electronic device and a wearable electronic device |
| CN106548606A (zh) | Alarm method, device, and system |
| CN110602324A (zh) | Reminder method and apparatus for a wearable device |
| US20170280384A1 (en) | Apparatuses and Methods for Controlling Always-On Displays for Mobile Devices |
| WO2018188180A1 (zh) | Picture sharing method and electronic device |
| CN116567139A (zh) | Terminal fall alert method, apparatus, device, and storage medium |
| CN107707453B (zh) | Reminder method and apparatus |
| CN108494960A (zh) | Electronic device and control method thereof |
| CN110290271B (zh) | Reminder method, apparatus, and medium |
| CN112654966A (zh) | Application control method and apparatus, computer storage medium, and electronic device |
| CN106569931A (zh) | Method and apparatus for estimated startup-duration reminder |
| CN111240455A (zh) | Terminal power-saving method, terminal power-saving apparatus, and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15704158; Country of ref document: EP; Kind code of ref document: A1 |
| | DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | |
| | REEP | Request for entry into the european phase | Ref document number: 2015704158; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2015704158; Country of ref document: EP |
| | ENP | Entry into the national phase | Ref document number: 2016555688; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 20167027425; Country of ref document: KR; Kind code of ref document: A |