WO2006000639A1 - Controlling an electronic device - Google Patents

Controlling an electronic device

Info

Publication number
WO2006000639A1
Authority
WO
WIPO (PCT)
Prior art keywords
motion
motion pattern
pattern
control
identified
Prior art date
Application number
PCT/FI2005/050226
Other languages
French (fr)
Inventor
Sami Ronkainen
Juha P. Matero
Original Assignee
Nokia Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Corporation filed Critical Nokia Corporation
Priority to KR1020067026992A priority Critical patent/KR20070032709A/en
Priority to US11/597,883 priority patent/US20070225935A1/en
Priority to EP05756279A priority patent/EP1782165A4/en
Priority to JP2007517321A priority patent/JP2008503816A/en
Publication of WO2006000639A1 publication Critical patent/WO2006000639A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/316 User authentication by observing the pattern of computer usage, e.g. typical user behaviour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002 Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005 Input arrangements through a video camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 Digitisers structurally integrated in a display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/04 Arrangements for program control, e.g. control units using record carriers containing only program instructions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163 Indexing scheme relating to constructional details of the computer
    • G06F2200/1636 Sensing arrangement for detection of a tap gesture on the housing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Definitions

  • the invention relates to identifying movement in a mobile environment and particularly to utilizing the identified movement in controlling a device.
  • control methods include those wherein the control is based on voice and gestures, for example.
  • the display can be implemented in such a manner that, irrespective of changes in the orientation of the device, the text of the display can always be read vertically. It is also known to zoom the display by turning the device.
  • the object of the invention is thus to provide an improved method and an apparatus for implementing the method in a manner that better takes into consideration the operating situation and/or environment of the device. Accordingly, the object of the invention is a method of controlling an electronic device, comprising identifying a motion pattern in the motion of the device and eliminating the effect of the identified motion pattern from a control motion used for controlling the device.
  • the invention also relates to a software product comprising a software routine for receiving measurement information descriptive of a motion of the device, a software routine for identifying a motion pattern in the measurement information, and a software routine for eliminating the effect of the identified motion pattern from a control motion used for controlling the device and included in the measurement information.
  • the invention also relates to an electronic device comprising means for identifying a motion pattern in a motion of the device, and means for eliminating the effect of the identified motion pattern from a control motion used for controlling the device.
  • the invention is based on aiming at identifying, in an electronic device, whether the device is susceptible to an identifiable motion pattern.
  • An identifiable motion pattern may be directed to an electronic device for instance when the device is subjected to mechanical vibration.
  • mechanical vibration refers to a recurring motion directed to the device when the device is in a train or a car, for example.
  • an identifiable motion pattern may also refer to a motion pattern corresponding to the walk of a person carrying the device, for example.
  • the motion pattern is identified and its effect is eliminated from the device control motion.
  • the control motion is a gesture, such as a turn or a swing of the device, for example.
  • the control motion may also be a tap on the device, for example.
  • the device according to the invention may be e.g. a mobile telephone, a portable computer or another corresponding device enabling motion identification.
  • An advantage of the method and device of the invention is that the control motions intended to control the device can be identified considerably better and with fewer erroneous identifications once an identified disturbance is eliminated from the control motions.
  • FIG. 1 shows an embodiment of the method of the invention
  • Figure 2 illustrates the identification of a motion pattern according to an embodiment
  • Figure 3 illustrates the identification of a motion pattern according to an embodiment
  • Figure 4 illustrates a measurement signal filtered from a known motion pattern
  • Figure 5 shows an electronic device according to an embodiment as a block diagram.
  • a given reference motion pattern is stored in an electronic device.
  • the reference motion pattern can be stored in the device for instance at the factory in connection with the manufacture of the device.
  • the stored reference motion patterns may describe the operating environment of the device, for instance that the device is in a train or carried by a person riding a bicycle.
  • the patterns stored in the device as a factory setting may be based for instance on a large number of operating situation examples, from which an average motion pattern is generated.
  • the device may comprise several alternative patterns for a given operating environment, such as a train.
  • the user may teach the device the desired patterns.
  • the user may teach the device a reference motion pattern corresponding to his walk by depressing a given key at the start and the end of the teaching.
  • the device stores the data between the keystrokes and analyses it by searching the data for acceleration signal values recurring in a certain manner, for example.
  • conditions may be set in the device as to when motion measurement is activated. As regards the condition to be checked, two different operating situations can be distinguished, device-originating and user-originating operating situations.
  • the device-originating operating situation according to step 104 refers to an operating situation wherein the device is aware of the event before the user is.
  • a device-terminating call is an example of a device-originating event.
  • the mobile telephone is aware of the incoming call based on the signalling preceding the call, and is thus able to detect the start of the device-originating event on the basis of the start of said signalling.
  • Other examples of device-originating events that can be brought forward include for instance a short message arriving at the mobile telephone or a timer triggering off, e.g. an alarm clock or a calendar alarm in an electronic device.
  • a user-originating operating situation refers to an event originating from the user.
  • the device may deduce the start of the use of the device on the basis of a given initial impulse, for example.
  • an initial impulse refers to a function by means of which the device is able to conclude the start of the use.
  • as an example of an initial impulse, opening of the keypad lock may be mentioned.
  • Figure 1 shows an embodiment of a device-originating event, but it can also be applied to a user-originating event with the exception of steps 104 and 110.
  • the start of a device-originating or user-originating situation initiates motion measurement in the device in accordance with step 106.
  • although conditions may be set on motion measurement, continuous motion status measurement in the device is also feasible.
  • the device may operate such that the device continuously aims at identifying gestures by comparing a measured motion with the threshold values of one or more gestures.
  • the device may also record its motion in a memory for a given time, such as for the duration of 10 seconds, for example. If uncertainty exists at a given point in time whether the user performed a gesture, the recorded data may be reverted to and attempts may be made to identify the motion pattern in the data. This may improve the gesture identification performed at said point in time, once the identified motion pattern can be filtered off.
  • Motion status measurement can also be performed periodically in the device.
  • Step 106 describes motion measurement in an electronic device.
  • Motion may be measured by means of one or more motion parameters, such as an acceleration parameter, for example.
  • Acceleration measurement may be performed for instance in three mutually perpendicular linear directions: directions x, y and z.
  • angular acceleration may also be measured in the device by means of a magnetometer or a gyroscope, for example.
  • in step 108, an attempt is made to identify a motion pattern possibly detectable in the motion of the device.
  • the motion pattern may be identified in two different manners, either by comparing the motion with a previously stored/taught reference motion pattern or by aiming at identifying some new motion pattern in the data measured.
  • Attempts may be made to identify a motion pattern motion parameter-specifically, for instance by studying the x-oriented linear component and the y-oriented linear component separately. In identifying a motion pattern, several motion parameters may also be studied together as a whole. In this case, the sum vector composed of the acceleration components can be compared with a predetermined threshold value.
  • the orientation of the device may be checked from time to time and, if necessary, taken into consideration when adjusting the direction of the sum vector.
  • the comparison can be carried out for a given predetermined period of time. If the correlation between the motion parameter and the reference pattern is sufficiently high during the period of time measured, it may be stated that the reference motion pattern was found in the motion parameter.
  • a recurring motion pattern, i.e. periodicity in a signal, is identified in the measured signal by means of an autocorrelation function. Autocorrelation indicates the correlation between the signal values and the previous values, i.e. in that case, there is no need to utilize previously stored reference motion patterns or usage context data in the identification of the motion pattern.
  • the procedure may be for instance such that a reference sample of a given length is taken from the signal to be measured, such as a z acceleration signal.
  • the sampling can be timed for instance at such a point of the signal when the signal distinctly deviates from the basic level indicating immobility.
  • the reference sample taken can then be slid over the z signal to be measured, and if the reference sample corresponds with some predetermined accuracy to a later signal sample, the conclusion is that the motion pattern has recurred.
  • threshold conditions can be set on the recurrence of the motion pattern, such as that the detected pattern recurs sufficiently often and that the congruity of the pattern in respect of the measured data is sufficiently significant, for example.
  • attention is also paid in the device to the fact that the duration in time and amplitude of the motion pattern may change slidingly in time.
  • the motion pattern may also look different to the device when the device is in a pocket or in the hand, for example.
  • other irregularities in a recurring motion pattern, detected at given points in time, may be taken into account in the device. For example, even if no periodicity were detected in the signal at a given point in time, it does not necessarily mean that periodicity has disappeared from the signal.
  • a threshold condition, which may be a given time threshold value, for example, may be set on the disappearance of periodicity.
  • in method step 110, once the motion pattern is measured, information on the event is given to the user of the device in a device-originating operating situation.
  • in method step 112, the effect of the identified motion pattern on one or more motion parameters is corrected.
  • a signal according to the measured motion pattern is directly subtracted from the measured motion parameter in order to obtain a corrected motion parameter value.
  • threshold values employed for general motion identification are adjusted in the device. For example, if a mobile telephone allows an incoming call to be answered, i.e. the device can be controlled by a swinging gesture of the magnitude of threshold value 'k', the threshold value may be raised to level '1.3*k', for example, during an identified motion pattern, the new level being employed for controlling the device in the manner illustrated by step 114.
  • the gestures employed for controlling the device may be stored in the device in advance or the user himself may teach the device the desired control gestures, which may be e.g. turns, swings, tilts, taps or the like.
  • a given threshold value set of acceleration signal values during a given period of time, for example, is generated for each gesture. Later, a gesture may be detected in the device such that one or more acceleration signals measured fulfil the threshold condition determined for it in advance.
  • a threshold condition refers to a series of acceleration component values in a given order and during a given time, for example.
  • the order and/or time limits may be interpreted more strictly or loosely depending on whether the intention is to emphasize that the system does not accidentally interpret some user motions unintentionally as gestures or that the device will not erroneously fail to identify the correct gestures performed by the user.
  • when the device detects that the user is performing a gesture, the device aims at separately identifying the periodicity associated with the gesture. There is no need to eliminate such gesture-related periodicity.
  • an example of gesture-related periodicity is that if the gesture performed by the user is a tap, the mechanics of the device may remain vibrating for a moment, wherefore a gesture-related periodic component is visible in the motion of the device.
  • the device aims at identifying a change occurring in an identified motion pattern at the beginning of a control motion. In other words, for example, if the user of a mobile telephone is in a car, the device is subjected to mechanical vibration as a motion pattern. If a call is incoming to the mobile telephone, the device measures the mechanical vibration before issuing an alarm to the user.
  • Figures 2, 3 and 4 illustrate the identification steps of the motion pattern and the gesture described in connection with Figure 1.
  • said figures show a signal 200, 300, 400 to be measured, as a uniplanar Y signal component, but in practice the signal to be measured/compared may also be a sum vector composed of several components.
  • a person can be thought to be walking, whereby a periodically recurring motion pattern is formed in the Y signal component 200 and includes signal peaks 200A and 200B.
  • a motion pattern 202, descriptive of a person's walk, has been stored in the device or taught to the device in advance.
  • the motion pattern 202 is slid on the time axis over the signal 200 measured and at point 202', the data stored in the motion pattern 202 and the signal peak 200B measured are observed to be congruent enough in order for the signal 200 measured to be interpreted, in the device, to represent a person's walk. It is evident that at the initial moment of the measurement, the device does not necessarily know that a person is walking, for which reason the measured signal may have to be compared in the device with several motion patterns descriptive of different operating situations.
  • Figure 3 illustrates a misidentification problem in an electronic device employing motion identification.
  • FIG. 5 shows an electronic device 500 according to an embodiment.
  • the device 500 comprises a control unit 502 that can be implemented by software in a general-purpose processor, for example.
  • the task of the control unit is to coordinate the operation of the device.
  • the control unit 502 communicates with a memory unit 504 in the device.
  • Motion patterns and/or gestures, for example, can be either stored in the memory as a factory setting or taught by the user.
  • the device may also comprise a user interface 506.
  • the user interface may comprise a keyboard, a display, a microphone and a loudspeaker.
  • the keyboard and the display can be used to control the operation of the device by means of menus, for example.
  • a given gesture can be taught for instance by the user selecting a teaching function from a menu by means of a keyboard and a display, and selecting the starting and end times of the teaching by means of the keyboard.
  • the device may be controlled not only with the keyboard, but also by means of voice or gestures, for example.
  • the electronic device also comprises an acceleration measurement unit 508, which can be implemented by means of one or more linear acceleration sensors and/or one or more angular acceleration sensors, for example.
  • the device may comprise an identification unit 510, which aims at identifying a given motion pattern in the data measured by the measurement unit 508.
  • the identification unit may aim at identifying the motion pattern either by comparing the data measured with a reference pattern stored in the memory 504 or by aiming at identifying a new motion pattern without a previously stored reference pattern.
  • the identification unit 510 may compare the motion information measured by the measurement unit with the control motions, such as gestures, stored in the memory.
  • the identification unit may eliminate the effect of an identified motion pattern from the control motion, thus promoting the identification of the control motion.
  • the invention is implementable in an electronic device by software storable in a processor, for example.
  • the software includes one or more software routines for executing the method steps of the method according to the invention.
  • the invention is also implementable with an application-specific integrated circuit (ASIC) or with separate logic components.
  • ASIC (application-specific integrated circuit)

Abstract

An electronic device (500) comprising means (510) for identifying a motion pattern in a motion of the device, and means (510) for eliminating the effect of the motion pattern from a control motion employed for controlling the device. The identification unit (510) eliminates the effect of an identified motion pattern from the control motion by comparing the motion information measured by the measurement unit (508) with the control motion stored in the memory (504).

Description

CONTROLLING AN ELECTRONIC DEVICE
BACKGROUND OF THE INVENTION

[0001] The invention relates to identifying movement in a mobile environment and particularly to utilizing the identified movement in controlling a device.

[0002] Several different manners of controlling an electronic device, such as a mobile telephone, have been developed. Alongside keyboard-based control, control methods include those wherein the control is based on voice and gestures, for example. In a prior art device, the display can be implemented in such a manner that, irrespective of changes in the orientation of the device, the text of the display can always be read vertically. It is also known to zoom the display by turning the device. Solutions based on acceleration sensors have also been utilized for instance for replacing the keyboard of a computer in such a manner that a given finger position is associated with a given character input.

[0003] Said prior art applications are associated with the significant drawback that the control does not take into consideration the environment or the operating situation wherein the device is being used or identification is being performed. For this reason, should any essential motion-based disturbing factors be associated with the environment or operating situation, the risk of misrecognitions is apparent and substantially compromises the usability of the device.
BRIEF DESCRIPTION OF THE INVENTION

[0004] The object of the invention is thus to provide an improved method and an apparatus for implementing the method in a manner that better takes into consideration the operating situation and/or environment of the device. Accordingly, the object of the invention is a method of controlling an electronic device, comprising identifying a motion pattern in the motion of the device and eliminating the effect of the identified motion pattern from a control motion used for controlling the device.

[0005] The invention also relates to a software product comprising a software routine for receiving measurement information descriptive of a motion of the device, a software routine for identifying a motion pattern in the measurement information, and a software routine for eliminating the effect of the identified motion pattern from a control motion used for controlling the device and included in the measurement information.

[0006] The invention also relates to an electronic device comprising means for identifying a motion pattern in a motion of the device, and means for eliminating the effect of the identified motion pattern from a control motion used for controlling the device.

[0007] Preferred embodiments of the invention are described in the dependent claims.

[0008] The invention is based on aiming at identifying, in an electronic device, whether the device is susceptible to an identifiable motion pattern. An identifiable motion pattern may be directed to an electronic device for instance when the device is subjected to mechanical vibration. Herein, mechanical vibration refers to a recurring motion directed to the device when the device is in a train or a car, for example. In association with the description of the invention, an identifiable motion pattern may also refer to a motion pattern corresponding to the walk of a person carrying the device, for example.

[0009] In accordance with the invention, the motion pattern is identified and its effect is eliminated from the device control motion. The control motion is a gesture, such as a turn or a swing of the device, for example. The control motion may also be a tap on the device, for example.

[0010] The device according to the invention may be e.g. a mobile telephone, a portable computer or another corresponding device enabling motion identification.

[0011] An advantage of the method and device of the invention is that the control motions intended to control the device can be identified considerably better and with fewer erroneous identifications once an identified disturbance is eliminated from the control motions.
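The method of paragraphs [0004] to [0009] amounts to removing an identified, recurring background motion from the signal in which control gestures are detected. The following is only a minimal illustrative sketch of that idea, not the implementation claimed in the patent; the function name, the NumPy representation and the assumption that the identified pattern is already phase-aligned with the measurement are choices made for this example.

```python
import numpy as np

def eliminate_motion_pattern(measured: np.ndarray, pattern: np.ndarray) -> np.ndarray:
    """Subtract a recurring motion pattern from a measured acceleration signal.

    One period of the identified disturbance (e.g. walking or car vibration)
    is tiled over the length of the measurement and removed, leaving mainly
    the control motion performed by the user.
    """
    reps = int(np.ceil(len(measured) / len(pattern)))
    background = np.tile(pattern, reps)[:len(measured)]
    return measured - background

# Hypothetical usage: 'samples' is a buffer of acceleration values and
# 'walk_pattern' one identified period of the user's walk.
samples = np.sin(np.linspace(0, 20 * np.pi, 1000)) + np.random.normal(0, 0.05, 1000)
walk_pattern = np.sin(np.linspace(0, 2 * np.pi, 100))
cleaned = eliminate_motion_pattern(samples, walk_pattern)
```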
BRIEF DESCRIPTION OF THE FIGURES

[0012] In the following, the invention will be described in more detail in connection with preferred embodiments with reference to the accompanying drawings, in which

Figure 1 shows an embodiment of the method of the invention;
Figure 2 illustrates the identification of a motion pattern according to an embodiment;
Figure 3 illustrates the identification of a motion pattern according to an embodiment;
Figure 4 illustrates a measurement signal filtered from a known motion pattern;
Figure 5 shows an electronic device according to an embodiment as a block diagram.
DETAILED DESCRIPTION OF THE INVENTION

[0013] In the following, an embodiment of the method according to the invention will be described by means of Figure 1. In the initial step 102 of the method, a given reference motion pattern is stored in an electronic device. The reference motion pattern can be stored in the device for instance at the factory in connection with the manufacture of the device. The stored reference motion patterns may describe the operating environment of the device, for instance that the device is in a train or carried by a person riding a bicycle. The patterns stored in the device as a factory setting may be based for instance on a large number of operating situation examples, from which an average motion pattern is generated. Alternatively, the device may comprise several alternative patterns for a given operating environment, such as a train.

[0014] In addition to the reference motion patterns stored as a factory setting, the user may teach the device the desired patterns. For example, the user may teach the device a reference motion pattern corresponding to his walk by depressing a given key at the start and the end of the teaching. The device stores the data between the keystrokes and analyses it by searching the data for acceleration signal values recurring in a certain manner, for example.
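A minimal sketch of the teaching flow of paragraph [0014]: acceleration samples are buffered between the start and end keystrokes and stored under a label as a reference motion pattern. The class and method names are illustrative assumptions and not taken from the patent.

```python
import numpy as np

class PatternTeacher:
    """Collects acceleration samples between 'teach start' and 'teach end' keystrokes."""

    def __init__(self):
        self.recording = False
        self.buffer: list[float] = []
        self.reference_patterns: dict[str, np.ndarray] = {}

    def on_key_start(self) -> None:
        self.recording = True
        self.buffer = []

    def on_sample(self, acceleration: float) -> None:
        # Called for every new accelerometer reading while teaching is active.
        if self.recording:
            self.buffer.append(acceleration)

    def on_key_end(self, label: str) -> None:
        # Store the taught data, e.g. under the label "walk", as a reference pattern.
        self.recording = False
        self.reference_patterns[label] = np.asarray(self.buffer)
```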
[0015] Generally, it is advantageous to keep the motion measurement, which consumes much energy, switched off in an electronic device, such as a mobile telephone. For example, conditions may be set in the device as to when motion measurement is activated. As regards the condition to be checked, two different operating situations can be distinguished, device-originating and user-originating operating situations. The device-originating operating situation according to step 104 refers to an operating situation wherein the device is aware of the event before the user is. For example, in the case of a mobile telephone, a device-terminating call is an example of a device-originating event. The mobile telephone is aware of the incoming call based on the signalling preceding the call, and is thus able to detect the start of the device-originating event on the basis of the start of said signalling. Other examples of device-originating events that can be brought forward include for instance a short message arriving at the mobile telephone or a timer triggering off, e.g. an alarm clock or a calendar alarm in an electronic device.

[0016] A user-originating operating situation refers to an event originating from the user. In a user-originating operating situation, the device may deduce the start of the use of the device on the basis of a given initial impulse, for example. Herein, an initial impulse refers to a function by means of which the device is able to conclude the start of the use. As an example of an initial impulse, opening of the keypad lock may be mentioned. Figure 1 shows an embodiment of a device-originating event, but it can also be applied to a user-originating event with the exception of steps 104 and 110.

[0017] In an embodiment, the start of a device-originating or user-originating situation initiates motion measurement in the device in accordance with step 106.

[0018] Although conditions may be set on motion measurement, continuous motion status measurement in the device is also feasible. For example, in a user-originating situation, the device may operate such that the device continuously aims at identifying gestures by comparing a measured motion with the threshold values of one or more gestures. The device may also record its motion in a memory for a given time, such as for the duration of 10 seconds, for example. If uncertainty exists at a given point in time whether the user performed a gesture, the recorded data may be reverted to and attempts may be made to identify the motion pattern in the data. This may improve the gesture identification performed at said point in time, once the identified motion pattern can be filtered off. Motion status measurement can also be performed periodically in the device. As an example may be mentioned recurring signalling, based on location determination, for example, between a mobile telephone and a network, allowing the motion status to be measured always when the device has to be activated also otherwise because of the signalling.

[0019] Step 106 describes motion measurement in an electronic device. Motion may be measured by means of one or more motion parameters, such as an acceleration parameter, for example. Acceleration measurement may be performed for instance in three mutually perpendicular linear directions: directions x, y and z. In addition to acceleration measurement in said linear directions, angular acceleration may also be measured in the device by means of a magnetometer or a gyroscope, for example.

[0020] In step 108, an attempt is made to identify a motion pattern possibly detectable in the motion of the device. In principle, the motion pattern may be identified in two different manners, either by comparing the motion with a previously stored/taught reference motion pattern or by aiming at identifying some new motion pattern in the data measured.

[0021] Attempts may be made to identify a motion pattern motion parameter-specifically, for instance by studying the x-oriented linear component and the y-oriented linear component separately. In identifying a motion pattern, several motion parameters may also be studied together as a whole. In this case, the sum vector composed of the acceleration components can be compared with a predetermined threshold value. In the case of a three-dimensional vector, the orientation of the device may be checked from time to time and, if necessary, taken into consideration when adjusting the direction of the sum vector.

[0022] When the motion parameter measured is compared with a previously stored reference motion pattern, the comparison can be carried out for a given predetermined period of time. If the correlation between the motion parameter and the reference pattern is sufficiently high during the period of time measured, it may be stated that the reference motion pattern was found in the motion parameter. In a preferred embodiment, a recurring motion pattern, i.e. periodicity in a signal, is identified in the measured signal by means of an autocorrelation function. Autocorrelation indicates the correlation between the signal values and the previous values, i.e. in that case, there is no need to utilize previously stored reference motion patterns or usage context data in the identification of the motion pattern.
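Paragraph [0022] mentions detecting periodicity with an autocorrelation function, without any stored reference pattern. One plausible way to do this is sketched below; the lag range and the 0.6 acceptance threshold are arbitrary illustrative values, not taken from the patent.

```python
import numpy as np

def find_recurring_period(signal: np.ndarray, min_lag: int = 20, max_lag: int = 400,
                          threshold: float = 0.6) -> int | None:
    """Return the lag (in samples) of a recurring motion pattern, or None.

    Normalized autocorrelation is evaluated for lags in [min_lag, max_lag];
    if the best lag correlates strongly enough with the signal itself,
    the signal is considered periodic with that period.
    """
    x = signal - np.mean(signal)
    denom = np.dot(x, x)
    if denom == 0.0:
        return None  # flat signal: no motion at all
    best_lag, best_corr = None, 0.0
    for lag in range(min_lag, min(max_lag, len(x) - 1)):
        corr = np.dot(x[:-lag], x[lag:]) / denom
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_lag if best_corr >= threshold else None
```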
[0023] In the identification of a new motion pattern, the procedure may be for instance such that a reference sample of a given length is taken from the signal to be measured, such as a z acceleration signal. The sampling can be timed for instance at such a point of the signal when the signal distinctly deviates from the basic level indicating immobility. The reference sample taken can then be slid over the z signal to be measured, and if the reference sample corresponds with some predetermined accuracy to a later signal sample, the conclusion is that the motion pattern has recurred. It is evident that threshold conditions can be set on the recurrence of the motion pattern, such as that the detected pattern recurs sufficiently often and that the congruity of the pattern in respect of the measured data is sufficiently significant, for example. Once the motion pattern is found, attempts can still be made separately to identify the length and correct position in the time domain of the pattern. This refers to the fact that at the initial stage, the reference sample did not necessarily hit the right position in the time domain, but was in the middle of changes occurring in the signal, for example. Once the correct position and length in the time domain are found for the reference sample, the sample can be employed in correcting a measured motion parameter.
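The sliding comparison of a reference sample against the measured signal described in paragraph [0023] (and against a stored walking pattern in Figure 2) is essentially template matching. A small illustrative version using normalized correlation is given below; the 0.8 congruity threshold and the function name are assumptions for this sketch.

```python
import numpy as np

def pattern_occurrences(signal: np.ndarray, template: np.ndarray,
                        congruity: float = 0.8) -> list[int]:
    """Slide 'template' over 'signal' and return start indices where they match.

    A match is declared when the normalized correlation between the template
    and a signal window of the same length exceeds the congruity threshold.
    """
    t = template - np.mean(template)
    t_norm = np.linalg.norm(t)
    hits = []
    for start in range(len(signal) - len(template) + 1):
        window = signal[start:start + len(template)]
        w = window - np.mean(window)
        w_norm = np.linalg.norm(w)
        if t_norm == 0.0 or w_norm == 0.0:
            continue
        if np.dot(t, w) / (t_norm * w_norm) >= congruity:
            hits.append(start)
    return hits
```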
[0024] In a preferred embodiment, attention is also paid in the device to the fact that the duration in time and amplitude of the motion pattern may change slidingly in time. The motion pattern may also look different to the device when the device is in a pocket or in the hand, for example.

[0025] Furthermore, other irregularities in a recurring motion pattern, detected at given points in time, may be taken into account in the device. For example, even if no periodicity were detected in the signal at a given point in time, it does not necessarily mean that periodicity has disappeared from the signal. In other words, a threshold condition, which may be a given time threshold value, for example, may be set on the disappearance of periodicity. In this case, if periodicity is not detected during a period of time longer than the threshold value, it may be concluded to have disappeared.

[0026] In method step 110, once the motion pattern is measured, information on the event is given to the user of the device in a device-originating operating situation.

[0027] In method step 112, the effect of the identified motion pattern on one or more motion parameters is corrected. In an embodiment, a signal according to the measured motion pattern is directly subtracted from the measured motion parameter in order to obtain a corrected motion parameter value. According to another embodiment, threshold values employed for general motion identification are adjusted in the device. For example, if a mobile telephone allows an incoming call to be answered, i.e. the device can be controlled by a swinging gesture of the magnitude of threshold value 'k', the threshold value may be raised to level '1.3*k', for example, during an identified motion pattern, the new level being employed for controlling the device in the manner illustrated by step 114. The gestures employed for controlling the device may be stored in the device in advance or the user himself may teach the device the desired control gestures, which may be e.g. turns, swings, tilts, taps or the like. In association with teaching or storage, a given threshold value set of acceleration signal values during a given period of time, for example, is generated for each gesture. Later, a gesture may be detected in the device such that one or more acceleration signals measured fulfil the threshold condition determined for it in advance. Herein, a threshold condition refers to a series of acceleration component values in a given order and during a given time, for example. At the identification stage, the order and/or time limits may be interpreted more strictly or loosely depending on whether the intention is to emphasize that the system does not accidentally interpret some user motions unintentionally as gestures or that the device will not erroneously fail to identify the correct gestures performed by the user. In a preferred embodiment, when the device detects that the user is performing a gesture, the device aims at separately identifying the periodicity associated with the gesture. There is no need to eliminate such gesture-related periodicity. An example of gesture-related periodicity is that if the gesture performed by the user is a tap, the mechanics of the device may remain vibrating for a moment, wherefore a gesture-related periodic component is visible in the motion of the device.
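Paragraph [0027] gives two correction strategies: subtracting the identified pattern directly from the motion parameter (sketched earlier), or raising the gesture threshold from 'k' to about '1.3*k' while a disturbing pattern is present. The second strategy is illustrated below; the factor 1.3 comes from the text, while the names and the simple amplitude test are assumptions for this sketch.

```python
import numpy as np

def gesture_threshold(k: float, pattern_identified: bool) -> float:
    """Raise the gesture threshold while a disturbing motion pattern is identified."""
    return 1.3 * k if pattern_identified else k

def swing_detected(acceleration: np.ndarray, k: float, pattern_identified: bool) -> bool:
    # A swing gesture (e.g. answering a call) is accepted when the measured
    # amplitude exceeds the possibly raised threshold, cf. steps 112 and 114.
    return bool(np.max(np.abs(acceleration)) > gesture_threshold(k, pattern_identified))
```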
[0028] In a preferred embodiment, the device aims at identifying a change occurring in an identified motion pattern at the beginning of a control motion. In other words, for example, if the user of a mobile telephone is in a car, the device is subjected to mechanical vibration as a motion pattern. If a call is incoming to the mobile telephone, the device measures the mechanical vibration before issuing an alarm to the user. At the instant of the alarm, if the mobile telephone is in a pocket, for example, the device is momentarily subjected to a different acceleration than previously when the user takes the device from the pocket into the hand. A momentary acceleration associated with the user's reaction can be ignored. The mechanical vibration caused by the movement of the car can also be seen in the device in a different manner in the hand than what the vibration looked like when the device was in the pocket.

[0029] Figures 2, 3 and 4 illustrate the identification steps of the motion pattern and the gesture described in connection with Figure 1. For the sake of simplicity, said figures show a signal 200, 300, 400 to be measured, as a uniplanar Y signal component, but in practice the signal to be measured/compared may also be a sum vector composed of several components. In the example shown in Figure 2, a person can be thought to be walking, whereby a periodically recurring motion pattern is formed in the Y signal component 200 and includes signal peaks 200A and 200B. A motion pattern 202, descriptive of a person's walk, has been stored in the device or taught to the device in advance. The motion pattern 202 is slid on the time axis over the signal 200 measured and at point 202', the data stored in the motion pattern 202 and the signal peak 200B measured are observed to be congruent enough in order for the signal 200 measured to be interpreted, in the device, to represent a person's walk. It is evident that at the initial moment of the measurement, the device does not necessarily know that a person is walking, for which reason the measured signal may have to be compared in the device with several motion patterns descriptive of different operating situations.

[0030] Figure 3 illustrates a misidentification problem in an electronic device employing motion identification. The assumption is that a threshold value 302 is specified in the device, and a signal whose amplitude in the device exceeds said value is interpreted as a gesture that initiates a predetermined function in the device. In the case of Figure 3, a signal peak 300A caused by walking would be erroneously interpreted as a gesture initiating a function. However, the gesture the user means is executed only at point 300B of the signal to be measured, at which point the gesture meant by the user is summed to the walking signal peak.

[0031] Figure 4 shows a signal according to Figure 3, from which the recurring motion pattern caused by walking is filtered off. Signal peak 400B, exceeding a threshold value 402 and descriptive of an actual gesture performed by the user, is easily detectable in a remaining measured signal 400.

[0032] Figure 5 shows an electronic device 500 according to an embodiment. The device 500 comprises a control unit 502 that can be implemented by software in a general-purpose processor, for example. The task of the control unit is to coordinate the operation of the device. For example, the control unit 502 communicates with a memory unit 504 in the device. Motion patterns and/or gestures, for example, can be either stored in the memory as a factory setting or taught by the user. The device may also comprise a user interface 506. For example, in the case of a mobile telephone, the user interface may comprise a keyboard, a display, a microphone and a loudspeaker. The keyboard and the display can be used to control the operation of the device by means of menus, for example. In a mobile telephone, a given gesture can be taught for instance by the user selecting a teaching function from a menu by means of a keyboard and a display, and selecting the starting and end times of the teaching by means of the keyboard. Naturally, the device may be controlled not only with the keyboard, but also by means of voice or gestures, for example.

[0033] The electronic device according to Figure 5 also comprises an acceleration measurement unit 508, which can be implemented by means of one or more linear acceleration sensors and/or one or more angular acceleration sensors, for example. Furthermore, the device may comprise an identification unit 510, which aims at identifying a given motion pattern in the data measured by the measurement unit 508. The identification unit may aim at identifying the motion pattern either by comparing the data measured with a reference pattern stored in the memory 504 or by aiming at identifying a new motion pattern without a previously stored reference pattern.

[0034] Furthermore, the identification unit 510 may compare the motion information measured by the measurement unit with the control motions, such as gestures, stored in the memory. The identification unit may eliminate the effect of an identified motion pattern from the control motion, thus promoting the identification of the control motion.

[0035] The invention is implementable in an electronic device by software storable in a processor, for example. In this case, the software includes one or more software routines for executing the method steps of the method according to the invention. The invention is also implementable with an application-specific integrated circuit (ASIC) or with separate logic components.
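Paragraphs [0032] to [0035] describe the block structure of Figure 5: a measurement unit 508 feeding an identification unit 510 that uses patterns and gestures held in memory 504. The class below is only a rough, self-contained software analogue of that arrangement; the class name, its interface and the simple amplitude test are illustrative assumptions, not the patented design.

```python
import numpy as np

class IdentificationUnit:
    """Rough software analogue of block 510 in Figure 5."""

    def __init__(self, reference_pattern: np.ndarray, gesture_threshold_k: float):
        self.reference_pattern = reference_pattern   # disturbance pattern held in memory 504
        self.k = gesture_threshold_k                 # gesture acceptance threshold

    def gesture_present(self, measured: np.ndarray) -> bool:
        """Remove the stored recurring pattern, then test the gesture threshold."""
        reps = int(np.ceil(len(measured) / len(self.reference_pattern)))
        background = np.tile(self.reference_pattern, reps)[:len(measured)]
        cleaned = measured - background              # cf. Figure 4: walking filtered off
        return bool(np.max(np.abs(cleaned)) > self.k)
```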
[0036] It is obvious to a person skilled in the art that, as technology advances, the basic idea of the invention can be implemented in a variety of ways. Consequently, the invention and its embodiments are not restricted to the above examples, but can vary within the scope of the claims.

Claims

CLAIMS

1. A method of controlling an electronic device, comprising: identifying (108) a motion pattern in the motion of the device, c h a r a c t e r i z e d by: eliminating (112) the effect of the identified motion pattern from a control motion used for controlling the device.

2. A method as claimed in claim 1, wherein the motion pattern identified in the motion of the device is a recurring motion pattern.

3. A method as claimed in claim 1, wherein the motion pattern is identified before the start of the control motion.

4. A method as claimed in claim 1, wherein: a motion pattern detected before the start of the control motion and a motion pattern during the control motion are identified in the device; only the motion pattern detected before the start of the control motion is eliminated from the control motion.

5. A method as claimed in claim 1, wherein: in a device-originating event, a predetermined period of time is allowed to pass before information about the event is given to a user of the device; the motion pattern is identified during said period of time.

6. A method as claimed in claim 1, wherein: in a user-originating event, an initiation impulse indicative of the start of the use of the device is received from a user; the motion pattern is identified after reception of the initiation impulse.

7. A method as claimed in claim 1, wherein: the motion of the device is measured by means of one or more motion parameters.

8. A method as claimed in claim 7, wherein: one or more motion parameters measured are compared with a reference motion pattern stored in advance in the device; the reference motion pattern is accepted as identified when the comparison between one or more motion parameters measured and the reference motion pattern fulfils a predetermined threshold condition.

9. A method as claimed in claim 7, wherein: an autocorrelation function is generated from the values of one or more motion parameters measured; a recurring motion pattern is identified from the autocorrelation function generated.

10. A method as claimed in claim 7, wherein: the one or more motion parameters measured are compared with a control motion pattern stored in the device in advance; the control motion is accepted as identified when the comparison between one or more motion parameters measured and the control motion pattern fulfils a predetermined threshold condition.

11. A method as claimed in claim 1, wherein the electronic device is a mobile telephone.

12. A method as claimed in claim 2, wherein the recurring motion pattern is mechanical vibration.

13. A software product, c h a r a c t e r i z e d in that the software product comprises: a software routine for receiving measurement information descriptive of a motion of the device, a software routine for identifying a motion pattern in the measurement information, and a software routine for eliminating the effect of the motion pattern from a control motion used for controlling the device and included in the measurement information.

14. An electronic device, c h a r a c t e r i z e d in that the device comprises: means for identifying (510) a motion pattern in a motion of the device; and means for eliminating (510) the effect of the motion pattern from a control motion used for controlling the device.

15. A device as claimed in claim 14, wherein the motion pattern to be identified is a recurring motion pattern.
16. A device as claimed in claim 14, wherein the identification means are configured to identify the motion pattern before the start of the control motion.

17. A device as claimed in claim 14, wherein: the identification means are configured to identify a motion pattern detected before the start of the control motion and a motion pattern during the control motion; and the elimination means are configured to eliminate only the motion pattern detected before the start of the control motion from the control motion.

18. A device as claimed in claim 14, comprising: means for detecting the start of a device-originating event; means to wait a predetermined period of time before a user of the device is given information about the event; and the identification means are configured to identify the motion pattern during said period of time.

19. A device as claimed in claim 14, comprising: means for receiving an initiation impulse indicative of the start of the use from a user of the device; and the identification means are configured to identify the motion pattern after reception of the initiation impulse.

20. A device as claimed in claim 14, comprising: means for measuring the motion of the device by means of one or more motion parameters.

21. A device as claimed in claim 20, the identification means being configured to: compare one or more motion parameters measured with a reference motion pattern stored in advance in the device; accept the reference motion pattern as identified when the comparison between one or more motion parameters measured and the reference motion pattern fulfils a predetermined threshold condition.

22. A device as claimed in claim 20, wherein the identification means are configured to: generate an autocorrelation function from the values of one or more motion parameters measured; and identify a recurring motion pattern from the autocorrelation function generated.

23. A device as claimed in claim 20, comprising: means for comparing one or more motion parameters measured with a control motion pattern stored in advance in the device; means for accepting the control motion as identified when the comparison between one or more motion parameters measured and the control motion pattern fulfils a predetermined threshold condition.

24. A device as claimed in claim 14, the electronic device being a mobile telephone.
PCT/FI2005/050226 2004-06-24 2005-06-22 Controlling an electronic device WO2006000639A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020067026992A KR20070032709A (en) 2004-06-24 2005-06-22 Control of electronic devices
US11/597,883 US20070225935A1 (en) 2004-06-24 2005-06-22 Controlling an Electronic Device
EP05756279A EP1782165A4 (en) 2004-06-24 2005-06-22 Controlling an electronic device
JP2007517321A JP2008503816A (en) 2004-06-24 2005-06-22 Electronic device control

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20045239 2004-06-24
FI20045239A FI119746B (en) 2004-06-24 2004-06-24 Control of an electronic device

Publications (1)

Publication Number Publication Date
WO2006000639A1 true WO2006000639A1 (en) 2006-01-05

Family

ID=32524613

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2005/050226 WO2006000639A1 (en) 2004-06-24 2005-06-22 Controlling an electronic device

Country Status (7)

Country Link
US (1) US20070225935A1 (en)
EP (1) EP1782165A4 (en)
JP (1) JP2008503816A (en)
KR (1) KR20070032709A (en)
CN (1) CN100456213C (en)
FI (1) FI119746B (en)
WO (1) WO2006000639A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1806643A1 (en) * 2006-01-06 2007-07-11 Tcl & Alcatel Mobile Phones Limited Method for entering commands and/or characters for a portable communication device equipped with a tilt sensor
EP2343626A3 (en) * 2009-12-25 2011-08-24 Fujitsu Limited Detected information correction apparatus and method
EP2418565A1 (en) * 2010-08-12 2012-02-15 Research In Motion Limited Method and electronic device with motion compensation
US8456430B2 (en) 2009-08-21 2013-06-04 Motorola Mobility Llc Tactile user interface for an electronic device
US8952832B2 (en) 2008-01-18 2015-02-10 Invensense, Inc. Interfacing application programs and motion sensors of a device
US8960002B2 (en) 2007-12-10 2015-02-24 Invensense, Inc. Vertically integrated 3-axis MEMS angular accelerometer with integrated electronics
US8997564B2 (en) 2007-07-06 2015-04-07 Invensense, Inc. Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics
EP2414798A4 (en) * 2009-03-30 2015-05-06 Kionix Inc Directional tap detection algorithm using an accelerometer
US9292102B2 (en) 2007-01-05 2016-03-22 Invensense, Inc. Controlling and accessing content using motion processing on mobile devices
US10732718B2 (en) 2009-06-12 2020-08-04 Samsung Electronics Co., Ltd. Apparatus and method for motion detection in portable terminal

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7920694B2 (en) * 2006-02-03 2011-04-05 Immersion Corporation Generation of consistent haptic effects
US8902154B1 (en) * 2006-07-11 2014-12-02 Dp Technologies, Inc. Method and apparatus for utilizing motion user interface
US20090265671A1 (en) * 2008-04-21 2009-10-22 Invensense Mobile devices with motion gesture recognition
US8555282B1 (en) 2007-07-27 2013-10-08 Dp Technologies, Inc. Optimizing preemptive operating system with motion sensing
US8996332B2 (en) * 2008-06-24 2015-03-31 Dp Technologies, Inc. Program setting adjustments based on activity identification
US8279242B2 (en) * 2008-09-26 2012-10-02 Microsoft Corporation Compensating for anticipated movement of a device
US9529437B2 (en) 2009-05-26 2016-12-27 Dp Technologies, Inc. Method and apparatus for a motion state aware device
US20140168057A1 (en) * 2012-12-13 2014-06-19 Qualcomm Incorporated Gyro aided tap gesture detection
US9691382B2 (en) * 2013-03-01 2017-06-27 Mediatek Inc. Voice control device and method for deciding response of voice control according to recognized speech command and detection output derived from processing sensor data
CN110413135A (en) * 2018-04-27 2019-11-05 开利公司 Posture metering-in control system and operating method
CN110415387A (en) * 2018-04-27 2019-11-05 开利公司 Posture metering-in control system including the mobile device being arranged in the receiving member carried by user
CN110415389B (en) 2018-04-27 2024-02-23 开利公司 Gesture access control system and method for predicting location of mobile device relative to user
FR3089319A1 (en) * 2018-12-04 2020-06-05 Orange Method for evaluating the bodily activity of a user

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2347593A (en) * 1999-01-06 2000-09-06 Motorola Inc Use of motion to input information into a radio telephone
US6369794B1 (en) * 1998-09-09 2002-04-09 Matsushita Electric Industrial Co., Ltd. Operation indication outputting device for giving operation indication according to type of user's action
JP2002171316A (en) * 2000-11-30 2002-06-14 Toshiba Corp Mobile communication terminal
JP2002207703A (en) * 2001-01-11 2002-07-26 Sony Corp Electronic equipment
US6573883B1 (en) * 1998-06-24 2003-06-03 Hewlett Packard Development Company, L.P. Method and apparatus for controlling a computing device with gestures
WO2003077087A2 (en) * 2002-03-13 2003-09-18 Philips Intellectual Property & Standards Gmbh Portable electronic device having means for registering its arrangement in space
WO2004082248A1 (en) * 2003-03-11 2004-09-23 Philips Intellectual Property & Standards Gmbh Configurable control of a mobile device by means of movement patterns
US20040227742A1 (en) * 2002-08-06 2004-11-18 Sina Fateh Control of display content by movement on a fixed spherical space

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000148351A (en) * 1998-09-09 2000-05-26 Matsushita Electric Ind Co Ltd Operation instruction output device giving operation instruction in accordance with kind of user's action and computer-readable recording medium
US6466198B1 (en) * 1999-11-05 2002-10-15 Innoventions, Inc. View navigation and magnification of a hand-held device with a display
JP3582433B2 (en) * 1999-12-02 2004-10-27 日本電気株式会社 Information processing apparatus and information processing method
US6847351B2 (en) * 2001-08-13 2005-01-25 Siemens Information And Communication Mobile, Llc Tilt-based pointing for hand-held devices
KR20020091002A (en) * 2001-11-06 2002-12-05 주식회사 와이어리스리퍼블릭 Apparatus and method for capturing and working acceleration, and application thereof, and computer readable recording medium storing programs for realizing the acceleration capturing and working methods
DE60215504T2 (en) * 2002-10-07 2007-09-06 Sony France S.A. Method and apparatus for analyzing gestures of a human, e.g. for controlling a machine by gestures

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6573883B1 (en) * 1998-06-24 2003-06-03 Hewlett Packard Development Company, L.P. Method and apparatus for controlling a computing device with gestures
US6369794B1 (en) * 1998-09-09 2002-04-09 Matsushita Electric Industrial Co., Ltd. Operation indication outputting device for giving operation indication according to type of user's action
GB2347593A (en) * 1999-01-06 2000-09-06 Motorola Inc Use of motion to input information into a radio telephone
JP2002171316A (en) * 2000-11-30 2002-06-14 Toshiba Corp Mobile communication terminal
JP2002207703A (en) * 2001-01-11 2002-07-26 Sony Corp Electronic equipment
WO2003077087A2 (en) * 2002-03-13 2003-09-18 Philips Intellectual Property & Standards Gmbh Portable electronic device having means for registering its arrangement in space
US20040227742A1 (en) * 2002-08-06 2004-11-18 Sina Fateh Control of display content by movement on a fixed spherical space
WO2004082248A1 (en) * 2003-03-11 2004-09-23 Philips Intellectual Property & Standards Gmbh Configurable control of a mobile device by means of movement patterns

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1782165A4 *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1806643A1 (en) * 2006-01-06 2007-07-11 Tcl & Alcatel Mobile Phones Limited Method for entering commands and/or characters for a portable communication device equipped with a tilt sensor
US7562459B2 (en) 2006-01-06 2009-07-21 Tcl Communication Technology Holdings, Ltd. Method for entering commands and/or characters for a portable communication device equipped with a tilt sensor
US7810247B2 (en) 2006-01-06 2010-10-12 Ipg Electronics 504 Limited Method for entering commands and/or characters for a portable communication device equipped with a tilt sensor
USRE45411E1 (en) 2006-01-06 2015-03-17 Drnc Holdings, Inc. Method for entering commands and/or characters for a portable communication device equipped with a tilt sensor
US9292102B2 (en) 2007-01-05 2016-03-22 Invensense, Inc. Controlling and accessing content using motion processing on mobile devices
US10288427B2 (en) 2007-07-06 2019-05-14 Invensense, Inc. Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics
US8997564B2 (en) 2007-07-06 2015-04-07 Invensense, Inc. Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics
US8960002B2 (en) 2007-12-10 2015-02-24 Invensense, Inc. Vertically integrated 3-axis MEMS angular accelerometer with integrated electronics
US9846175B2 (en) 2007-12-10 2017-12-19 Invensense, Inc. MEMS rotation sensor with integrated electronics
US8952832B2 (en) 2008-01-18 2015-02-10 Invensense, Inc. Interfacing application programs and motion sensors of a device
US9342154B2 (en) 2008-01-18 2016-05-17 Invensense, Inc. Interfacing application programs and motion sensors of a device
US9811174B2 (en) 2008-01-18 2017-11-07 Invensense, Inc. Interfacing application programs and motion sensors of a device
EP2414798A4 (en) * 2009-03-30 2015-05-06 Kionix Inc Directional tap detection algorithm using an accelerometer
US10732718B2 (en) 2009-06-12 2020-08-04 Samsung Electronics Co., Ltd. Apparatus and method for motion detection in portable terminal
US8456430B2 (en) 2009-08-21 2013-06-04 Motorola Mobility Llc Tactile user interface for an electronic device
US8942759B2 (en) 2009-12-25 2015-01-27 Fujitsu Limited Detected information correction apparatus and method
EP2343626A3 (en) * 2009-12-25 2011-08-24 Fujitsu Limited Detected information correction apparatus and method
EP2418565A1 (en) * 2010-08-12 2012-02-15 Research In Motion Limited Method and electronic device with motion compensation

Also Published As

Publication number Publication date
CN100456213C (en) 2009-01-28
EP1782165A4 (en) 2010-03-10
JP2008503816A (en) 2008-02-07
KR20070032709A (en) 2007-03-22
CN1969250A (en) 2007-05-23
US20070225935A1 (en) 2007-09-27
FI119746B (en) 2009-02-27
FI20045239A (en) 2005-12-25
EP1782165A1 (en) 2007-05-09
FI20045239A0 (en) 2004-06-24

Similar Documents

Publication Publication Date Title
EP1782165A1 (en) Controlling an electronic device
US8125312B2 (en) System and method for locking and unlocking access to an electronic device
KR100537279B1 (en) Portable terminal with motion detecting function and method of motion detecting thereof
US20190018500A1 (en) Performing an action associated with a motion based input
US20080259742A1 (en) Methods and systems for controlling alarm clocks
US9176576B2 (en) Input device
US20120007836A1 (en) Touch screen unlocking device and method
CN100504317C (en) Movement detection device and movement detection method
KR101734450B1 (en) Multisensory speech detection
EP2821879A1 (en) Method for entering commands and/or characters for a portable communication device equipped with a tilt sensor
CN101771749B (en) Control method and system for a handheld communication device
KR20120003908A (en) Directional tap detection algorithm using an accelerometer
GB2347593A (en) Use of motion to input information into a radio telephone
WO2011061581A1 (en) Mobile terminals for determining a location proximate a vehicle
CN102833421A (en) Mobile terminal and reminding method
CA2611043C (en) System and method for locking and unlocking access to an electronic device
US20140016668A1 (en) Input device
KR100795750B1 (en) Method and apparatus for releasing a locking mode in a portable terminal
US9008639B2 (en) Controlling audio of a device
US20090328201A1 (en) Password input device, computer security system using the same and method thereof
CN108765900B (en) Method and system for preventing loss of a smart device by using a wearable device
KR101244885B1 (en) Antitheft portable terminal and method thereof
KR101226845B1 (en) Method and apparatus for receipt notification degree of mobile terminal
GB2357673A (en) Movement based notification of an event
CN107244305B (en) Intelligent lock control device, vehicle and control method of intelligent lock

Legal Events

Date Code Title Description
AK Designated states
Kind code of ref document: A1
Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents
Kind code of ref document: A1
Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed from 20040101)
WWE WIPO information: entry into national phase
Ref document number: 11597883
Country of ref document: US
Ref document number: 2007225935
Country of ref document: US

WWE WIPO information: entry into national phase
Ref document number: 200580020357.5
Country of ref document: CN

WWE WIPO information: entry into national phase
Ref document number: 1020067026992
Country of ref document: KR

WWE WIPO information: entry into national phase
Ref document number: 2007517321
Country of ref document: JP

NENP Non-entry into the national phase
Ref country code: DE

WWW WIPO information: withdrawn in national office
Country of ref document: DE

WWE WIPO information: entry into national phase
Ref document number: 2005756279
Country of ref document: EP

WWP WIPO information: published in national office
Ref document number: 1020067026992
Country of ref document: KR

WWP WIPO information: published in national office
Ref document number: 2005756279
Country of ref document: EP

WWP WIPO information: published in national office
Ref document number: 11597883
Country of ref document: US