WO2009016607A2 - Apparatus, methods, and computer program products providing context-dependent gesture recognition - Google Patents
- Publication number
- WO2009016607A2 (PCT/IB2008/053088)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72454—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72457—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to geographic location
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Abstract
The invention enables the use of context-dependent gestures, for example, in order to assist in the automation of one or more tasks. In one exemplary embodiment, an apparatus senses a predefined gesture and, in conjunction with context information (e.g., location information), performs a predefined action in response to the gesture. As non-limiting examples, the gesture may involve movement of the apparatus (e.g., shaking, tapping) or movement relative to the apparatus (e.g., using a touch screen). In one exemplary embodiment of the invention, a method includes: obtaining context information for an apparatus, wherein the context information includes a predefined context (91); and in response to sensing a predefined movement associated with the predefined context, performing, by the apparatus, a predefined action, wherein the predefined movement includes a movement of or in relation to the apparatus (92).
Description
APPARATUS, METHODS, AND COMPUTER PROGRAM PRODUCTS PROVIDING CONTEXT-DEPENDENT GESTURE RECOGNITION
TECHNICAL FIELD:
The exemplary embodiments of this invention relate generally to electronic devices and, more specifically, relate to a user interface.
BACKGROUND:
Some conventional electronic devices are capable of automating various tasks. For example, most cellular phones can store phone numbers for a speed dial function, whereby a user can place a phone call by pressing and holding one button. As another example, some cellular phones support some form of voice recognition that enables a user to operate the phone by speaking commands. In addition to convenience, such options also provide additional accessibility.
Some electronic devices recognize "gestures" in a limited manner. For example, a game console, the Nintendo® Wii®, utilizes a motion-sensitive controller that enables a user to interact with the console through controller movement. As another example, various computer programs, such as the Opera® and Firefox® web browsers, recognize "mouse gestures" (i.e., gestures performed via predefined mouse/cursor movements, usually in combination with certain mouse button or key presses).
SUMMARY:
The below summary section is intended to be merely exemplary and non-limiting.
In one exemplary embodiment of the invention, a method comprising: obtaining context information for an apparatus, wherein the context information comprises a predefined context; and in response to sensing a predefined movement associated with the predefined context, performing, by the apparatus, a predefined action, wherein the predefined movement comprises a movement of or in relation to the apparatus.
In another exemplary embodiment of the invention, a program storage device readable by an apparatus, tangibly embodying a program of instructions executable by the apparatus for performing operations, said operations comprising: obtaining context information for the apparatus, wherein the context information comprises a predefined context; and in response to sensing a predefined movement associated with the predefined context, performing, by the apparatus, a predefined action, wherein the predefined movement comprises a movement of or in relation to the apparatus.
In another exemplary embodiment of the invention, an apparatus comprising: a context- sensing component configured to obtain context information comprising a predefined context; a movement-sensing component configured to sense movement of or in relation to the apparatus; and a processor configured, in response to the movement-sensing component sensing a predefined movement associated with the predefined context, to perform a predefined action.
In another exemplary embodiment of the invention, an apparatus comprising: means for obtaining context information comprising a predefined context; means for sensing movement of or in relation to the apparatus; and means for performing a predefined action in response to the sensing means sensing a predefined movement associated with the predefined context.
BRIEF DESCRIPTION OF THE DRAWINGS: The foregoing and other aspects of exemplary embodiments of this invention are made more evident in the following Detailed Description, when read in conjunction with the attached Drawing Figures, wherein:
FIG. 1 shows a simplified block diagram of various exemplary electronic devices that are suitable for use in practicing the exemplary embodiments of this invention;
FIG. 2 shows a simplified block diagram of other exemplary electronic devices that are suitable for use in practicing the exemplary embodiments of this invention; and
FIG. 3 depicts a flowchart illustrating one non-limiting example of a method for practicing the exemplary embodiments of this invention.
DETAILED DESCRIPTION:
As utilized herein, a "gesture" is defined as one or more intentional movements with respect to at least one input portion of a user interface (e.g., a mouse, keyboard, joystick, touchpad, touch screen, keypad, controller). For example, as noted above, a mouse gesture typically is performed by moving the mouse cursor in a predefined manner. As a further example, in the Opera® web browser, holding down the right mouse button and subsequently moving the mouse cursor in an "L" shape (i.e., down then right, as in the shape of an uppercase "L") causes the browser to close the active tab. This use of the mouse to perform an automated task may be considered a predefined mouse gesture, for example. Note that a gesture may comprise one or more associated inputs or actions, such as a cursor or stylus movement coupled with a key press, for example.
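The "L"-shaped mouse gesture described above can be sketched in a few lines. The stroke reduction and the jitter threshold below are illustrative assumptions, not taken from any particular browser's implementation:

```python
def strokes(points, min_dist=10):
    """Reduce a cursor trajectory to a sequence of cardinal strokes.

    Screen coordinates are assumed, so +y points down. Consecutive
    duplicate directions are collapsed, and movements smaller than
    min_dist pixels on both axes are ignored as jitter.
    """
    dirs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) < min_dist and abs(dy) < min_dist:
            continue  # too small to count as a stroke segment
        d = ("R" if dx > 0 else "L") if abs(dx) > abs(dy) else ("D" if dy > 0 else "U")
        if not dirs or dirs[-1] != d:
            dirs.append(d)
    return dirs

def is_l_gesture(points):
    """True when the trajectory is 'down, then right' -- an uppercase L."""
    return strokes(points) == ["D", "R"]

# A cursor path that moves 100 px down, then 100 px right:
path = [(0, 0), (0, 50), (0, 100), (50, 100), (100, 100)]
```

A real recognizer would also gate on the right mouse button being held, as the browser example notes; that state tracking is omitted here for brevity.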
As utilized herein, a "context" is defined as a location, setting or environment. For example, the context within which a gesture is performed may be indicative of one or more of the environment (e.g., situation, environmental conditions, other conditions), time and/or place (e.g., relative location, geographical location, location relative to one or more predefined locations, regions, areas or objects) in which the gesture is performed.
In various situations, a user may wish to operate an electronic device under difficult conditions. For example, a construction worker wearing work gloves may desire to make a phone call with a mobile phone. As another example, a user may have other tasks to focus on, such as operating a vehicle or monitoring meters, and is not available to dial or input a series of numbers to make a phone call. In such situations, it may be desirable to automate one or more tasks associated with placing a call, for example.
The exemplary embodiments of the invention enable the use of context-dependent gestures, for example, in order to assist in the automation of one or more tasks, such as making a phone call or starting or stopping a timer, as non-limiting examples. In one
exemplary embodiment, an apparatus (e.g., an electronic device or a mobile device) senses a predefined gesture and, in conjunction with context information (e.g., location information), performs a predefined action in response to the gesture. As non-limiting examples, the gesture may comprise shaking, tapping, stylus movement or other predefined movement of or relative to the apparatus or a component in communication therewith. As a further non-limiting example, a mobile phone equipped with one or more acceleration sensors (e.g., accelerometers) may receive commands and perform various actions in response to a user shaking, tapping or moving the phone in a specified manner. By associating the gesture(s) with a context, accidental gestures (e.g., that might result in accidental commands) can be mitigated or avoided. Furthermore, since the command or action performed is similarly coupled with the context, an increased number of commands becomes available (e.g., tapping the phone twice while it is on a boat performs a different action than tapping the phone twice while it is in a truck).
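A minimal sketch of the tap-counting idea follows. The spike threshold, the refractory window, and the boat/truck command names are all hypothetical values invented for illustration, not values from the patent:

```python
def count_taps(samples, threshold=2.5, refractory=5):
    """Count discrete taps in a stream of acceleration magnitudes (in g).

    A tap is registered when the magnitude spikes above `threshold`;
    `refractory` samples are then skipped so that the ring-down of one
    physical tap is not counted twice. Both parameters are illustrative
    tuning values.
    """
    taps = 0
    i = 0
    while i < len(samples):
        if samples[i] > threshold:
            taps += 1
            i += refractory  # skip the ring-down of this tap
        else:
            i += 1
    return taps

# Context-dependent interpretation of the same gesture
# (context and command names are hypothetical):
COMMANDS = {("boat", 2): "mark_position", ("truck", 2): "start_trip_log"}

# Two taps separated by quiet samples (1 g is rest):
stream = [1.0, 1.0, 3.1, 1.2, 1.0, 1.0, 1.0, 1.0, 2.9, 1.1, 1.0]
```

Here the same double tap maps to `COMMANDS[("boat", 2)]` in one context and `COMMANDS[("truck", 2)]` in another, mirroring the boat/truck example above.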
In one non-limiting, exemplary embodiment, the apparatus comprises a storage structure (e.g., a memory, a database) that stores or associates at least three pieces of information: a context, a gesture and an action. The triplet <context, gesture, action> could be stored, for example, in a relational database or table within the apparatus. In further exemplary embodiments, the apparatus can update a determination of the current context by mechanisms of context sensitivity (e.g., as explained in further detail herein). When the apparatus detects a gesture, it may search the relational database or table using the context and gesture as keys. If a match is found, the apparatus then executes the corresponding action (e.g., as dictated by the triplet).
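The <context, gesture, action> lookup described above might be sketched with an in-memory relational table, searched using the context and gesture as keys. The rows and action names below are illustrative only:

```python
import sqlite3

# In-memory table of <context, gesture, action> triplets. The example
# rows and action names are invented for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE triplets (context TEXT, gesture TEXT, action TEXT)")
db.executemany(
    "INSERT INTO triplets VALUES (?, ?, ?)",
    [
        ("in_car_cradle", "double_tap", "call_home"),
        ("office", "double_tap", "mute_ringer"),
        ("office", "shake", "decline_call"),
    ],
)

def lookup_action(context, gesture):
    """Search the table using context and gesture as keys; return the
    matching action, or None when the pair is not registered (in which
    case the apparatus would simply do nothing)."""
    row = db.execute(
        "SELECT action FROM triplets WHERE context = ? AND gesture = ?",
        (context, gesture),
    ).fetchone()
    return row[0] if row else None
```

The same double tap resolves to different actions depending on the current context, and an unregistered pair yields no action at all, which is the accidental-gesture mitigation described above.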
It should be noted that the "triplet" referred to herein is utilized as a non-limiting example. In further exemplary embodiments, the stored data utilized for performing an action in response to a gesture and a context may comprise a plurality of contexts, a plurality of gestures and/or a plurality of actions. For example, in response to a certain gesture performed when a certain context or condition is met, a series of actions may be performed. As another example, one or more actions may be performed in response to one or more different types of gestures performed within one or more contexts. The specific configuration of these three items may be application-specific, for example.
Furthermore, the configuration may comprise any suitable configuration that is capable of implementing or operating in conjunction with the exemplary embodiments of the invention. In other exemplary embodiments, additional data or information may be associated with the stored triplet(s).
There are a number of mechanisms by which the apparatus can sense the current context and/or update such information. The following are presented as non-limiting examples. In practice, any suitable mechanism may be utilized.
The apparatus may sense or determine the current context based on a cradle or other receptacle for the device. For example, the apparatus may determine that it is currently located within a certain automobile when it is situated in a cradle that is known (e.g., predetermined or predefined) to be located within the certain automobile. The apparatus may determine the current context by utilizing the global positioning system (GPS) or another location-determining system, such as wireless network determined device location (e.g., cell location) or triangulation.
The apparatus may utilize one or more sensors to determine the current context. As non-limiting examples, the one or more sensors may comprise: a thermal sensor, an electromechanical servo (servo force balance), a strain gauge, a resonance sensor, a magnetic sensor, a magnetic induction sensor, an optical sensor, a surface acoustic wave (SAW) sensor, an acoustic sensor, a light sensor, an infrared radiation sensor, a receiver and/or a DC response sensor. As further non-limiting examples, the one or more sensors may comprise a radio frequency identification (RFID) component, a Bluetooth® component or another short range wireless component.
The apparatus may utilize a connection to determine the current context. As non-limiting examples, the connection may comprise a serial port connection, a parallel port connection, a small computer system interface (SCSI) connection, a universal serial bus (USB) connection or a FireWire (IEEE 1394) connection.
The apparatus may utilize one or more networks or network elements to determine the current context. For example, an apparatus may utilize an access node (e.g., base station, access point, Node B, router, wireless access point, wireless router) to which it is currently connected in order to determine a location of the device.
In some exemplary embodiments, the above-identified examples may comprise a means for determining a context (e.g., a current context, a recent context).
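As one rough illustration of the location-based mechanisms above, a device could map a GPS fix onto named, predefined regions. The region names, coordinates, and radii below are invented for the example:

```python
import math

# Hypothetical named regions: (latitude, longitude, radius in km).
REGIONS = {
    "home": (60.1699, 24.9384, 0.5),
    "office": (60.2055, 24.6559, 0.3),
}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = p2 - p1
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def context_from_gps(lat, lon):
    """Return the name of the first predefined region containing the fix,
    or 'unknown' when no region matches."""
    for name, (rlat, rlon, radius) in REGIONS.items():
        if haversine_km(lat, lon, rlat, rlon) <= radius:
            return name
    return "unknown"
```

The resulting region name is then usable directly as the context key in the <context, gesture, action> lookup described earlier.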
There are a number of mechanisms by which the apparatus can sense a gesture. For example, one or more accelerometers and/or gyroscopes may be utilized. As additional non-limiting examples, the apparatus can sense a gesture utilizing: a piezo-film, a piezoelectric sensor, a shear mode accelerometer, a surface micromachined capacitive sensor (a micro electro-mechanical system or MEMS), a bulk micromachined capacitive sensor, a bulk micromachined piezo resistive sensor, a capacitive spring mass based sensor, an electromechanical servo (servo force balance), a null-balance sensor, a strain gauge, a resonance sensor, a magnetic sensor, a magnetic induction sensor, an optical sensor, a surface acoustic wave (SAW) sensor, a laser accelerometer, a DC response sensor, a modally tuned impact hammer and/or a seat pad accelerometer. The above are presented as non-limiting examples. In practice, any suitable mechanism may be utilized.
In some exemplary embodiments, the above-identified examples may comprise a means for sensing a gesture.
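A shake, as distinct from a single bump or tap, could be detected from accelerometer output by counting rapid sign reversals. The threshold and minimum reversal count below are illustrative assumptions:

```python
def is_shake(samples, threshold=1.5, min_reversals=4):
    """Detect a shake from signed acceleration samples along one axis (in g).

    A shake is taken to be several rapid sign reversals whose magnitude
    exceeds `threshold`; a single bump or tap produces at most one such
    strong sample, hence no reversals. Both parameters are illustrative
    tuning values.
    """
    strong = [s for s in samples if abs(s) > threshold]
    reversals = sum(1 for a, b in zip(strong, strong[1:]) if (a > 0) != (b > 0))
    return reversals >= min_reversals
```

A production detector would additionally bound the time window over which reversals are counted; that timing logic is omitted here.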
In response to sensing a gesture within a predefined context, the apparatus performs an action. The action may comprise any suitable, desired action including, but not limited to: placing a phone call, starting or stopping a clock, sending a predefined text message to a predefined target, taking a photograph, playing an audio file (e.g., a predefined audio message, an audio alert, a song), receiving an audio input (e.g., enabling the device to receive an audio command or audio input) and recording sound. The predefined action may comprise any action that can be accomplished by the corresponding software and/or hardware, including, as a non-limiting example, specific application-related actions that normally would be activated via a user input (e.g., via a key press on a keypad or keyboard).
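Dispatching the resulting action could be as simple as a registry of named handlers. The handler names and return strings below are placeholders standing in for real telephony, camera, and audio APIs:

```python
# Registry mapping action names to callables. The handlers here are
# placeholders standing in for real phone, camera, and audio APIs.
HANDLERS = {
    "place_call": lambda target: f"dialing {target}",
    "send_text": lambda target: f"texting {target}",
    "take_photo": lambda _=None: "photo captured",
}

def perform_action(name, argument=None):
    """Execute a predefined action by name. Unknown names are ignored
    rather than raising, mirroring a device that simply does nothing
    for an unregistered action."""
    handler = HANDLERS.get(name)
    return handler(argument) if handler else None
```

The action string returned by the <context, gesture, action> lookup would be passed straight to `perform_action`, keeping the recognition and execution stages decoupled.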
Reference is made to FIG. 1 for illustrating a simplified block diagram of various exemplary electronic devices that are suitable for use in practicing the exemplary
embodiments of this invention. In FIG. 1, an electronic device 50 includes at least one data processor (DP) 52, at least one memory (MEM) 54 coupled to the DP 52, a context sensing component (CSC) 56 coupled to the DP 52 and a gesture sensing component (GSC) 58 coupled to the DP 52. The MEM 54 may store a program (PROG) 60. The CSC 56 is configured to sense a context of the electronic device 50 as further described herein. The GSC 58 is configured to sense a gesture as further described herein.
The PROG 60 is assumed to include program instructions that, when executed by the associated DP 52, enable the electronic device 50 to operate in accordance with the exemplary embodiments of this invention, as discussed herein.
In general, the various exemplary embodiments of the electronic device 50 can include, but are not limited to, mobile phones (e.g., cellular phones), personal digital assistants (PDAs), portable computers, image capture devices such as digital cameras, gaming devices, music storage and playback appliances, mobile Internet appliances, as well as portable units or terminals that incorporate combinations of such functions.
The exemplary embodiments of this invention may be implemented by computer software executable by the DP 52 of the electronic device 50, or by hardware (e.g., one or more circuits or integrated circuits), or by a combination of software and hardware.
The MEM 54 may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory, as non-limiting examples. The DP 52 may be of any type suitable to the local technical environment, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on a multi-core processor architecture, as non-limiting examples.
In further exemplary embodiments, the electronic device 50 may further comprise at least one suitable transceiver (TRANS) 62 coupled to the DP 52. The TRANS 62 may enable bidirectional communication with one or more other electronic devices. In further
exemplary embodiments, the TRANS 62 enables bidirectional wireless communication with one or more other electronic devices or components in a wireless communication network (see FIG. 2 and accompanying description below). In such an exemplary embodiment, the electronic device 50 may comprise, as non-limiting examples: a cellular phone, a personal digital assistant (PDA) having wireless communication capabilities, a portable computer having wireless communication capabilities, an image capture device such as a digital camera having wireless communication capabilities, a gaming device having wireless communication capabilities, a music storage and playback appliance having wireless communication capabilities, an Internet appliance permitting wireless Internet access and browsing, as well as a portable unit or terminal that incorporates combinations of such functions. In conjunction with the TRANS 62, further exemplary embodiments of the electronic device 50 may include at least one antenna (ANT) 64.
In further exemplary embodiments, the CSC 56 may comprise the TRANS 62. As a non- limiting example, and as discussed elsewhere herein, the TRANS 62 may be configured to sense a current location of the electronic device 50.
In further exemplary embodiments, the electronic device 50 may comprise a user interface (UI) 66 comprising at least one input component (INP) 68 and/or at least one output component (OUT) 70. The INP 68 may comprise any suitable input component including, but not limited to: one or more keys, one or more buttons, a keypad, a rocker pad, a touch-sensitive component such as a touchpad or a touch screen, an audio input component such as a microphone and/or an optical input component such as a camera or other light sensor. The OUT 70 may comprise any suitable output component including, but not limited to: a visual component such as a display screen or light (e.g., a light emitting diode), a tactile component such as a Braille output or a vibratory component or an audio-producing component such as a speaker or piezoelectric component.
In further exemplary embodiments, the GSC 58 may comprise one or more components of the INP 68. As a non-limiting example, the INP 68 may include a touch screen that acts as the GSC 58 by being sensitive to a number of taps on the touch screen. In further exemplary embodiments, the OUT 70 may be utilized in conjunction with the predefined action performed in response to sensing a predefined movement (by the GSC 58) in light
of a predefined context (as obtained or sensed by the CSC 56).
In some exemplary embodiments, the MEM 54 may store data comprising at least one triplet, such as a triplet <context, gesture, action>, for example. As explained in further detail above, data stored on the MEM 54 may comprise any suitable combination that is capable of or used for operating the electronic device 50 in conjunction with the exemplary embodiments of the invention.
Reference is made to FIG. 2 for illustrating a simplified block diagram of other exemplary electronic devices that are suitable for use in practicing the exemplary embodiments of this invention. In FIG. 2, a wireless network 12 is adapted for communication with a user equipment (UE) 14 via an access node (AN) 16. The UE 14 includes a data processor (DP) 18, a memory (MEM) 20 coupled to the DP 18, a context sensing component (CSC) 38 coupled to the DP 18, a gesture sensing component (GSC) 40 coupled to the DP 18, and a suitable RF transceiver (TRANS) 22 (having a transmitter
(TX) and a receiver (RX)) coupled to the DP 18. The MEM 20 stores a program (PROG)
24. The TRANS 22 is for bidirectional wireless communications with the AN 16. Note that the TRANS 22 has at least one antenna to facilitate communication. The CSC 38 is configured to sense a context of the UE 14 as further described herein. The GSC 40 is configured to sense a gesture as further described herein.
The AN 16 includes a data processor (DP) 26, a memory (MEM) 28 coupled to the DP 26, and a suitable RF transceiver (TRANS) 30 (having a transmitter (TX) and a receiver (RX)) coupled to the DP 26. The MEM 28 stores a program (PROG) 32. The TRANS 30 is for bidirectional wireless communications with the UE 14. Note that the TRANS 30 has at least one antenna to facilitate communication. The AN 16 is coupled via a data path 34 to one or more external networks or systems, such as the internet 36, for example.
The PROG 24 is assumed to include program instructions that, when executed by the associated DP 18, enable the UE 14 to operate in accordance with the exemplary embodiments of this invention, as discussed herein.
In general, the various exemplary embodiments of the UE 14 can include, but are not
limited to, cellular phones, personal digital assistants (PDAs) having wireless communication capabilities, portable computers having wireless communication capabilities, image capture devices such as digital cameras having wireless communication capabilities, gaming devices having wireless communication capabilities, music storage and playback appliances having wireless communication capabilities, Internet appliances permitting wireless Internet access and browsing, as well as portable units or terminals that incorporate combinations of such functions.
The embodiments of this invention may be implemented by computer software executable by the DP 18 of the UE 14, or by hardware, or by a combination of software and hardware.
The MEMs 20, 28 may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory, as non-limiting examples. The DPs 18, 26 may be of any type suitable to the local technical environment, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on a multi-core processor architecture, as non-limiting examples.
Below are provided further descriptions of non-limiting, exemplary embodiments. The below-described exemplary embodiments are separately numbered for clarity and identification. This numbering should not be construed as wholly separating the below descriptions since various aspects of one or more exemplary embodiments may be practiced in conjunction with one or more other aspects or exemplary embodiments.
(1) In one non-limiting, exemplary embodiment, and as illustrated in FIG. 3, a method comprising: obtaining context information for an apparatus, wherein the context information comprises a predefined context (91); and in response to sensing a predefined movement associated with the predefined context, performing, by the apparatus, a predefined action, wherein the predefined movement comprises a movement of or in relation to the apparatus (92).
A method as above, wherein obtaining the context information comprises at least one of: determining that the apparatus is situated in a known receptacle (e.g., cradle) or known placement/location, utilizing a global positioning system (GPS), utilizing at least one sensor, utilizing a communication connection to determine a current context or current location, or utilizing one or more networks or network elements to determine a current context (e.g., utilizing an access node such as a base station, access point or Node B). A method as in any of the above, wherein the at least one sensor comprises at least one of the following: a thermal sensor, an electromechanical servo (servo force balance), a strain gauge, a resonance sensor, a magnetic sensor, a magnetic induction sensor, an optical sensor, a surface acoustic wave (SAW) sensor, an acoustic sensor, a light sensor, an infrared radiation sensor, a receiver and/or a DC response sensor. A method as in any of the above, wherein the at least one sensor comprises a radio frequency identification (RFID) component, a Bluetooth® component or another short range wireless component.
A method as in any of the above, wherein the predefined movement is sensed by utilizing at least one sensor. A method as in any of the above, wherein the at least one sensor comprises at least one accelerometer or at least one gyroscope. A method as in any of the above, wherein the at least one sensor comprises at least one of the following: a piezo-film, a piezoelectric sensor, a shear mode accelerometer, a surface micromachined capacitive sensor (a micro electro-mechanical system or MEMS), a bulk micromachined capacitive sensor, a bulk micromachined piezo resistive sensor, a capacitive spring mass based sensor, an electromechanical servo (servo force balance), a null-balance sensor, a strain gauge, a resonance sensor, a magnetic sensor, a magnetic induction sensor, an optical sensor, a surface acoustic wave (SAW) sensor, a laser accelerometer, a DC response sensor, a modally tuned impact hammer and/or a seat pad accelerometer.
A method as in any of the above, wherein the predefined action comprises at least one of: placing a phone call, starting or stopping a clock, sending a predefined text message to a predefined target, taking a photograph, playing an audio message (e.g., a predefined audio message or alert), playing a song, receiving an audio input (e.g., enabling the device to receive an audio command or audio input) or recording sound.
A method as in any of the above, wherein the predefined context, the predefined movement and the predefined action comprise at least one triplet of information stored in a storage portion (e.g., a memory) of the apparatus. A method as in any of the above, wherein the apparatus comprises a mobile phone (e.g., a cellular phone), a personal digital assistant (PDA), a portable computer, an image capture device such as a digital camera, a gaming device, a music storage and playback appliance, a mobile Internet appliance, or a portable unit or terminal.
A method as in any above, wherein the method is implemented by a computer program. A method as in any above, wherein the method is implemented by a program storage device readable by a machine, tangibly embodying a program of instructions executable by the machine for performing operations, said operations comprising the steps of performing the method.
(2) In another non-limiting, exemplary embodiment, a program storage device readable by an apparatus, tangibly embodying a program of instructions executable by the apparatus (e.g., at least one processor of the apparatus) for performing operations, said operations comprising: obtaining context information for the apparatus, wherein the context information comprises a predefined context; and in response to sensing a predefined movement associated with the predefined context, performing, by the apparatus, a predefined action, wherein the predefined movement comprises a movement of or in relation to the apparatus.
A program storage device as recited above, wherein obtaining the context information comprises at least one of: determining that the apparatus is situated in a known receptacle (e.g., cradle) or known placement/location, utilizing a global positioning system (GPS), utilizing at least one sensor, utilizing a communication connection to determine a current context or current location, or utilizing one or more networks or network elements to determine a current context (e.g., utilizing an access node such as a base station, access point or Node B). A program storage device as in any of the above, wherein the at least one sensor comprises a thermal sensor, an electromechanical servo (servo force balance), a strain gauge, a resonance sensor, a magnetic sensor, a magnetic induction sensor, an optical sensor, a surface acoustic wave (SAW) sensor, an acoustic sensor, a light sensor,
an infrared radiation sensor, a receiver and/or a DC response sensor. A program storage device as in any of the above, wherein the at least one sensor comprises a radio frequency identification (RFID) component, a Bluetooth® component or another short range wireless component.
A program storage device as in any of the above, wherein the predefined movement is sensed by utilizing at least one sensor. A program storage device as in any of the above, wherein the at least one sensor comprises at least one accelerometer or at least one gyroscope. A program storage device as in any of the above, wherein the at least one sensor comprises at least one of the following: a piezo-film, a piezoelectric sensor, a shear mode accelerometer, a surface micromachined capacitive sensor (a micro electromechanical system or MEMS), a bulk micromachined capacitive sensor, a bulk micromachined piezo resistive sensor, a capacitive spring mass based sensor, an electromechanical servo (servo force balance), a null-balance sensor, a strain gauge, a resonance sensor, a magnetic sensor, a magnetic induction sensor, an optical sensor, a surface acoustic wave (SAW) sensor, a laser accelerometer, a DC response sensor, a modally tuned impact hammer and/or a seat pad accelerometer.
A program storage device as in any of the above, wherein the predefined action comprises at least one of: placing a phone call, starting or stopping a clock, sending a predefined text message to a predefined target, taking a photograph, playing an audio message (e.g., a predefined audio message or alert), playing a song, receiving an audio input (e.g., enabling the device to receive an audio command or audio input) and recording sound.
A program storage device as in any of the above, wherein the predefined context, the predefined movement and the predefined action comprise a triplet of information stored in a memory of the apparatus. A program storage device as in any of the above, wherein the apparatus comprises one of a mobile phone (e.g., a cellular phone), a personal digital assistant (PDA), a portable computer, an image capture device such as a digital camera, a gaming device, a music storage and playback appliance, a mobile Internet appliance, or a portable unit or terminal.
A program storage device as in any of the above, wherein the program storage device
comprises a memory or other computer-readable medium.
(3) In another non-limiting, exemplary embodiment, an apparatus comprising: a context-sensing component (CSC) configured to obtain context information comprising a predefined context; a movement-sensing component (MSC) configured to sense movement of or in relation to the apparatus; and a processor configured, in response to the MSC sensing a predefined movement associated with the predefined context, to perform a predefined action.
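A hypothetical, non-limiting sketch of this component arrangement, with the context-sensing component (CSC), movement-sensing component (MSC), and processor modeled as simple classes; the class and method names are illustrative only, not part of the specification.

```python
class ContextSensingComponent:
    """Obtains context information (e.g., from a cradle sensor or GPS)."""
    def __init__(self, context):
        self._context = context
    def obtain_context(self):
        return self._context

class MovementSensingComponent:
    """Senses movement of or in relation to the apparatus."""
    def __init__(self):
        self._movement = None
    def inject(self, movement):  # stand-in for a real sensor interrupt
        self._movement = movement
    def sense(self):
        return self._movement

class Apparatus:
    """Processor logic: perform the predefined action only when the
    sensed movement is the one associated with the current context."""
    def __init__(self, csc, msc, triplets):
        self.csc, self.msc, self.triplets = csc, msc, triplets
    def on_movement(self):
        key = (self.csc.obtain_context(), self.msc.sense())
        return self.triplets.get(key)

csc = ContextSensingComponent("in_car_cradle")
msc = MovementSensingComponent()
device = Apparatus(csc, msc, {("in_car_cradle", "double_tap"): "place_call"})
msc.inject("double_tap")
print(device.on_movement())  # place_call
```

The processor consults the CSC before interpreting the MSC's output, which is what makes the gesture recognition context-dependent rather than fixed.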
An apparatus as recited above, wherein the CSC comprises at least one of: a connection or sensor configured to connect to or sense a known receptacle (e.g., cradle) or known placement/location, a global positioning system (GPS) component, at least one sensor, a communication connection configured to determine a current context or current location, or a communication component configured to connect to one or more networks or network elements (e.g., utilizing an access node such as a base station, access point or Node B). An apparatus as in any of the above, wherein the CSC comprises at least one of: a thermal sensor, an electromechanical servo (servo force balance), a strain gauge, a resonance sensor, a magnetic sensor, a magnetic induction sensor, an optical sensor, a surface acoustic wave (SAW) sensor, an acoustic sensor, a light sensor, an infrared radiation sensor, a receiver and/or a DC response sensor. An apparatus as in any of the above, wherein the CSC comprises a radio frequency identification (RFID) component, a Bluetooth® component or another short range wireless component.
An apparatus as in any of the above, wherein the MSC comprises at least one sensor. An apparatus as in any of the above, wherein the MSC comprises at least one accelerometer or at least one gyroscope. An apparatus as in any of the above, wherein the MSC comprises at least one of the following: a piezo-film, a piezoelectric sensor, a shear mode accelerometer, a surface micromachined capacitive sensor (a micro electro-mechanical system or MEMS), a bulk micromachined capacitive sensor, a bulk micromachined piezo resistive sensor, a capacitive spring mass based sensor, an electromechanical servo (servo force balance), a null-balance sensor, a strain gauge, a resonance sensor, a magnetic sensor, a magnetic induction sensor, an optical sensor, a surface acoustic wave (SAW) sensor, a laser accelerometer, a DC response sensor, a modally tuned impact hammer
and/or a seat pad accelerometer.
An apparatus as in any of the above, wherein the predefined action comprises at least one of: placing a phone call utilizing a transceiver, starting or stopping a clock, sending a predefined text message to a predefined target, taking a photograph, playing an audio message (e.g., a predefined audio message or alert), playing a song, receiving an audio input (e.g., enabling the device to receive an audio command or audio input) and recording sound.
An apparatus as in any of the above, wherein the predefined context, the predefined movement and the predefined action comprise a triplet of information stored in a memory of the apparatus. An apparatus as in any of the above, wherein the apparatus comprises one of a mobile phone (e.g., a cellular phone), a personal digital assistant (PDA), a portable computer, an image capture device such as a digital camera, a gaming device, a music storage and playback appliance, a mobile Internet appliance, or a portable unit or terminal.
An apparatus as in any of the above, further comprising at least one transceiver. An apparatus as in any of the above, further comprising at least one antenna. An apparatus as in any of the above, further comprising at least one memory. An apparatus as in any of the above, further comprising at least one input component. An apparatus as in any of the above, further comprising at least one output component.
(4) In another non-limiting, exemplary embodiment, an apparatus comprising: means for obtaining context information (MOCI) comprising a predefined context; means for sensing movement (MSM) of or in relation to the apparatus; and means for performing a predefined action in response to the MSM sensing a predefined movement associated with the predefined context.
An apparatus as recited above, wherein the MOCI comprises at least one of: a connection or sensor configured to connect to or sense a known receptacle (e.g., cradle) or known placement/location, a global positioning system (GPS) component, at least one sensor, a communication connection configured to determine a current context or current location,
or a communication component configured to connect to one or more networks or network elements (e.g., utilizing an access node such as a base station, access point or Node B). An apparatus as in any of the above, wherein the MOCI comprises at least one of: a thermal sensor, an electromechanical servo (servo force balance), a strain gauge, a resonance sensor, a magnetic sensor, a magnetic induction sensor, an optical sensor, a surface acoustic wave (SAW) sensor, an acoustic sensor, a light sensor, an infrared radiation sensor, a receiver and/or a DC response sensor. An apparatus as in any of the above, wherein the MOCI comprises a radio frequency identification (RFID) component, a Bluetooth® component or another short range wireless component.
An apparatus as in any of the above, wherein the MSM comprises at least one sensor. An apparatus as in any of the above, wherein the MSM comprises at least one accelerometer or at least one gyroscope. An apparatus as in any of the above, wherein the MSM comprises at least one of the following: a piezo-film, a piezoelectric sensor, a shear mode accelerometer, a surface micromachined capacitive sensor (a micro electro-mechanical system or MEMS), a bulk micromachined capacitive sensor, a bulk micromachined piezo resistive sensor, a capacitive spring mass based sensor, an electromechanical servo (servo force balance), a null-balance sensor, a strain gauge, a resonance sensor, a magnetic sensor, a magnetic induction sensor, an optical sensor, a surface acoustic wave (SAW) sensor, a laser accelerometer, a DC response sensor, a modally tuned impact hammer and/or a seat pad accelerometer.
An apparatus as in any of the above, wherein the predefined action comprises at least one of: placing a phone call utilizing a transceiver, starting or stopping a clock, sending a predefined text message to a predefined target, taking a photograph, playing an audio message (e.g., a predefined audio message or alert), playing a song, receiving an audio input (e.g., enabling the device to receive an audio command or audio input) and recording sound.
An apparatus as in any of the above, wherein the predefined context, the predefined movement and the predefined action comprise a triplet of information stored in a memory of the apparatus. An apparatus as in any of the above, wherein the apparatus comprises one of a mobile phone (e.g., a cellular phone), a personal digital assistant (PDA), a
portable computer, an image capture device such as a digital camera, a gaming device, a music storage and playback appliance, a mobile Internet appliance, or a portable unit or terminal.
An apparatus as in any of the above, further comprising at least one means for communication. An apparatus as in any of the above, further comprising at least one antenna. An apparatus as in any of the above, further comprising at least one means for storing. An apparatus as in any of the above, further comprising at least one means for receiving an input. An apparatus as in any of the above, further comprising at least one means for outputting.
(5) In another non-limiting, exemplary embodiment, an apparatus comprising: context-sensing circuitry (CSC) configured to obtain context information comprising a predefined context; movement-sensing circuitry (MSC) configured to sense movement of or in relation to the apparatus; and processing circuitry configured, in response to the MSC sensing a predefined movement associated with the predefined context, to perform a predefined action.
An apparatus as in the previous embodiment, further comprising one or more aspects of the exemplary embodiments of the invention as described in further detail herein.
The exemplary embodiments of the invention, as discussed above and as particularly described with respect to exemplary methods, may be implemented as a computer program product comprising program instructions embodied on a tangible computer-readable medium. Execution of the program instructions results in operations comprising steps of utilizing the exemplary embodiments or steps of the method.
The exemplary embodiments of the invention, as discussed above and as particularly described with respect to exemplary methods, may be implemented in conjunction with a program storage device readable by a machine, tangibly embodying a program of instructions executable by the machine for performing operations. The operations comprise steps of utilizing the exemplary embodiments or steps of the method.
Aspects of the above-presented descriptions of exemplary embodiments of the invention may be combined in various manners (e.g., various combinations of dependent claims and/or elements recited therein) provided said combinations are not unfeasible or impracticable.
It should be noted that the terms "connected," "coupled," or any variant thereof, mean any connection or coupling, either direct or indirect, between two or more elements, and may encompass the presence of one or more intermediate elements between two elements that are "connected" or "coupled" together. The coupling or connection between the elements can be physical, logical, or a combination thereof. As employed herein two elements may be considered to be "connected" or "coupled" together by the use of one or more wires, cables and/or printed electrical connections, as well as by the use of electromagnetic energy, such as electromagnetic energy having wavelengths in the radio frequency region, the microwave region and the optical (both visible and invisible) region, as several non-limiting and non-exhaustive examples.
In general, the various embodiments may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. For example, some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although the invention is not limited thereto. While various aspects of the invention may be illustrated and described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
The exemplary embodiments of the invention may be practiced in various components such as integrated circuit modules. The design of integrated circuits is by and large a highly automated process. Complex and powerful software tools are available for converting a logic level design into a semiconductor circuit design ready to be etched and formed on a semiconductor substrate.
Programs, such as those provided by Synopsys, Inc. of Mountain View, California and Cadence Design Systems of San Jose, California, automatically route conductors and locate components on a semiconductor chip using well-established rules of design as well as libraries of pre-stored design modules. Once the design for a semiconductor circuit has been completed, the resultant design, in a standardized electronic format (e.g., Opus, GDSII, or the like), may be transmitted to a semiconductor fabrication facility or "fab" for fabrication.
The foregoing description has provided by way of exemplary and non-limiting examples a full and informative description of the invention. However, various modifications and adaptations may become apparent to those skilled in the relevant arts in view of the foregoing description, when read in conjunction with the accompanying drawings and the appended claims. Nonetheless, all such and similar modifications of the teachings of this invention will still fall within the scope of this invention.
Furthermore, some of the features of the preferred embodiments of this invention could be used to advantage without the corresponding use of other features. As such, the foregoing description should be considered as merely illustrative of the principles of the invention, and not in limitation thereof.
Claims
1. A method comprising: obtaining context information for an apparatus, wherein the context information comprises a predefined context; and in response to sensing a predefined movement associated with the predefined context, performing, by the apparatus, a predefined action, wherein the predefined movement comprises a movement of or in relation to the apparatus.
2. A method as in claim 1, wherein obtaining the context information comprises determining a location of the apparatus.
3. A method as in claim 1 or 2, wherein obtaining the context information comprises utilizing at least one sensor.
4. A method as in any one of claims 1-3, wherein the predefined movement is sensed by utilizing at least one sensor.
5. A method as in any one of claims 1-4, wherein the predefined action comprises at least one of: placing a phone call, starting or stopping a clock, sending a predefined text message, taking a photograph, producing an audio output, receiving an audio input or recording an audio input.
6. A program storage device readable by an apparatus, tangibly embodying a program of instructions executable by the apparatus for performing operations, said operations comprising: obtaining context information for the apparatus, wherein the context information comprises a predefined context; and in response to sensing a predefined movement associated with the predefined context, performing, by the apparatus, a predefined action, wherein the predefined movement comprises a movement of or in relation to the apparatus.
7. A program storage device as in claim 6, wherein obtaining the context information comprises determining a location of the apparatus.
8. A program storage device as in claim 6 or 7, wherein obtaining the context information comprises utilizing at least one sensor.
9. A program storage device as in any one of claims 6-8, wherein the predefined movement is sensed by utilizing at least one sensor.
10. A program storage device as in any one of claims 6-9, wherein the predefined action comprises at least one of: placing a phone call, starting or stopping a clock, sending a predefined text message, taking a photograph, producing an audio output, receiving an audio input or recording an audio input.
11. An apparatus comprising: a context-sensing component configured to obtain context information comprising a predefined context; a movement-sensing component configured to sense movement of or in relation to the apparatus; and a processor configured, in response to the movement-sensing component sensing a predefined movement associated with the predefined context, to perform a predefined action.
12. An apparatus as in claim 11, wherein the context information comprises a location of the apparatus and the context-sensing component is configured to determine the location of the apparatus.
13. An apparatus as in claim 11 or 12, wherein the context-sensing component comprises at least one sensor.
14. An apparatus as in any one of claims 11-13, wherein the movement-sensing component comprises at least one sensor.
15. An apparatus as in any one of claims 11-14, wherein the apparatus comprises a mobile device.
16. An apparatus comprising: means for obtaining context information comprising a predefined context; means for sensing movement of or in relation to the apparatus; and means for performing a predefined action in response to the means for sensing movement sensing a predefined movement associated with the predefined context.
17. An apparatus as in claim 16, wherein the context information comprises a location of the apparatus and the means for obtaining context information comprises means for determining the location of the apparatus.
18. An apparatus as in claim 16 or 17, wherein the means for obtaining context information comprises at least one sensor.
19. An apparatus as in any one of claims 16-18, wherein the means for sensing movement comprises at least one sensor.
20. An apparatus as in any one of claims 16-19, wherein the apparatus comprises a mobile device.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US96312707P | 2007-08-01 | 2007-08-01 | |
US60/963,127 | 2007-08-01 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2009016607A2 true WO2009016607A2 (en) | 2009-02-05 |
WO2009016607A3 WO2009016607A3 (en) | 2009-03-26 |
Family
ID=40202857
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2008/053088 WO2009016607A2 (en) | 2007-08-01 | 2008-07-31 | Apparatus, methods, and computer program products providing context-dependent gesture recognition |
Country Status (2)
Country | Link |
---|---|
US (1) | US8896529B2 (en) |
WO (1) | WO2009016607A2 (en) |
Cited By (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010056548A1 (en) | 2008-10-29 | 2010-05-20 | Invensense Inc. | Controlling and accessing content using motion processing on mobile devices |
US8250921B2 (en) | 2007-07-06 | 2012-08-28 | Invensense, Inc. | Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics |
WO2012168886A2 (en) * | 2011-06-09 | 2012-12-13 | Nokia Corporation | Method and apparatus for contextual gesture recognition |
CN103246368A (en) * | 2012-02-01 | 2013-08-14 | 罗技欧洲公司 | System and method for spurious signal detection and compensation on an input device |
WO2013119149A1 (en) * | 2012-02-06 | 2013-08-15 | Telefonaktiebolaget L M Ericsson (Publ) | A user terminal with improved feedback possibilities |
WO2014140853A3 (en) * | 2013-03-15 | 2014-12-24 | Orcam Technologies Ltd. | Apparatus and method for automatic action selection based on image context |
US8952832B2 (en) | 2008-01-18 | 2015-02-10 | Invensense, Inc. | Interfacing application programs and motion sensors of a device |
US8960002B2 (en) | 2007-12-10 | 2015-02-24 | Invensense, Inc. | Vertically integrated 3-axis MEMS angular accelerometer with integrated electronics |
WO2015200537A3 (en) * | 2014-06-24 | 2016-04-21 | Apple Inc. | Input device and user interface interactions |
US9532111B1 (en) | 2012-12-18 | 2016-12-27 | Apple Inc. | Devices and method for providing remote control hints on a display |
WO2019074775A1 (en) * | 2017-10-13 | 2019-04-18 | Microsoft Technology Licensing, Llc | Context based operation execution |
US11057682B2 (en) | 2019-03-24 | 2021-07-06 | Apple Inc. | User interfaces including selectable representations of content items |
US11070889B2 (en) | 2012-12-10 | 2021-07-20 | Apple Inc. | Channel bar user interface |
US11194546B2 (en) | 2012-12-31 | 2021-12-07 | Apple Inc. | Multi-user TV user interface |
US11245967B2 (en) | 2012-12-13 | 2022-02-08 | Apple Inc. | TV side bar user interface |
US11290762B2 (en) | 2012-11-27 | 2022-03-29 | Apple Inc. | Agnostic media delivery system |
US11461397B2 (en) | 2014-06-24 | 2022-10-04 | Apple Inc. | Column interface for navigating in a user interface |
US11467726B2 (en) | 2019-03-24 | 2022-10-11 | Apple Inc. | User interfaces for viewing and accessing content on an electronic device |
US11520858B2 (en) | 2016-06-12 | 2022-12-06 | Apple Inc. | Device-level authorization for viewing content |
US11543938B2 (en) | 2016-06-12 | 2023-01-03 | Apple Inc. | Identifying applications on which content is available |
US11609678B2 (en) | 2016-10-26 | 2023-03-21 | Apple Inc. | User interfaces for browsing content from multiple content applications on an electronic device |
US11683565B2 (en) | 2019-03-24 | 2023-06-20 | Apple Inc. | User interfaces for interacting with channels that provide content that plays in a media browsing application |
US11720229B2 (en) | 2020-12-07 | 2023-08-08 | Apple Inc. | User interfaces for browsing and presenting content |
US11797606B2 (en) | 2019-05-31 | 2023-10-24 | Apple Inc. | User interfaces for a podcast browsing and playback application |
US11843838B2 (en) | 2020-03-24 | 2023-12-12 | Apple Inc. | User interfaces for accessing episodes of a content series |
US11863837B2 (en) | 2019-05-31 | 2024-01-02 | Apple Inc. | Notification of augmented reality content on an electronic device |
US11899895B2 (en) | 2020-06-21 | 2024-02-13 | Apple Inc. | User interfaces for setting up an electronic device |
US11934640B2 (en) | 2021-01-29 | 2024-03-19 | Apple Inc. | User interfaces for record labels |
US11962836B2 (en) | 2019-03-24 | 2024-04-16 | Apple Inc. | User interfaces for a media browsing application |
Families Citing this family (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6831632B2 (en) * | 2001-04-09 | 2004-12-14 | I. C. + Technologies Ltd. | Apparatus and methods for hand motion tracking and handwriting recognition |
US20080168402A1 (en) | 2007-01-07 | 2008-07-10 | Christopher Blumenberg | Application Programming Interfaces for Gesture Operations |
US20080168478A1 (en) | 2007-01-07 | 2008-07-10 | Andrew Platzer | Application Programming Interfaces for Scrolling |
US8645827B2 (en) | 2008-03-04 | 2014-02-04 | Apple Inc. | Touch event model |
US9684521B2 (en) | 2010-01-26 | 2017-06-20 | Apple Inc. | Systems having discrete and continuous gesture recognizers |
US8566045B2 (en) * | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
US9417700B2 (en) | 2009-05-21 | 2016-08-16 | Edge3 Technologies | Gesture recognition systems and related methods |
US20110148786A1 (en) | 2009-12-18 | 2011-06-23 | Synaptics Incorporated | Method and apparatus for changing operating modes |
US9465532B2 (en) * | 2009-12-18 | 2016-10-11 | Synaptics Incorporated | Method and apparatus for operating in pointing and enhanced gesturing modes |
US20110179381A1 (en) * | 2010-01-21 | 2011-07-21 | Research In Motion Limited | Portable electronic device and method of controlling same |
US8396252B2 (en) | 2010-05-20 | 2013-03-12 | Edge 3 Technologies | Systems and related methods for three dimensional gesture recognition in vehicles |
US8666144B2 (en) | 2010-09-02 | 2014-03-04 | Edge 3 Technologies, Inc. | Method and apparatus for determining disparity of texture |
US8655093B2 (en) | 2010-09-02 | 2014-02-18 | Edge 3 Technologies, Inc. | Method and apparatus for performing segmentation of an image |
WO2012030872A1 (en) | 2010-09-02 | 2012-03-08 | Edge3 Technologies Inc. | Method and apparatus for confusion learning |
US8582866B2 (en) | 2011-02-10 | 2013-11-12 | Edge 3 Technologies, Inc. | Method and apparatus for disparity computation in stereo images |
KR20120035529A (en) | 2010-10-06 | 2012-04-16 | 삼성전자주식회사 | Apparatus and method for adaptive gesture recognition in portable terminal |
KR101873787B1 (en) * | 2011-02-10 | 2018-07-03 | 삼성전자주식회사 | Method for processing multi-touch input in touch screen terminal and device thereof |
US8970589B2 (en) | 2011-02-10 | 2015-03-03 | Edge 3 Technologies, Inc. | Near-touch interaction with a stereo camera grid structured tessellations |
US9672609B1 (en) | 2011-11-11 | 2017-06-06 | Edge 3 Technologies, Inc. | Method and apparatus for improved depth-map estimation |
US20130211843A1 (en) * | 2012-02-13 | 2013-08-15 | Qualcomm Incorporated | Engagement-dependent gesture recognition |
KR101919008B1 (en) | 2012-02-24 | 2018-11-19 | 삼성전자주식회사 | Method for providing information and mobile terminal thereof |
KR101894395B1 (en) | 2012-02-24 | 2018-09-04 | 삼성전자주식회사 | Method for providing capture data and mobile terminal thereof |
KR102008495B1 (en) * | 2012-02-24 | 2019-08-08 | 삼성전자주식회사 | Method for sharing content and mobile terminal thereof |
US9600169B2 (en) * | 2012-02-27 | 2017-03-21 | Yahoo! Inc. | Customizable gestures for mobile devices |
WO2014119894A1 (en) * | 2013-01-29 | 2014-08-07 | Samsung Electronics Co., Ltd. | Method of performing function of device and device for performing the method |
KR102161050B1 (en) * | 2013-01-29 | 2020-10-05 | 삼성전자주식회사 | Method for executing function of device, and device thereof |
US10721448B2 (en) | 2013-03-15 | 2020-07-21 | Edge 3 Technologies, Inc. | Method and apparatus for adaptive exposure bracketing, segmentation and scene organization |
US9733716B2 (en) | 2013-06-09 | 2017-08-15 | Apple Inc. | Proxy gesture recognizer |
US20150036835A1 (en) * | 2013-08-05 | 2015-02-05 | Christina Summer Chen | Earpieces with gesture control |
US11243611B2 (en) * | 2013-08-07 | 2022-02-08 | Nike, Inc. | Gesture recognition |
CN110413054B (en) * | 2013-08-12 | 2023-04-28 | 苹果公司 | Context-sensitive actions in response to touch input |
US9423946B2 (en) | 2013-08-12 | 2016-08-23 | Apple Inc. | Context sensitive actions in response to touch input |
KR102165818B1 (en) * | 2013-09-10 | 2020-10-14 | 삼성전자주식회사 | Method, apparatus and recovering medium for controlling user interface using a input image |
US9582737B2 (en) * | 2013-09-13 | 2017-02-28 | Qualcomm Incorporated | Context-sensitive gesture classification |
US10721594B2 (en) | 2014-06-26 | 2020-07-21 | Microsoft Technology Licensing, Llc | Location-based audio messaging |
FR3051931A1 (en) | 2016-05-30 | 2017-12-01 | Orange | DETERMINING A MOBILITY CONTEXT OF A CARRIER OF EQUIPMENT PROVIDED WITH INERTIAL SENSORS |
US10832071B2 (en) * | 2016-09-01 | 2020-11-10 | International Business Machines Corporation | Dynamic determination of human gestures based on context |
US10051107B1 (en) | 2017-03-16 | 2018-08-14 | Microsoft Technology Licensing, Llc | Opportunistic timing of device notifications |
KR101810400B1 (en) * | 2017-03-22 | 2017-12-19 | 삼성전자주식회사 | Apparatus and method for adaptive gesture recognition in portable terminal |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050212767A1 (en) * | 2004-03-23 | 2005-09-29 | Marvit David L | Context dependent gesture response |
US20060028429A1 (en) * | 2004-08-09 | 2006-02-09 | International Business Machines Corporation | Controlling devices' behaviors via changes in their relative locations and positions |
WO2006063671A1 (en) * | 2004-12-16 | 2006-06-22 | Vodafone Holding Gmbh | Mobile terminal for use in telecommunications networks |
EP1752737A2 (en) * | 2005-08-11 | 2007-02-14 | Ftw. Forschungszentrum Telekommunikation Wien Betriebs GmbH | Portable navigation device and method for radio navigation |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6834249B2 (en) * | 2001-03-29 | 2004-12-21 | Arraycomm, Inc. | Method and apparatus for controlling a computing system |
US7365736B2 (en) * | 2004-03-23 | 2008-04-29 | Fujitsu Limited | Customizable gesture mappings for motion controlled handheld devices |
US20100100439A1 (en) * | 2008-06-12 | 2010-04-22 | Dawn Jutla | Multi-platform system apparatus for interoperable, multimedia-accessible and convertible structured and unstructured wikis, wiki user networks, and other user-generated content repositories |
2008
- 2008-07-31: US application 12/221,320 filed (published as US8896529B2; status: active)
- 2008-07-31: PCT application PCT/IB2008/053088 filed (published as WO2009016607A2; application filing)
Cited By (53)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9292102B2 (en) | 2007-01-05 | 2016-03-22 | Invensense, Inc. | Controlling and accessing content using motion processing on mobile devices |
US8250921B2 (en) | 2007-07-06 | 2012-08-28 | Invensense, Inc. | Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics |
US8997564B2 (en) | 2007-07-06 | 2015-04-07 | Invensense, Inc. | Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics |
US10288427B2 (en) | 2007-07-06 | 2019-05-14 | Invensense, Inc. | Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics |
US8960002B2 (en) | 2007-12-10 | 2015-02-24 | Invensense, Inc. | Vertically integrated 3-axis MEMS angular accelerometer with integrated electronics |
US9846175B2 (en) | 2007-12-10 | 2017-12-19 | Invensense, Inc. | MEMS rotation sensor with integrated electronics |
US9342154B2 (en) | 2008-01-18 | 2016-05-17 | Invensense, Inc. | Interfacing application programs and motion sensors of a device |
US8952832B2 (en) | 2008-01-18 | 2015-02-10 | Invensense, Inc. | Interfacing application programs and motion sensors of a device |
US9811174B2 (en) | 2008-01-18 | 2017-11-07 | Invensense, Inc. | Interfacing application programs and motion sensors of a device |
EP2353065A4 (en) * | 2008-10-29 | 2012-06-27 | Invensense Inc | Controlling and accessing content using motion processing on mobile devices |
WO2010056548A1 (en) | 2008-10-29 | 2010-05-20 | Invensense Inc. | Controlling and accessing content using motion processing on mobile devices |
EP2353065A1 (en) * | 2008-10-29 | 2011-08-10 | Invensense Inc. | Controlling and accessing content using motion processing on mobile devices |
WO2012168886A3 (en) * | 2011-06-09 | 2013-01-31 | Nokia Corporation | Method and apparatus for contextual gesture recognition |
WO2012168886A2 (en) * | 2011-06-09 | 2012-12-13 | Nokia Corporation | Method and apparatus for contextual gesture recognition |
EP2624105A3 (en) * | 2012-02-01 | 2016-04-13 | Logitech Europe S.A. | System and method for spurious signal detection and compensation on an input device |
CN103246368A (en) * | 2012-02-01 | 2013-08-14 | 罗技欧洲公司 | System and method for spurious signal detection and compensation on an input device |
US9554251B2 (en) | 2012-02-06 | 2017-01-24 | Telefonaktiebolaget L M Ericsson | User terminal with improved feedback possibilities |
WO2013119149A1 (en) * | 2012-02-06 | 2013-08-15 | Telefonaktiebolaget L M Ericsson (Publ) | A user terminal with improved feedback possibilities |
US11290762B2 (en) | 2012-11-27 | 2022-03-29 | Apple Inc. | Agnostic media delivery system |
US11070889B2 (en) | 2012-12-10 | 2021-07-20 | Apple Inc. | Channel bar user interface |
US11317161B2 (en) | 2012-12-13 | 2022-04-26 | Apple Inc. | TV side bar user interface |
US11245967B2 (en) | 2012-12-13 | 2022-02-08 | Apple Inc. | TV side bar user interface |
US11297392B2 (en) | 2012-12-18 | 2022-04-05 | Apple Inc. | Devices and method for providing remote control hints on a display |
US9532111B1 (en) | 2012-12-18 | 2016-12-27 | Apple Inc. | Devices and method for providing remote control hints on a display |
US10116996B1 (en) | 2012-12-18 | 2018-10-30 | Apple Inc. | Devices and method for providing remote control hints on a display |
US11822858B2 (en) | 2012-12-31 | 2023-11-21 | Apple Inc. | Multi-user TV user interface |
US11194546B2 (en) | 2012-12-31 | 2021-12-07 | Apple Inc. | Multi-user TV user interface |
WO2014140853A3 (en) * | 2013-03-15 | 2014-12-24 | Orcam Technologies Ltd. | Apparatus and method for automatic action selection based on image context |
US10303348B2 (en) | 2014-06-24 | 2019-05-28 | Apple Inc. | Input device and user interface interactions |
US10732807B2 (en) | 2014-06-24 | 2020-08-04 | Apple Inc. | Input device and user interface interactions |
US10019142B2 (en) | 2014-06-24 | 2018-07-10 | Apple Inc. | Input device and user interface interactions |
US9792018B2 (en) | 2014-06-24 | 2017-10-17 | Apple Inc. | Input device and user interface interactions |
KR20160147012A (en) * | 2014-06-24 | 2016-12-21 | 애플 인크. | Input device and user interface interactions |
US11461397B2 (en) | 2014-06-24 | 2022-10-04 | Apple Inc. | Column interface for navigating in a user interface |
WO2015200537A3 (en) * | 2014-06-24 | 2016-04-21 | Apple Inc. | Input device and user interface interactions |
US11520467B2 (en) | 2014-06-24 | 2022-12-06 | Apple Inc. | Input device and user interface interactions |
US11520858B2 (en) | 2016-06-12 | 2022-12-06 | Apple Inc. | Device-level authorization for viewing content |
US11543938B2 (en) | 2016-06-12 | 2023-01-03 | Apple Inc. | Identifying applications on which content is available |
US11609678B2 (en) | 2016-10-26 | 2023-03-21 | Apple Inc. | User interfaces for browsing content from multiple content applications on an electronic device |
US11966560B2 (en) | 2016-10-26 | 2024-04-23 | Apple Inc. | User interfaces for browsing content from multiple content applications on an electronic device |
WO2019074775A1 (en) * | 2017-10-13 | 2019-04-18 | Microsoft Technology Licensing, Llc | Context based operation execution |
US11445263B2 (en) | 2019-03-24 | 2022-09-13 | Apple Inc. | User interfaces including selectable representations of content items |
US11683565B2 (en) | 2019-03-24 | 2023-06-20 | Apple Inc. | User interfaces for interacting with channels that provide content that plays in a media browsing application |
US11750888B2 (en) | 2019-03-24 | 2023-09-05 | Apple Inc. | User interfaces including selectable representations of content items |
US11467726B2 (en) | 2019-03-24 | 2022-10-11 | Apple Inc. | User interfaces for viewing and accessing content on an electronic device |
US11962836B2 (en) | 2019-03-24 | 2024-04-16 | Apple Inc. | User interfaces for a media browsing application |
US11057682B2 (en) | 2019-03-24 | 2021-07-06 | Apple Inc. | User interfaces including selectable representations of content items |
US11797606B2 (en) | 2019-05-31 | 2023-10-24 | Apple Inc. | User interfaces for a podcast browsing and playback application |
US11863837B2 (en) | 2019-05-31 | 2024-01-02 | Apple Inc. | Notification of augmented reality content on an electronic device |
US11843838B2 (en) | 2020-03-24 | 2023-12-12 | Apple Inc. | User interfaces for accessing episodes of a content series |
US11899895B2 (en) | 2020-06-21 | 2024-02-13 | Apple Inc. | User interfaces for setting up an electronic device |
US11720229B2 (en) | 2020-12-07 | 2023-08-08 | Apple Inc. | User interfaces for browsing and presenting content |
US11934640B2 (en) | 2021-01-29 | 2024-03-19 | Apple Inc. | User interfaces for record labels |
Also Published As
Publication number | Publication date |
---|---|
US20090037849A1 (en) | 2009-02-05 |
WO2009016607A3 (en) | 2009-03-26 |
US8896529B2 (en) | 2014-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8896529B2 (en) | Apparatus, methods, and computer program products providing context-dependent gesture recognition | |
US20200125144A1 (en) | Foldable electronic device for controlling user interface and operating method thereof | |
US20210034210A1 (en) | Method for providing function or content associated with application, and electronic device for carrying out same | |
WO2020258929A1 (en) | Folder interface switching method and terminal device | |
US9395867B2 (en) | Method and system for displaying an image on an electronic device | |
US20170123598A1 (en) | System and method for focus on touch with a touch sensitive screen display | |
JP5925655B2 (en) | Image display control device, image display device, program, and image display method | |
US20140024414A1 (en) | Electronic device, operation control method, and operation control program | |
EP2350800A1 (en) | Live preview of open windows | |
KR20120035529A (en) | Apparatus and method for adaptive gesture recognition in portable terminal | |
WO2018082657A1 (en) | Method for searching for icon, and terminal | |
CN107390931B (en) | Response control method and device for touch operation, storage medium and mobile terminal | |
CN108491148B (en) | Application sharing method and terminal | |
JP2015045583A (en) | Portable communication terminal, information display program, and information display method | |
JP2013522714A (en) | Character input method for portable terminal and portable terminal supporting the same | |
CN113741784A (en) | Payment function switching method and electronic equipment | |
CN108920040B (en) | Application icon sorting method and mobile terminal | |
US11169697B2 (en) | Electronic device and method for displaying contextual information of application | |
CN110795189A (en) | Application starting method and electronic equipment | |
CN111566608B (en) | Apparatus and method for providing functionality associated with a keyboard layout | |
JP7329150B2 (en) | Touch button, control method and electronic device | |
CN108491125B (en) | Operation control method of application store and mobile terminal | |
KR20190122331A (en) | Electronic device for inputting character and operating method thereof | |
CN109684006B (en) | Terminal control method and device | |
JP2013182463A (en) | Portable terminal device, touch operation control method, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 08789516; Country of ref document: EP; Kind code of ref document: A2 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 08789516; Country of ref document: EP; Kind code of ref document: A2 |