EP2475969A2 - Method and system for controlling a user interface of a device using human breath - Google Patents
- Publication number
- EP2475969A2 (application EP10816236A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- interface
- user interface
- processing module
- operable
- human breath
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Definitions
- Certain embodiments of the invention relate to controlling a computer or electronic system. More specifically, certain embodiments of the invention relate to a method and system for controlling a user interface of a device using human breath.
- as voice connections fulfill the basic need to communicate, and mobile voice connections continue to filter even further into the fabric of everyday life, mobile access to services via the Internet has become the next step in the mobile communication revolution.
- most mobile devices are equipped with a user interface that allows users to access the services provided via the Internet.
- some mobile devices may have browsers, and software and/or hardware buttons may be provided to enable navigation and/or control of the user interface.
- Some mobile devices such as smartphones are equipped with touch-screen capability that allows users to navigate or control the user interface by touching with one hand while the device is held in the other hand.
- a system and/or method is provided for controlling a user interface of a device using human breath, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
- FIG. 1A is a block diagram of an exemplary system for controlling a user interface of a plurality of devices using human breath, in accordance with an embodiment of the invention.
- FIG. 1B is a block diagram of an exemplary sensing module to detect human breath, in accordance with an embodiment of the invention.
- FIG. 1C is a block diagram of another embodiment of an exemplary system for controlling a user interface of a device using human breath, in accordance with an embodiment of the invention.
- FIG. 1D is a block diagram of an exemplary processor interacting with a device being controlled, in accordance with an embodiment of the invention.
- FIG. 1E is a block diagram of an exemplary system for side loading of information between two or more devices, in accordance with an embodiment of the invention.
- FIG. 2A is a diagram illustrating an exemplary MEMS sensing and processing module embedded in a device, in accordance with an embodiment of the invention.
- FIG. 2B is a diagram illustrating an exemplary MEMS sensing and processing module located on a stand alone device that is communicatively coupled to a device via a USB interface, in accordance with an embodiment of the invention.
- FIG. 2C is a diagram illustrating an exemplary MEMS sensing and processing module located on a stylus, in accordance with an embodiment of the invention.
- FIG. 2D is a diagram illustrating an exemplary MEMS sensing and processing module located on a headset for military personnel, in accordance with an embodiment of the invention.
- FIG. 2E is a diagram illustrating an exemplary MEMS sensing and processing module located on a headrest of a seating apparatus, in accordance with an embodiment of the invention.
- FIG. 2F is a diagram illustrating an exemplary MEMS sensing and processing module located inside an automobile, in accordance with an embodiment of the invention.
- FIG. 2G is a diagram illustrating an exemplary MEMS sensing and processing module located on detachable eyewear, in accordance with an embodiment of the invention.
- FIG. 2H is a diagram illustrating an exemplary MEMS sensing and processing module located on a neckset, in accordance with an embodiment of the invention.
- FIG. 2I is a diagram illustrating an exemplary MEMS sensing and processing module located on a clip, in accordance with an embodiment of the invention.
- FIG. 2J is a diagram illustrating an exemplary MEMS sensing and processing module embedded in a fabric, in accordance with an embodiment of the invention.
- FIG. 3A is a diagram illustrating an exemplary electronic device that may be controlled via a sectional user interface, in accordance with an embodiment of the invention.
- FIG. 3B is a diagram illustrating several exemplary configurations of a sectional user interface, in accordance with an embodiment of the invention.
- FIG. 3C is a diagram illustrating several exemplary fixed regions of a sectional user interface, in accordance with an embodiment of the invention.
- FIG. 3D is a diagram illustrating several exemplary content regions of a sectional user interface, in accordance with an embodiment of the invention.
- FIG. 3E illustrates interacting with a sectional user interface of an electronic device via respiratory and tactual input, in accordance with an embodiment of the invention.
- FIG. 3F illustrates an exemplary sectional user interface which may provide an indication of a sequence of categories and/or icons when scrolling, in accordance with an embodiment of the invention.
- FIG. 3G illustrates interacting with an exemplary sectional user interface via respiratory and tactual input, in accordance with an embodiment of the invention.
- FIG. 3H illustrates another exemplary sectional user interface which may provide an indication of a sequence of categories and/or icons when scrolling, in accordance with an embodiment of the invention.
- FIG. 4A illustrates launching an application via a user interface utilizing respiratory and tactual input, in accordance with an embodiment of the invention.
- FIG. 4B illustrates exemplary interaction with an application running on an electronic device, in accordance with an embodiment of the invention.
- FIG. 4C illustrates exemplary interaction with an application running on an electronic device, in accordance with an embodiment of the invention.
- FIG. 5 is a block diagram of an exemplary user interface interacting with a MEMS sensing and processing module and a host system, in accordance with an embodiment of the invention.
- FIG. 6 is a flowchart illustrating exemplary steps for processing signals that control a device using human breath.
- FIG. 7A is a flow chart illustrating exemplary steps for controlling a user interface of a device using human breath, in accordance with an embodiment of the invention.
- FIG. 7B is a flow chart illustrating exemplary steps for side loading of information, in accordance with an embodiment of the invention.
- Certain aspects of the invention may be found in a method and system for controlling a user interface of a device using human breath.
- exemplary aspects of the invention may comprise a device comprising an embedded micro-electro-mechanical system (MEMS) sensing and processing module.
- the MEMS sensing and processing module may detect movement caused by expulsion of human breath by a user.
- in response to the detection, one or more control signals may be generated.
- the generated control signals may be utilized to control the user interface of the device and may enable navigation and/or selection of components in the user interface.
- the generated one or more control signals may be communicated to the device being controlled via one or more of an external memory interface, a universal asynchronous receiver transmitter (UART) interface, an enhanced serial peripheral interface (eSPI), a general purpose input/output (GPIO) interface, a pulse-code modulation (PCM) and/or an inter-IC sound (I²S) interface, an inter-integrated circuit (I²C) bus interface, a universal serial bus (USB) interface, a Bluetooth interface, a ZigBee interface, an IrDA interface, and/or a wireless USB (W-USB) interface.
- the expulsion of the human breath may occur in open space, and the detection of the movement caused by the expulsion may occur without the use of a channel.
- the user interface may be, for example, a graphical user interface (GUI).
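As a purely illustrative sketch (not part of the patent), the translation of a detected breath expulsion into navigation or selection control signals could be modeled as follows; the function names, the direction/velocity inputs, and the threshold value are all assumptions:

```python
# Hypothetical sketch: map a sensed breath event (direction + velocity)
# to a UI control signal. Thresholds and names are invented for
# illustration and are not specified by the patent.
from dataclasses import dataclass


@dataclass
class ControlSignal:
    action: str       # e.g. "scroll_up" or "select"
    magnitude: float  # strength of the sensed expulsion


def signal_from_breath(direction: str, velocity: float,
                       select_threshold: float = 2.0) -> ControlSignal:
    """A strong puff is treated as a selection; a gentler directed
    puff produces a scrolling signal in that direction."""
    if velocity >= select_threshold:
        return ControlSignal("select", velocity)
    return ControlSignal(f"scroll_{direction}", velocity)
```

In this sketch, a single scalar threshold separates navigation from selection; a real module could, of course, use richer features of the sensed signal.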
- FIG. 1A is a block diagram of an exemplary system for controlling a user interface of a plurality of devices using human breath, in accordance with an embodiment of the invention.
- Referring to FIG. 1A, there is shown a user 102 and a plurality of devices to be controlled, such as a multimedia device 106a, a cellphone/smartphone/dataphone 106b, a personal computer (PC), laptop or a notebook computer 106c, a display device 106d and/or a television (TV)/game console/other platform 106e.
- Each of the plurality of devices to be controlled such as a multimedia device 106a, a cellphone/smartphone/dataphone 106b, a personal computer (PC), laptop or a notebook computer 106c, a display device 106d and/or a television (TV)/game console/other platform 106e may comprise an embedded micro-electro-mechanical system (MEMS) sensing and processing module 104.
- the multimedia device 106a may comprise a user interface 107a
- the cellphone/smartphone/dataphone 106b may comprise a user interface 107b
- the personal computer (PC), laptop or a notebook computer 106c may comprise a user interface 107c.
- the display device 106d may comprise a user interface 107d and the television (TV)/game console/other platform 106e may comprise a user interface 107e.
- Each of the plurality of devices to be controlled may be wired or wirelessly connected to a plurality of other devices 108 for loading of information via, for example, side loading, or loading via a peer-to-peer connection and/or a network connection, and by wired and/or wireless communication.
- Exemplary other devices 108 may comprise game consoles, immersive or 3D reality devices, and/or telematic devices.
- Telematic devices refer to devices comprising integrated computing, wireless communication and/or global navigation satellite system capabilities, which enable sending, receiving and/or storing of information over networks.
- the user interface may enable interacting with the device being controlled by one or more inputs, for example, expulsion of a fluid such as air, tactual inputs such as button presses, audio inputs such as voice commands, and/or movements of the electronic device 202 such as those detected by an accelerometer and/or gyroscope.
- the MEMS sensing and processing module 104 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be operable to generate one or more control signals.
- the MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals.
- the generated one or more control signals may be operable to control a user interface of one or more of a plurality of devices, such as the user interface 107a of the multimedia device 106a, the user interface 107b of the cellphone/smartphone/dataphone 106b, the user interface 107c of the PC, laptop or a notebook computer 106c, the user interface 107d of the display device 106d, the user interface 107e of the TV/game console/other platform 106e, and the user interfaces of the mobile multimedia player and/or a remote controller.
- the user interface may be, for example, a graphical user interface (GUI). Any information and/or data presented on a display, including programs and/or applications, may be part of the user interface.
- the detection of the movement caused by expulsion of human breath may occur without use of a channel.
- the detection of the movement caused by expulsion of human breath may be responsive to the expulsion of human breath into open space, which is then sensed.
- the MEMS sensing and processing module 104 may be operable to navigate within the user interface of one of more of the plurality of devices, such as a handheld device, for example, a multimedia device 106a, a cellphone/smartphone/dataphone 106b, a PC, laptop or a notebook computer 106c, a display device 106d, and/or a TV/game console/other platform 106e via the generated one or more control signals.
- the MEMS sensing and processing module 104 may be operable to select one or more components within the user interface of the plurality of devices via the generated one or more control signals.
- the generated one or more control signals may comprise one or more of a wired and/or a wireless signal.
- one or more of the plurality of devices such as a handheld device, for example, a multimedia device 106a and/or a cellphone/smartphone/dataphone 106b and/or a PC, laptop or a notebook computer 106c may be operable to receive one or more inputs defining the user interface from another device 108.
- the other device 108 may be one or more of a PC, laptop or a notebook computer 106c and/or a handheld device, for example, a multimedia device 106a and/or a cell phone/smartphone/dataphone 106b.
- data may be transferred from the other device 108 to the cellphone/smartphone/dataphone 106b and this data may be associated or mapped to media content that may be remotely accessed by the cellphone/smartphone/dataphone 106b via a service provider such as a cellular or PCS service provider.
- the transferred data that is associated or mapped to media content may be utilized to customize the user interface 107b of the cellphone/smartphone/dataphone 106b.
- media content associated with one or more received inputs may become an integral part of the user interface of the device being controlled.
- the associating and/or mapping may be performed on either the other device 108 and/or the cellphone/smartphone/dataphone 106b. In instances where the associating and/or mapping is performed on the other device 108, the associated and/or mapped data may be transferred from the other device 108 to the cellphone/smartphone/dataphone 106b.
- an icon transferred from the other device 108 to the cellphone/smartphone/dataphone 106b may be associated or mapped to media content such as an RSS feed or a markup language such as HTML or XML, which may be remotely accessed by the cellphone/smartphone/dataphone 106b via the service provider of the cellphone/smartphone 106b. Accordingly, when the user 102 blows on the MEMS sensing and processing module 104, control signals generated by the MEMS sensing and processing module 104 may navigate to the icon and select the icon.
- the RSS feed or markup language may be accessed via the service provider of the cellphone/smartphone/dataphone 106b and corresponding RSS feed or markup language content may be displayed on the user interface 107b.
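The association of a transferred icon with remotely accessible media content, as described in the bullets above, could be sketched as a simple mapping; the identifiers and the example feed URL below are hypothetical and not taken from the patent:

```python
# Illustrative sketch: an icon transferred to the controlled device is
# "associated or mapped" to remote media content (e.g. an RSS feed URL).
# Selecting the icon via breath-generated control signals then resolves
# to the mapped content location. All names are assumptions.
icon_map: dict[str, str] = {}


def map_icon(icon_id: str, content_url: str) -> None:
    """Record the association between an icon and remote media content."""
    icon_map[icon_id] = content_url


def select_icon(icon_id: str) -> str:
    """Return the content location the selected icon was mapped to."""
    return icon_map[icon_id]


# hypothetical example mapping
map_icon("news", "https://example.com/rss.xml")
```

The mapping could equally be built on the other device 108 and transferred wholesale, matching the two alternatives the description allows.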
- United States Application Serial No. 12/056,187, filed on March 26, 2008 discloses an exemplary method and system for customizing a user interface of a device and is hereby incorporated herein by reference in its entirety.
- a user 102 may exhale into open space and the exhaled breath or air may be sensed by one or more detection devices or detectors, such as one or more sensors, sensing members and/or sensing segments in the MEMS sensing and processing module 104.
- the MEMS sensing and processing module 104 may be operable to detect movement caused by expulsion of human breath by the user 102.
- One or more electrical, optical and/or magnetic signals may be generated by one or more detection devices or detectors within the MEMS sensing and processing module 104 in response to the detection of movement caused by expulsion of human breath.
- the processor firmware within the MEMS sensing and processing module 104 may be operable to process the received electrical, optical and/or magnetic signals from the one or more detection device(s) or detector(s) utilizing various algorithms and generate one or more control signals to the device being controlled, for example, the multimedia device 106a.
- the generated one or more control signals may be communicated to the device being controlled, for example, the multimedia device 106a via a wired and/or a wireless signal.
- the processor in the device being controlled may utilize the communicated control signals to control the user interface of the device being controlled, such as a user interface 107a of the multimedia device 106a, a user interface 107b of the cellphone/smartphone/dataphone 106b, a user interface 107c of the personal computer (PC), laptop or a notebook computer 106c, a user interface 107d of the display device 106d, a user interface 107e of the TV/game console/other platform 106e, and a user interface of a mobile multimedia player and/or a remote controller.
- FIG. 1B is a block diagram of an exemplary detection device or detector to detect human breath, in accordance with an embodiment of the invention.
- the sensing module 110 may comprise a sensor control chip 109 and a plurality of sensors, for example, 111a, 111b, 111c, and 111d.
- the invention may not be so limited and the sensing module 110 may comprise more or fewer than the number of sensors or sensing members or segments shown in FIG. 1B without limiting the scope of the invention. Accordingly, any number of detectors and sources may be utilized according to the desired size, sensitivity, and resolution.
- the type of sources and detectors may comprise other sensing mechanisms, other than visible light.
- piezoelectric, ultrasonic, Hall effect, electrostatic, and/or permanent or electro-magnet sensors may be activated by deflected MEMS members to generate a signal to be communicated to the sensor control chip 109.
- the sensing module 110 may be an electrochemical sensor or any other type of breath analyzing sensor, for example.
- the plurality of sensors or sensing members or segments 111a-d may be an integral part of one or more MEMS devices that may enable the detection of various velocities of air flow from the user's 102 breath.
- the plurality of sensors or sensing members or segments 111a-d may be operable to detect kinetic energy and/or movement caused by the expulsion of human breath by the user 102.
- the sensor control chip 109 may be operable to generate an electrical, optical and/or magnetic signal that may be communicated to the processor in response to the detection of kinetic energy and/or movement caused by expulsion of human breath.
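A sensor control chip aggregating the deflections of several sensing members (cf. sensors 111a-d) into one detection event could be sketched as follows; the threshold and the tuple-shaped result are assumptions made only for illustration:

```python
# Hypothetical sketch of the sensor-control-chip logic: combine the
# per-member deflection readings into a single breath-detection event,
# reporting which member deflected most. Numbers are invented.
from typing import Optional, Tuple


def detect_breath(readings: list[float],
                  threshold: float = 0.3) -> Tuple[bool, Optional[int]]:
    """Return (detected, index_of_strongest_member).

    If no member's deflection exceeds the threshold, report no event.
    """
    peak = max(readings)
    if peak < threshold:
        return (False, None)
    return (True, readings.index(peak))
```

The index of the most-deflected member is one plausible way such a chip could encode the direction of air flow before handing the event to the processing module.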
- FIG. 1C is a block diagram of another embodiment of an exemplary system for controlling a user interface of a device using human breath, in accordance with an embodiment of the invention.
- a user 102 and a device being controlled 106, such as a multimedia device 106a, a cellphone/smartphone/dataphone 106b, a PC, laptop or a notebook computer 106c, a display device 106d and/or a TV/game console/other platform 106e.
- the device being controlled 106 may be wired and/or wirelessly connected to a plurality of other devices 108 for side loading of information.
- the device being controlled 106 may comprise a MEMS sensing and processing module 104 and a user interface 107.
- the MEMS sensing and processing module 104 may comprise a sensing module 110, a processing module 112 and passive devices 113.
- the passive devices 113, which may comprise resistors, capacitors and/or inductors, may be embedded within a substrate material of the MEMS sensing and processing module 104.
- the processing module 112 may comprise, for example, an ASIC.
- the sensing module 110 may generally be referred to as a detection device or detector, and may comprise one or more sensors, sensing members and/or sensing segments that may be operable to detect kinetic energy and/or movement caused by the expulsion of human breath by the user 102.
- the sensing module 110 may be operable to generate an electrical, optical and/or magnetic signal that may be communicated to the processing module 112 in response to the detection of kinetic energy and/or movement caused by expulsion of human breath.
- the processing module 112 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to receive the generated electrical signal from the sensing module 110 and generate one or more control signals to the device being controlled 106.
- the processing module 112 may comprise one or more analog to digital converters that may be operable to translate the sensed signal to one or more digital signals, which may be utilized to generate the one or more control signals.
- the generated one or more control signals may be operable to control the user interface 107 of the device being controlled 106.
- the generated one or more signals from the MEMS sensing and processing module 104 may be utilized to control the user interface 107.
- the one or more signals generated by the MEMS sensing and processing module 104 may be operable to control a pointer on the device being controlled 106 such that items in the user interface 107 may be selected and/or manipulated.
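The pointer control described in the bullet above might be sketched as follows; the direction names, the step size, and the coordinate convention (y grows downward, as on typical screens) are illustrative assumptions:

```python
# Illustrative sketch: breath-generated control signals move a pointer
# within the user interface so that items can be selected and/or
# manipulated. Signal names and step size are invented for this example.
class Pointer:
    def __init__(self) -> None:
        self.x, self.y = 0, 0

    def apply(self, signal: str, step: int = 1) -> None:
        """Move the pointer according to a directional control signal."""
        dx, dy = {"left": (-step, 0), "right": (step, 0),
                  "up": (0, -step), "down": (0, step)}[signal]
        self.x += dx
        self.y += dy
```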
- the device being controlled may be operable to receive one or more inputs from the other devices 108, which may be utilized to customize or define the user interface 107.
- the other device 108 may be one or more of a PC, laptop or a notebook computer 106c and/or a handheld device, for example, a multimedia device 106a and/or a cellphone/smartphone/dataphone 106b.
- the other device 108 may be similar to or different from the type of device that is being controlled 106.
- a processor in the other device 108 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106.
- a processor in the device being controlled 106 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106.
- United States Application Serial No. 12/056,187, filed on March 26, 2008 discloses an exemplary method and system for customizing a user interface of a device and is hereby incorporated herein by reference in its entirety.
- FIG. 1D is a block diagram of an exemplary processor interacting with a device being controlled, in accordance with an embodiment of the invention.
- a device being controlled 106 such as a multimedia device 106a, a cellphone/smartphone/dataphone 106b, a PC, laptop or a notebook computer 106c, a display device 106d and/or a TV/game console/other platform 106e.
- the device being controlled 106 may comprise a processing module 112, a communication module 120, a processor 122, memory 123, firmware 124, a display 126, and a user interface 128.
- the processing module 112 may be an ASIC and may comprise one or more analog to digital converters (ADCs) 114, processor firmware 116, and a communication module 118.
- the device being controlled 106 may be wired and/or wirelessly connected to a plurality of other devices 108 for loading of information via, for example, side loading, or loading via a peer-to-peer connection and/or a network connection, and by wired and/or wireless communication.
- the processing module 112 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to receive a digital sensing signal and/or an analog sensing signal from the sensing module 110.
- the ADC 114 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to receive the generated analog sensing signal from the sensing module 110 and convert the received signal into a digital signal.
- the processor firmware 116 may comprise suitable logic, circuitry, and/or code that may be operable to receive and process the digital signal from the ADC 114 and/or the digital sensing signal from the sensing module 110 utilizing a plurality of algorithms to generate one or more control signals.
- the processor firmware 116 may be operable to read, store, calibrate, filter, model, calculate and/or compare the outputs of the sensing module 110.
- the processor firmware 116 may also be operable to incorporate artificial intelligence (AI) algorithms to adapt to a particular user's 102 breathing pattern.
- the processor firmware 116 may be operable to generate one or more control signals to the device being controlled 106 based on processing the received digital signals.
- the generated one or more control signals may be operable to control a user interface of the device being controlled 106, for example, scrolling, zooming, and/or 3-D navigation within the device being controlled 106.
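The firmware pipeline just described (read, calibrate, filter, then emit a control signal such as a zoom command) can be sketched in miniature; the baseline, gain, noise floor, and command names below are all invented for illustration and are not specified by the patent:

```python
# Hedged sketch of a firmware-style processing step: average raw ADC
# samples, compare the result against a calibrated baseline, and emit a
# zoom control signal only when the deviation clears a noise floor.
from typing import Optional


def process_samples(samples: list[int], baseline: int = 512,
                    gain: float = 0.01) -> Optional[str]:
    """Map the filtered deviation from baseline to a control signal."""
    avg = sum(samples) / len(samples)
    deviation = (avg - baseline) * gain
    if abs(deviation) < 0.5:
        return None  # below the noise floor: no control signal emitted
    return "zoom_in" if deviation > 0 else "zoom_out"
```

A simple mean is used here as the filter; production firmware would more plausibly use calibration data and a learned model of the user's breathing pattern, as the preceding bullets suggest.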
- the communication module 118 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to receive and communicate the generated one or more control signals to the communication module 120.
- the communication modules 118 and 120 may support a plurality of interfaces.
- the communication modules 118 and 120 may support an external memory interface, a universal asynchronous receiver transmitter (UART) interface, an enhanced serial peripheral interface (eSPI), a general purpose input/output (GPIO) interface, a pulse-code modulation (PCM) and/or an inter-IC sound (I²S) interface, an inter-integrated circuit (I²C) bus interface, a universal serial bus (USB) interface, a Bluetooth interface, a ZigBee interface, an IrDA interface, and/or a wireless USB (W-USB) interface.
- the communication module 120 may be operable to receive the communicated control signals via a wired and/or a wireless signal.
- the processor 122 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to utilize the received one or more control signals to control the user interface 128 and/or the display 126.
- the memory may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to store data on the device being controlled 106.
- the firmware 124 may comprise a plurality of drivers and operating system (OS) libraries to convert the received control signals into functional commands.
- the firmware 124 may be operable to map local functions, and convert received control signals into compatible data, such as user customization features, applets, and/or plugins to control the user interface 128.
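The conversion of received control signals into functional commands by the firmware 124 can be pictured as a driver dispatch table. This is a hypothetical sketch: the signal codes and command names below are assumptions for illustration, not defined by the patent.

```python
# Hypothetical sketch of the firmware's mapping step: raw control-signal
# codes received from the MEMS module are converted into local UI commands
# via a driver table. Codes and command names are invented.

COMMAND_TABLE = {
    0x01: "scroll_up",
    0x02: "scroll_down",
    0x03: "zoom_in",
    0x04: "zoom_out",
}

def convert_control_signal(code):
    """Map a raw control-signal code to a functional UI command."""
    return COMMAND_TABLE.get(code, "ignore")
```

Unrecognized codes fall through to "ignore", mirroring the idea that only mapped local functions are invoked.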
- the device being controlled 106 may be operable to receive one or more inputs defining the user interface 128 from another device 108.
- the other device 108 may comprise a user interface 129 and a processor 125.
- the other device 108 may be one or more of a PC, laptop or a notebook computer 106c and/or a handheld device, for example, a multimedia device 106a and/or a cellphone/smartphone/dataphone 106b.
- data may be transferred from the other device 108 to the device being controlled, such as the cellphone/smartphone/dataphone 106b and this data may be associated or mapped to media content that may be remotely accessed by the cellphone/smartphone/dataphone 106b via a service provider such as a cellular or PCS service provider.
- the transferred data that is associated or mapped to media content may be utilized to customize the user interface 128 of the device being controlled, such as the cellphone/smartphone/dataphone 106b.
- media content associated with one or more received inputs may become an integral part of the user interface 128 of the device being controlled 106.
- FIG. 1E is a block diagram of an exemplary system for side loading of information between two or more devices, in accordance with an embodiment of the invention.
- Referring to FIG. 1E, there is shown a carrier network 124, a plurality of devices being controlled 106, such as a plurality of mobile phones 130a, 130b, 130c and 130d, and a PC, laptop or a notebook computer 132 connected to a network 134, such as the Internet.
- the network 134 may be coupled to a web server 136, a wireless carrier portal 138, a web portal 140 and/or a database 142.
- Each of the plurality of devices being controlled 106 may have a user interface.
- the mobile phone 130a may have a user interface 131a
- the mobile phone 130b may have a user interface 131b
- the mobile phone 130c may have a user interface 131c
- the mobile phone 130d may have a user interface 131d.
- the PC, laptop or a notebook computer 132 may have a user interface 133.
- the carrier network 124 may be a wireless access carrier network.
- Exemplary carrier networks may comprise 2G, 2.5G, 3G, 4G, IEEE 802.11, IEEE 802.16 and/or any other suitable network capable of handling voice, video and/or data communication.
- the plurality of devices being controlled 106 may be wirelessly connected to the carrier network 124.
- One of the devices being controlled, such as mobile phone 130a may be connected to a plurality of mobile phones 130b, 130c and 130d via a peer-to-peer (P2P) network, for example.
- the device being controlled, such as mobile phone 130a may be communicatively coupled to a PC, laptop, or a notebook computer 132 via a wired or a wireless network.
- the mobile phone 130a may be communicatively coupled to the PC, laptop, or a notebook computer 132 via an infrared (IR) link, an optical link, a USB link, a wireless USB link, a Bluetooth link and/or a ZigBee link.
- the PC, laptop, or a notebook computer 132 may be communicatively coupled to the network 134, for example, the Internet network 134 via a wired or a wireless network.
- the plurality of devices being controlled, such as the plurality of mobile phones 130a, 130b, 130c and 130d may be wirelessly connected to the Internet network 134.
- the web server 136 may comprise suitable logic, circuitry, and/or code that may be operable to receive, for example, HTTP and/or FTP requests from clients or web browsers installed on the PC, laptop, or a notebook computer 132 via the Internet network 134, and generate HTTP responses along with optional data contents, such as HTML documents and linked objects, for example.
- the wireless carrier portal 138 may comprise suitable logic and/or code that may be operable to function as a point of access to information on the Internet network 134 via a mobile device, such as a mobile phone 130a, for example.
- the wireless carrier portal 138 may be, for example, a website that may be operable to provide a single function via a mobile web page, for example.
- the web portal 140 may comprise suitable logic and/or code that may be operable to function as a point of access to information on the Internet 134.
- the web portal 140 may be, for example, a site that may be operable to provide a single function via a web page or site.
- the web portal 140 may present information from diverse sources in a unified way such as e-mail, news, stock prices, infotainment and various other features.
- the database 142 may comprise suitable logic, circuitry, and/or code that may be operable to store a structured collection of records or data, for example.
- the database 142 may be operable to utilize software to organize the storage of data.
- the device being controlled such as the mobile phone 130a may be operable to receive one or more inputs defining a user interface 128 from another device, such as the PC, laptop, or a notebook computer 132.
- One or more processors 122 within the device being controlled 106 may be operable to customize the user interface 128 of the device being controlled, such as the mobile phone 130a so that content associated with one or more received inputs may become an integral part of the user interface 128 of the device being controlled, such as the mobile phone 130a.
- the mobile phone 130a may be operable to access content directly from the PC, laptop, or a notebook computer 132 rather than from the carrier network 124. This method of uploading and/or downloading customized information directly from the PC, laptop, or a notebook computer 132 rather than from the carrier network 124 may be referred to as side loading.
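The side-loading decision described above can be sketched as a simple source-selection rule. This is an illustrative assumption about precedence, not taken from the patent: the source names and ordering are invented.

```python
# Sketch of the side-loading idea: prefer fetching customized UI content
# directly from a locally connected PC over downloading it via the carrier
# network. Source names and precedence are illustrative assumptions.

def choose_content_source(pc_connected, carrier_available):
    """Side loading: a local PC link takes precedence over the carrier network."""
    if pc_connected:
        return "local_pc"        # side load over USB/Bluetooth/IR
    if carrier_available:
        return "carrier_network" # fall back to over-the-air download
    return "unavailable"
```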
- the user interface 128 may be created, modified and/or organized by the user 102.
- the user 102 may choose, select, create, arrange, manipulate and/or organize content to be utilized for the user interface 128 and/or one or more content components.
- the user 102 may organize the content components on a screen and may choose content such as personal photographs for background and/or icon images.
- the user 102 may create and/or modify the way content components are activated or presented to the user 102.
- the user 102 may make, import and/or edit icons and/or backgrounds for the user interface 128.
- the user 102 may associate and/or map the icon to a function so that the user 102 may enable or activate a function via the icon.
- Exemplary icons may enable functions such as hyper-links, book marks, programs/applications, shortcuts, widgets, RSS or markup language feeds or information, and/or favorite buddies.
- the user 102 may organize and/or arrange content components within the user interface 128.
- the icons may be organized by category into groups. Groups of icons such as content components may be referred to as affinity banks, for example.
- the processor 125 in the other device 108 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106.
- the processor 122 in the device being controlled 106 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106.
- the processor 122 may be operable to associate and/or map an icon to a function so that the user 102 may enable or activate a function via the icon and may organize and/or arrange content components within the user interface 128.
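The icon-to-function mapping and the grouping of icons by category into "affinity banks" can be modeled as follows. This is a minimal sketch under stated assumptions: the icon names, categories, and function names are invented for illustration.

```python
# Illustrative model of icon customization: each icon is mapped to a
# function, and icons are grouped by category into "affinity banks".
# All names below are hypothetical.

def build_affinity_banks(icons):
    """Group (icon_name, category, function) tuples by category.

    Returns a dict of category -> {icon_name: function}, so that
    activating an icon can look up its mapped function.
    """
    banks = {}
    for name, category, function in icons:
        banks.setdefault(category, {})[name] = function
    return banks
```

For example, a "mail" icon in a "communication" bank could map to a function that launches the email client.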
- Creation, modification and/or organization of the user interface 128 and/or content components may be performed on the device being controlled, such as mobile phone 130a and/or may be performed on another device such as the PC, laptop, or a notebook computer 132.
- a user screen and/or audio that may be created, modified and/or organized on another device, such as the PC, laptop, or a notebook computer 132 may be side loaded to the device being controlled, such as mobile phone 130a.
- the side loaded user interface 128 may be modified and/or organized on the device being controlled, such as mobile phone 130a.
- a user interface 128 may be side loaded from the PC, laptop, or a notebook computer 132 to the mobile phone 130a and may be customized on the mobile phone 130a.
- One or more tools may enable creation, modification and/or organization of the user interface 128 and/or audio or visual content components.
- FIG. 2A is a diagram illustrating an exemplary MEMS sensing and processing module embedded in a device, in accordance with an embodiment of the invention.
- a user 102 and a device being controlled, such as a cellphone/smartphone/dataphone 106b.
- the cellphone/smartphone/dataphone 106b may comprise a user interface 107b, and an embedded MEMS sensing and processing module 104.
- the user 102 may be enabled to exhale into open space and onto the MEMS sensing and processing module 104.
- the MEMS sensing and processing module 104 may be operable to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be operable to generate one or more control signals.
- the MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals.
- the generated one or more control signals may be operable to control a user interface 107b of the cellphone/smartphone/dataphone 106b.
- the MEMS sensing and processing module 104 may be embedded in an interactive kiosk or panel, for example, an ATM machine.
- a user 102 may be enabled to blow a puff of air at the MEMS sensing and processing module 104 that is embedded in the interactive kiosk in order to access and/or interact with a user interface of the interactive kiosk, for example.
- FIG. 2B is a diagram illustrating an exemplary MEMS sensing and processing module located on a stand alone device that is communicatively coupled to a device via a USB interface, in accordance with an embodiment of the invention.
- a stand alone device 262 and another device, such as the PC, laptop, or a notebook computer 132.
- the stand alone device 262 may be placed on any suitable surface, for example, on a table or desk top 263.
- the stand alone device 262 may comprise a flexible support structure 264.
- the support structure 264 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the invention may not be so limited and the MEMS sensing and processing module 104 may be located at other locations on the stand alone device 262, for example, in a base of the stand alone device 262.
- the MEMS sensing and processing module 104 may be communicatively coupled to the PC, laptop, or a notebook computer 132 via a USB interface 265, for example.
- the MEMS sensing and processing module 104 may be operable to detect movement caused by the expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of a fluid such as air from human breath, the MEMS sensing and processing module 104 may be operable to generate one or more control signals.
- the MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals.
- the generated one or more control signals may be communicated to another device to be controlled, such as PC, laptop, or a notebook computer 132.
- the generated one or more control signals may be operable to control a user interface 133 of the PC, laptop, or a notebook computer 132.
- FIG. 2C is a diagram illustrating an exemplary MEMS sensing and processing module located on a stylus, in accordance with an embodiment of the invention.
- a user 102 and a device being controlled, such as a cellphone/smartphone/dataphone 106b.
- the cellphone/smartphone/dataphone 106b may comprise a user interface 107b, and a stylus 202.
- the stylus 202 may be retractable, collapsible, pivotable about an axis or axes and/or flexible and may be enclosed within the body of the cellphone/smartphone/dataphone 106b. Notwithstanding, the stylus 202 may be a foldable device that may be clipped to the body of the cellphone/smartphone/dataphone 106b without limiting the scope of the invention.
- the stylus 202 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the user 102 may be enabled to retract the stylus 202 and exhale into open space and onto the MEMS sensing and processing module 104.
- the MEMS sensing and processing module 104 may be operable to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be operable to generate one or more control signals.
- the MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals.
- the generated one or more control signals may be operable to control a user interface 107b of the cellphone/smartphone/dataphone 106b.
- FIG. 2D is a diagram illustrating an exemplary MEMS sensing and processing module located on a headset for military personnel, in accordance with an embodiment of the invention.
- a user 102 may wear a detachable helmet 208.
- the detachable helmet 208 may comprise detachable eyewear 204, a detachable microphone 206, and a detachable headset 210.
- the detachable headset 210 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the detachable eyewear 204 may comprise night vision and/or infrared vision capabilities, for example.
- the detachable microphone 206 may be utilized to communicate with other users, for example.
- the user 102 may be enabled to exhale into open space and the MEMS sensing and processing module 104 may be operable to sense or detect the exhalation. The exhalation may occur from the nostrils and/or the mouth of the user 102.
- the MEMS sensing and processing module 104 may be operable to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be operable to generate one or more control signals. The generated one or more control signals may be operable to control a user interface of the device being controlled such as a user interface 107a of the multimedia device 106a, a user interface 107b of the cellphone/smartphone/dataphone 106b, a user interface 107c of the personal computer (PC), laptop or a notebook computer 106c and/or a user interface 107d of the display device 106d.
- FIG. 2E is a diagram illustrating an exemplary MEMS sensing and processing module located on a headrest of a seating apparatus, in accordance with an embodiment of the invention.
- a seating apparatus 220 may comprise a headrest 222 and a backrest 226.
- the headrest 222 may comprise a detachable headset 224.
- the user 102 may be enabled to sit in the seating apparatus 220.
- the detachable headset 224 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the user 102 may be enabled to exhale into open space and onto the MEMS sensing and processing module 104.
- the seating apparatus 220 may be located inside a car or any other automobile or vehicle, for example. Notwithstanding, the invention may not be so limited and the MEMS sensing and processing module 104 may be located at other locations without limiting the scope of the invention.
- the MEMS sensing and processing module 104 may be operable to detect movement caused by expulsion of human breath by the user 102 seated in the seating apparatus 220. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be operable to generate one or more control signals.
- the generated one or more control signals may be operable to control a user interface of the device being controlled such as a user interface 107a of the multimedia device 106a, a user interface 107b of the cellphone/smartphone/dataphone 106b, a user interface 107c of the personal computer (PC), laptop or a notebook computer 106c, a user interface 107d of the display device 106d, and/or the user interface of a multimedia player, such as an audio and/or video player.
- FIG. 2F is a diagram illustrating an exemplary MEMS sensing and processing module located inside an automobile, in accordance with an embodiment of the invention.
- the automobile 230 may comprise a visor 232 and a steering wheel 234.
- the visor 232 may comprise a flexible support structure 233.
- the support structure 233 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the steering wheel 234 may comprise a flexible support structure 235.
- the support structure 235 may comprise the MEMS sensing and processing module 104 located on one end, for example. Notwithstanding, the invention may not be so limited and the MEMS sensing and processing module 104 may be located at other locations within the automobile 230 without limiting the scope of the invention.
- the user 102 may be seated in the seat behind the steering wheel 234, with the processing module 104 mounted on the steering wheel 234.
- the user 102 may be enabled to exhale into open space and onto the MEMS sensing and processing module 104.
- the MEMS sensing and processing module 104 may be operable to detect movement caused by expulsion of human breath by the user 102.
- the MEMS sensing and processing module 104 may be operable to generate one or more control signals to control a user interface of the device being controlled such as a user interface 107a of the multimedia device 106a, a user interface 107b of the cellphone/smartphone/dataphone 106b, a user interface 107c of the personal computer (PC), laptop or a notebook computer 106c, a user interface 107d of the display device 106d, and/or the user interface of a multimedia or other device, such as an audio and/or video player or a navigation (e.g., GPS) device.
- FIG. 2G is a diagram illustrating an exemplary MEMS sensing and processing module located on detachable eyewear, in accordance with an embodiment of the invention.
- the user 102 may wear detachable goggles or any other type of eyewear 240, for example.
- the detachable eyewear 240 may comprise a detachable headset 242.
- the detachable headset 242 may be flexible and/or deflectable.
- the detachable headset 242 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the user 102 may be enabled to exhale into open space and onto the MEMS sensing and processing module 104.
- the MEMS sensing and processing module 104 may be operable to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be operable to generate one or more control signals to control a user interface of the device being controlled such as a user interface 107a of the multimedia device 106a, a user interface 107b of the cellphone/smartphone/dataphone 106b, a user interface 107c of the personal computer (PC), laptop or a notebook computer 106c, a user interface 107d of the display device 106d, and/or the user interface of a multimedia player, such as an audio and/or video player.
- FIG. 2H is a diagram illustrating an exemplary MEMS sensing and processing module located on a neckset, in accordance with an embodiment of the invention.
- a detachable neckset 250 may comprise a flexible printed circuit board (PCB) 254 and processing and/or communication circuitry 252.
- the flexible PCB 254 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the processing and/or communication circuitry 252 may comprise a battery, a voltage regulator, one or more switches, one or more light emitting diodes (LEDs), a liquid crystal display (LCD), other passive devices such as resistors, capacitors, inductors, a communications chip capable of handling one or more wireless communication protocols such as Bluetooth and/or one or more wired interfaces.
- the processing and/or communication circuitry 252 may be packaged within a PCB. Notwithstanding, the invention may not be so limited and the processing and/or communication circuitry 252 may comprise other components and circuits without limiting the scope of the invention.
- the user 102 may be enabled to wear the neckset 250 around his/her neck and exhale into open space and the MEMS sensing and processing module 104 may be operable to sense or detect the exhalation.
- the exhalation may occur from the nostrils and/or the mouth of the user 102.
- the MEMS sensing and processing module 104 may be operable to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be operable to generate and communicate one or more control signals via the flexible PCB 254 to the processing and/or communication circuitry 252.
- the processing and/or communication circuitry 252 may be operable to process and communicate the generated one or more control signals to a device being controlled, such as a multimedia device 106a, a cellphone/smartphone/dataphone 106b, a personal computer (PC), laptop or a notebook computer 106c and/or a display device 106d.
- One or more processors within the device being controlled may be operable to utilize the communicated control signals to control a user interface of the device being controlled such as a user interface 107a of the multimedia device 106a, a user interface 107b of the cellphone/smartphone/dataphone 106b, a user interface 107c of the personal computer (PC), laptop or a notebook computer 106c and/or a user interface 107d of the display device 106d.
- FIG. 2I is a diagram illustrating an exemplary MEMS sensing and processing module located on a clip, in accordance with an embodiment of the invention.
- the clip 272 may be placed on any suitable piece of clothing, for example, on a collar of a shirt, a lapel of a coat or a pocket.
- the clip 272 may comprise a flexible support structure 274, for example.
- Although a clip 272 is illustrated, other suitable attachment structures may be utilized to affix the support structure 274.
- the support structure 274 may comprise the MEMS sensing and processing module 104, the latter of which may be located on one end of or anywhere on the support structure 274, for example. In other exemplary embodiments of the invention, the support structure 274 may not be utilized and the MEMS sensing and processing module 104 may be attached to the clip 272 or other suitable attachment structure.
- the MEMS sensing and processing module 104 may be operable to detect movement caused by the expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be operable to generate one or more control signals.
- the MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals.
- the generated one or more control signals may be enabled to control a user interface 107b of the cellphone/smartphone/dataphone 106b.
- FIG. 2J is a diagram illustrating an exemplary MEMS sensing and processing module embedded in a fabric, in accordance with an embodiment of the invention.
- the fabric 276 may be any suitable piece of clothing, for example, a collar of a shirt, a lapel of a coat or a pocket.
- the fabric 276 may comprise an embedded MEMS sensing and processing module 104, the latter of which may be located within or anywhere on the fabric 276, for example.
- the invention may not be so limited and the MEMS sensing and processing module 104 may be placed at other locations on the outerwear or innerwear of the user 102 without limiting the scope of the invention.
- the MEMS sensing and processing module 104 may be fabricated to be flexible so that the MEMS sensing and processing module 104 may be placed or interwoven in the fabric 276.
- the MEMS sensing and processing module 104 may be operable to detect movement caused by the expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be operable to generate one or more control signals.
- the MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals.
- the generated one or more control signals may be enabled to control a user interface 107b of the cellphone/smartphone/dataphone 106b.
- FIG. 3A is a diagram illustrating an exemplary electronic device that may be controlled via a sectional user interface, in accordance with an embodiment of the invention.
- an electronic device 302 comprising a touchscreen display 304.
- the electronic device 302 may comprise a non-touchscreen display and one or more input devices such as a trackball, one or more multi-function buttons, and/or a keyboard, without deviating from the scope of the present invention.
- the electronic device 302 may comprise a user interface, such as a graphical user interface (GUI), which may enable a user to navigate through and launch the various applications and/or functions on the electronic device 302.
- the user interface may enable interacting with the electronic device via respiratory inputs such as exhalations, tactual inputs such as button presses, audio actions such as voice commands, and/or movements of the electronic device 302 such as those detected by an accelerometer and/or gyroscope.
- the user interface may enable interacting with the electronic device 302 via any combination of one or more of the input methods.
- the user interface may be operable to detect an error and/or failure of one or more input methods and default to one or more other input methods. In this manner, interacting with the user interface may not be critically impacted by the failure and/or absence of a particular input method.
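The fallback behavior described above, where the interface defaults to other input methods when one fails, can be sketched as follows. The method names, ordering, and the use of an I/O error to signal failure are illustrative assumptions, not taken from the patent.

```python
# Sketch of input-method fallback: try each input method in order and
# skip any that errors out, so a failed sensor does not block the user
# interface. Method names and failure model are assumptions.

def read_user_input(methods):
    """Try each (name, reader) pair in order; skip methods that fail.

    Returns (method_name, value) for the first method that succeeds,
    or (None, None) if every method fails.
    """
    for name, reader in methods:
        try:
            return name, reader()
        except IOError:
            continue  # sensor error/failure: fall back to the next method
    return None, None
```

For example, if the respiratory sensor raises an error, a tactual input method further down the list would be used instead.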
- the user interface of the electronic device 302 may display information about the status and/or capabilities of the electronic device 302 and/or display information and/or content generated by one or more applications on the electronic device 302.
- a homescreen of the user interface may be displayed or presented.
- the electronic device 302 may comprise one or more of a cellular telephone, a Smartphone, a wireless telephone, a notebook computer, a personal media player, a personal digital assistant, a multimedia device, a handheld device and/or a multi-function mobile device.
- the user interface may be sectioned into one or more of a fixed region 310 comprising one or more fixed zones 311, a control region 314 comprising one or more control zones 315, and a content region 318 comprising one or more content zones 319.
- each of the regions 310, 314, and 318 when present, may be of any size and/or shape and may be in any location(s) of the display 304.
- the presence, size, shape, and location(s) of the regions 310, 314, and 318 may be configured (i.e., personalized or customized) by a user of the electronic device 302.
- the electronic device 302 may comprise a user interface customization application which a user may run to configure the regions of the user interface based on preferences such as whether the user is right handed or left handed.
- exemplary configurations 306a, 306b, and 306c of the user interface are illustrated in FIG. 3B.
- the fixed region 310 may display information independent of a state of and/or activity in the control region 314. Exemplary information that may be displayed in the fixed region 310 may comprise the day, the time, weather, appointments in a calendar, RSS (or XML, or other markup language) feeds, recent email contacts, and/or recent phone contacts. However, the preceding are just examples of information that may be displayed in the fixed region 310 and the invention may not be so limited. Additionally, the size, shape and/or location of the fixed region 310 may change depending on what functions and/or applications are running on the electronic device 302. Furthermore, the type and/or amount of information displayed in the fixed region 310 may be customized by a user of the electronic device 302. In this regard, FIG. 3C illustrates some exemplary fixed regions 310a, 310b, and 310c.
- the control region 314 may enable controlling the electronic device 302 such that desired information may be displayed and/or desired applications and/or functions may be launched on the electronic device 302.
- respiratory and/or tactual input may be utilized to scroll, select, manipulate, or otherwise affect objects, such as text, images, links, and/or icons, of the user interface.
- additional details of interacting with objects of the user interface utilizing respiratory and tactual input are described below with respect to FIG. 3E.
- the type and/or amount of information displayed in the control region 314 may be customized by a user of the electronic device 302.
- the size, shape and/or location of the control region 314 may change depending on what functions and/or applications are running on the electronic device 302.
- the content region 318 may display information that may depend on a state of and/or activity in the control region 314.
- the information in the content region 318 may depend on an active icon in the control region.
- an active icon may be an icon which has been navigated to (via breath and/or tactual input) but has not been selected via a "click" (e.g., a tap on a touch screen, a button press or a puff of air).
- the active icon may be link to a website and the content region 318 may display RSS feeds from that website.
- the active icon may be a shortcut to launch an email client and the content region 318 may display one or more recent email messages.
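The active-icon behavior described above, an icon that has been navigated to but not yet selected, with the content region tracking it, can be sketched as follows. The class name, icon identifiers, and the content-provider callback are hypothetical.

```python
class ActiveIconTracker:
    """Tracks which icon is 'active' (navigated to) versus 'selected'
    (clicked), mirroring the distinction drawn in the description."""
    def __init__(self, icons, content_for):
        self.icons = icons              # ordered list of icon identifiers
        self.index = 0                  # position of the active icon
        self.content_for = content_for  # callback: icon id -> content items

    @property
    def active(self):
        return self.icons[self.index]

    def navigate(self, steps):
        """Breath or tactual navigation moves the active icon without
        selecting it; the content region updates immediately."""
        self.index = (self.index + steps) % len(self.icons)
        return self.content_for(self.active)

    def click(self):
        """A tap, button press, or puff of air selects the active icon."""
        return f"launch:{self.active}"
```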
- exemplary information displayed in the content region 318 may comprise RSS or XML feeds, images, a calendar, recent calls, recent texts, and/or recent emails.
- the preceding are just examples and the invention is not so limited.
- the information displayed in the content region 318 may be customizable by a user of the electronic device 302.
- the size, shape and/or location of the content region 318 may change depending on what functions and/or applications are running on the electronic device 302.
- FIG. 3D illustrates a few exemplary content regions 318a, 318b and 318c.
- the display 304 may be a touchscreen and the control region 314 may be responsive to a range of tactual inputs, as opposed to the fixed region 310 and/or the content region 318 which may have limited response to tactual inputs.
- the control region 314 may be responsive to tactual movements, a number of touches, and/or duration of touches while the fixed region 310 and the content region 318 may be responsive to multiple touches (e.g., a double tap).
- limiting the amount of the display 304 that may be allocated to the control region 314 may reduce the amount of area that a user needs to be able to reach in order to navigate and select icons, thus facilitating single-handed operation of the electronic device 302.
- limiting the tactual responsiveness of the fixed region 310 and the content region 318 may reduce inadvertent actions and/or selections (i.e., inadvertent "clicks").
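The region-dependent tactual responsiveness described above might be modeled as a simple dispatch rule: the control region accepts the full range of tactual input, while the fixed and content regions react only to a double tap. The event names below are assumptions.

```python
def route_tactual_event(region_name, event):
    """Hypothetical dispatch of a tactual event based on the region it lands
    in. Returning None means the event is ignored, which is how inadvertent
    'clicks' in the fixed and content regions would be suppressed."""
    if region_name == "control":
        return event            # movements, taps, holds all pass through
    if event == "double_tap":
        return event            # fixed/content respond only to double taps
    return None                 # single taps, drags, etc. are ignored
```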
- Information in the fixed region 310 and/or the content region 318 may be displayed in the form of one or more objects, such as images, text, links and/or icons.
- objects in the fixed region 310 and/or the content region 318 may be selectable via tactual and/or respiratory input.
- the response of the fixed region 310 and/or the content region 318 may be limited, as described above, to prevent inadvertent clicks.
- objects in the content region 318 may be scrolled into the control region 314 such that they may become selectable.
- respiratory input may be utilized to scroll objects from the content region 318 into the control region 314 such that the object may be selected via tactual input to the control region 314.
- the sectional user interface of the electronic device 302 may be described as a universal content access manager (UCAM) which may provide advantages over traditional graphical user interfaces.
- One advantage may be that the configurability (i.e. customization or personalization) of the UCAM may greatly increase the utility and/or ease of use of the electronic device 302 over a similar device having a conventional graphical user interface.
- objects in each section may be sequenced, juxtaposed, superimposed, overlaid, or otherwise positioned and/or organized such that a user may quickly access desired information, applications, and/or functions.
- Another advantage may be that the ability to section the UCAM into one or more regions may greatly increase the utility and/or ease of use of the electronic device 302 over a similar device having a conventional graphical user interface.
- portions of each region may be configured to be responsive or non-responsive to a variety of input types and may be configured to be active (e.g., updated in real-time) or passive (e.g., statically displayed until changed by a user) in terms of information and/or objects displayed therein.
- Another advantage of the UCAM may be its compatibility with a variety of platforms. In this regard, a user may load the UCAM onto a plurality of his/her electronic devices such that the user may interact with all of the user's electronic devices in the same manner.
- FIG. 3B is a diagram illustrating several exemplary configurations of a sectional user interface, in accordance with an embodiment of the invention.
- exemplary user interface configurations 306a, 306b, and 306c each having a fixed region 310 comprising one or more fixed zones 311 , a control region 314 comprising one or more control zones 315, and a content region 318 comprising one or more content zones 319.
- the size, shape, and/or location of the fixed region 310, the control region 314, and the content region 318 may be configured based on user preferences and/or based on a function and/or application running on the electronic device 302.
- FIG. 3C is a diagram illustrating several exemplary fixed regions of a sectional user interface, in accordance with an embodiment of the invention.
- fixed regions 310a, 310b, and 310c may comprise one or more objects 312.
- the portion of the fixed region 310 allocated to each object 312 may be configured to be of any shape and/or size.
- Exemplary objects 312 may comprise text, images, links and/or icons which may correspond to the date, the time, weather information, appointments in a calendar, RSS or XML feeds, recent email contacts, and/or recent phone contacts.
- FIG. 3D is a diagram illustrating several exemplary content regions of a sectional user interface, in accordance with an embodiment of the invention.
- a user may configure attributes of the content region 318 such as the number of objects displayed, the size of the objects displayed, and the order of objects displayed.
- the content region 318 may be customized to have different attributes for each icon, each group of icons, and/or each user.
- exemplary content regions 318a, 318b, and 318c are depicted.
- the content region 318a may correspond to an active icon which may, for example, be a folder or website comprising digital photographs. Consequently, the objects 320_1, ..., 320_4 may correspond to the last four pictures uploaded to the folder or website.
- the content region 318b may correspond to an active icon which may, for example, be a link to a social networking website. Consequently, the objects 322_1, ..., 322_N may correspond to the last 'N' events which occurred on one or more profiles on the social networking site.
- alternatively, the content region 318b may correspond to an active icon which may, for example, launch an email client, and the objects 322_1, ..., 322_N may correspond to the last 'N' emails sent or received.
- the content region 318c may correspond to an active icon which may, for example, be a shortcut to launch a web browser. Consequently, the objects 324_1 and 324_2 may be links to favorite and/or recently visited web pages.
- the MEMS sensing and processing module 104 may be operable to modify interaction with a user interface of the device being controlled 106 by activating one or more portions of the control region 314. For example, a user 102 may be enabled to control a speed of background auto-scrolling to scroll through options or menus in a mobile game on the device being controlled 106.
- FIG. 3E illustrates interacting with a sectional user interface of an electronic device via respiratory and tactual input, in accordance with an embodiment of the invention.
- a control region 314a which may comprise an active icon area 328.
- the control region 314a depicted in FIG. 3E is an exemplary configuration of the control region 314 and the invention is not limited to the depicted embodiment.
- icons may be represented in a variety of ways and may comprise visual information such as images and/or text and/or may comprise audio information such as tones, songs, and/or speech.
- the active icon area 328 may determine, at least in part, the information displayed in a content region 318 as well as how an electronic device 302 may respond to a tactual and/or respiratory input.
- a content region 318 as described with respect to FIGS. 3A and 3D, may display information corresponding to the icon that is in the active icon area 328.
- upon a "click" (e.g., a touchscreen tap, a button press, or a puff of air), an application or function associated with the icon in the active icon area 328 may be launched.
- icons may be grouped categorically and each category may comprise one or more icons.
- the number of categories and/or the number of icons in each category may be configured by a user.
- Exemplary categories may comprise phone and messaging, news, multimedia, music, photos, and videos. Additionally, information and/or objects displayed in the fixed region 310 may be determined based on which category is active.
- each icon may comprise descriptive text, image(s) and/or audio, configurable by a user, to indicate which functions and/or applications may be associated with the icon.
- a background image, configurable by a user, of the display 304 may be associated with each category and may indicate which category is currently selected.
- a user may scroll between categories utilizing tactual and/or respiratory input and scroll between icons utilizing respiratory and/or tactual input.
- the speed, direction, and/or duration of a scroll may be determined based on the type, duration, intensity, direction, and/or number of tactual and/or respiratory inputs.
- a user may scroll between categories utilizing tactual input and scroll between icons utilizing respiratory input. For example, a user may scroll through the categories by shifting the position of his thumb in the control region 314 or by rolling a trackball; and the user may scroll through the icons in the active category 326 by exhaling. A user may scroll through categories until a background image displayed on the electronic device 302 corresponds to a desired category. A user may scroll through icons until the icon in the active icon area 328 corresponds to a desired function and/or application and/or results in desired information and/or objects in the content area 318.
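The division of labor just described, tactual input scrolling categories and respiratory input scrolling icons within the active category, can be sketched as a small dispatcher. The state shape and the category/icon counts are illustrative assumptions.

```python
def handle_input(state, source, delta):
    """Hypothetical navigation dispatch: tactual input (thumb shift or
    trackball roll) scrolls categories, respiratory input (exhalations)
    scrolls icons within the active category. `state` holds the current
    {'category': i, 'icon': j}; both sequences wrap around."""
    n_categories, n_icons = 5, 8          # illustrative sizes
    if source == "tactual":
        state["category"] = (state["category"] + delta) % n_categories
        state["icon"] = 0                 # restart at the category's first icon
    elif source == "breath":
        state["icon"] = (state["icon"] + delta) % n_icons
    return state
```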
- FIG. 3F illustrates an exemplary sectional user interface which may provide an indication of a sequence of categories and/or icons when scrolling.
- the icons and/or categories may scroll in a linear manner in which there are first (e.g., leftmost or top) and last (e.g., rightmost or bottom) icons and/or categories.
- icons and/or categories may scroll in a cyclical manner in which all icons and/or categories may be accessed by scrolling in either direction regardless of which icon and/or category is active at the beginning of the scroll. Notwithstanding the manner in which icons and/or categories scroll, it may be desirable to provide a user with an indication of next and/or previous icons and/or categories in a scrolling sequence.
- various aspects of the invention may enable displaying an indication of next and/or previous icons and/or categories in the fixed region 310, the control region 314, and/or the content region 318.
- the indication may enable a user to determine which direction to scroll icons and/or categories to reach a desired icon and/or category in a fastest and/or most efficient manner.
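Choosing the fastest direction in a cyclical scrolling sequence, as described above, reduces to picking the signed step count of smallest magnitude. A minimal sketch:

```python
def shortest_scroll(current, target, n):
    """Return the signed number of scroll steps of smallest magnitude that
    moves position `current` to `target` in a cyclic sequence of n items
    (positive = forward, negative = backward). Ties resolve to forward."""
    forward = (target - current) % n
    backward = forward - n          # the same move going the other way
    return forward if forward <= -backward else backward
```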
- a portion(s) of the fixed region 310, the control region 314 and/or the content region 318 may be overlaid by semi-transparent image(s) of the next icon(s) and/or category/categories in the scrolling sequence.
- the semi-transparency may be sufficiently opaque for a user to identify the next and/or previous icon(s) and/or category/categories and sufficiently transparent so that the information in the fixed region 310 and in the content region 318 may not be critically obstructed.
- icons and categories may be scrolled in a "pinwheel" or "slot machine" fashion.
- semi-transparent images of a two previous icon 330b, a one previous icon 330a, current icon 330, a one next icon 330c, and a two next icon 330d of a one previous category 340 may be overlaid on the user interface.
- semi-transparent images of a two previous icon 332b, a one previous icon 332a, current icon 332, a one next icon 332c, and a two next icon 332d of the active category 326 may be overlaid on the user interface.
- semi-transparent images of a two previous icon 334b, a one previous icon 334a, current icon 334, a one next icon 334c, and a two next icon 334d of a one next category 342 may be overlaid on the user interface.
- FIG. 3G illustrates interacting with an exemplary sectional user interface via respiratory and tactual input, in accordance with an embodiment of the invention. Referring to FIG. 3G there is shown a user interacting with an exemplary sectional user interface via tactual and respiratory input.
- the region 340 of the user interface may be a control region and may display elements which may be displaced by respiratory input and selected by a thumb tap in the region 340.
- the arrow 344 in FIG. 3G is utilized to illustrate that categories of icons may be scrolled via thumb shifting (i.e., a slight drag of the thumb) in the region 340.
- Information and/or objects displayed in the regions 342 may be superimposed transparencies that allow a user to see the previews of the next and previous icons.
- the information and/or objects displayed in the regions 342 may be fixed or may change and/or update. Some objects displayed in the regions 342 may be selectable via a thumb tap.
- the sectional user interface comprising the regions 340 and 342 may provide a disambiguated solution compared to conventional user interfaces.
- the sectional user interface may enable configurable (i.e. customized or personalized) and predictable control of an electronic device and multi-layered and/or multi-dimensional display of content.
- FIG. 3H illustrates another exemplary sectional user interface which may provide an indication of a sequence of categories and/or icons when scrolling.
- icons and categories may be scrolled in a "flipbook" fashion.
- semi-transparent images of a two previous icon 332b, a one previous icon 332a, current icon 332, a one next icon 332c, and a two next icon 332d of the active category 326 may be overlaid on the user interface.
- FIG. 4A illustrates launching an application via a user interface utilizing tactual and respiratory input, in accordance with an embodiment of the invention.
- exemplary screen shots 410a, 410b, 410c, and 410d depict an exemplary sequence of actions to navigate to a desired icon and launch an application associated with that icon.
- the sequence of actions may begin with the electronic device 302 in the state depicted by screenshot 410a.
- an icon 402 may be in the active icon area 328 of the control region 314a.
- the background image of diagonal stripes may correspond to the category to which the icon 402 may belong.
- the objects 402_1, ..., 402_4 in the content region 318a may correspond to the icon 402.
- a user may scroll through a sequence of categories via a tactual movement such as a thumb shift or a roll of a trackball.
- the user may seek a category associated with a background image of dots.
- the device may be in the state depicted in screenshot 410b.
- an icon 404 may be in the active icon area 328 of the control region 314b and the objects 404_1, ..., 404_N in the content region 318b may correspond to the icon 404.
- a user may scroll through the icons in the category with the background image of dots via a respiratory input such as one or more exhalations.
- the user may scroll through a sequence of icons until the device is in the state depicted in the screenshot 410c.
- an icon 406, corresponding to a desired function or application may be in the active icon area 328 of the control region 314c.
- the objects 406_1, ..., 406_N in the content region 318c may correspond to the icon 406.
- the user may have arrived at his desired icon, icon 406, and may launch the desired application and/or function by selecting the icon 406 via a tactual input such as a tap of a touchscreen or a button press.
- a web page may be associated with the icon 406 and, upon selecting the icon 406, a web browser may launch and the web page may be displayed full-screen as depicted in the screenshot 410d.
- FIG. 4B illustrates exemplary interaction with an application running on an electronic device, in accordance with an embodiment of the invention.
- aspects of the invention may enable zooming in (enlarging) and/or zooming out (shrinking) via a combination of respiratory and tactual inputs.
- a web browser running on the electronic device 302 may be displaying a full webpage 422 and a user may wish to zoom in on a portion 424 of the webpage. Accordingly, the user may utilize a tactual input to control a reference point(s) for the zoom and utilize a respiratory input to control the direction and/or amount of zoom.
- the user may touch the reference point on a touchscreen and may zoom in or out based on that reference point by exhaling.
- the direction and/or amount of zoom may be controlled by, for example, the intensity, duration, direction, and/or number of exhalations.
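A mapping from one exhalation to a new zoom level, with the touch point fixing the reference and the breath controlling direction and amount, could be sketched as below. The 1.5x-per-unit rate and the [0.25, 8.0] clamp are assumptions chosen only for illustration.

```python
def zoom_factor(current, direction, intensity, duration_s):
    """Hypothetical zoom control: `direction` ('in' or 'out') chooses the
    sense of the zoom, while intensity times duration sets the amount.
    The result is clamped to an assumed usable range."""
    amount = 1.5 ** (intensity * duration_s)
    new = current * amount if direction == "in" else current / amount
    return max(0.25, min(8.0, new))
```

The tactual reference point would be applied separately, by keeping the touched pixel stationary while the page scales around it.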
- FIG. 4C illustrates interaction with an application running on an electronic device, in accordance with an embodiment of the invention.
- aspects of the invention may enable scrolling via a combination of respiratory and tactual inputs.
- Exemplary applications may comprise a web browser, a media player, a still camera, a video camera, and a file system browser.
- a web browser may be displaying a portion of a webpage 424 and a user may wish to scroll to another portion 428 of the webpage. Accordingly, the user may utilize a respiratory input to perform a coarse scroll and utilize a tactual input to perform a fine scroll.
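The coarse/fine split, breath for large jumps and touch for pixel-level adjustment, can be sketched as a single scroll handler. The 200-pixel coarse step is an assumed constant.

```python
def scroll(position, source, magnitude):
    """Hypothetical scroll handler: a respiratory input performs a coarse
    jump (an assumed 200 px step scaled by breath intensity), while a
    tactual drag performs a fine, pixel-for-pixel adjustment. Position is
    clamped at the top of the page."""
    step = 200 * magnitude if source == "breath" else magnitude
    return max(0, position + int(step))
```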
- FIG. 5 is a block diagram of an exemplary user interface interacting with a MEMS sensing and processing module and a host system, in accordance with an embodiment of the invention.
- the device being controlled 106 may comprise a communication module 502, a user interface 504, a host interface 506, a plurality of drivers and/or libraries 516, 518, 520 and 522 and a plurality of applets 508, 510, 512 and 514.
- the user interface 504 may be a graphical user interface (GUI), for example.
- the communication module 502 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to receive one or more signals from the MEMS sensing and processing module 104, which may be operable to function as a driver and/or an interface such as a human interface device (HID).
- the received signal may be passed to a driver, such as a custom expulsion of air driver or an air detection driver, for processing on the device being controlled 106.
- the received signal may be processed in the device being controlled 106 using the driver.
- the one or more signals may be generated in response to detection of movement of air caused by the expulsion of human breath by user 102.
- the communication module 502 may be operable to receive one or more signals from the MEMS sensing and processing module 104 via a wired and/or a wireless signal.
- the communication module 502 may support a plurality of drivers, interfaces and/or HID profiles.
- the communication module 502 may support an external memory interface, a universal asynchronous receiver transmitter (UART) interface, an enhanced serial peripheral interface (eSPI), a general purpose input/output (GPIO) interface, a pulse-code modulation (PCM) and/or an inter-IC sound (I²S) interface, an inter-integrated circuit (I²C) bus interface, a universal serial bus (USB) interface and/or HID profile, a Bluetooth interface and/or HID profile, a ZigBee interface and/or HID profile, an IrDA interface and/or HID profile, and/or a wireless USB (W-USB) interface and/or a HID profile.
- the user 102 may be enabled to interface with the GUI 504 of the device being controlled 106 via the one or more received signals.
- the received one or more signals may be compliant with one or more drivers, a universal serial bus (USB) HID class and/or a wireless protocol HID class, such as wireless USB HID class and/or a ZigBee HID class, for example.
- the invention may not be so limited and one or more drivers and/or other wireless protocol HID classes may be utilized without limiting the scope of the invention.
- Bluetooth utilizes the USB HID class.
- the received signal may be passed to a driver such as a custom expulsion of air driver or an air detection driver for processing on the device being controlled 106.
- the received signal may be processed in the device being controlled 106 using the driver.
- the communication module 502 may be operable to format the received one or more signals into a HID profile.
- the HID profile may comprise one or more drivers and/or libraries 516-522 that may enable interfacing with the GUI 504 of the device being controlled 106.
- the one or more drivers and/or libraries 516-522 may enable one or more of initiation, establishment and/or termination of communication by the device being controlled 106 with the MEMS sensing and processing module 104.
- the HID profile may define protocols, procedures, and/or usage scenarios for using the HID, such as the MEMS sensing and processing module 104 over a wired and/or wireless link, such as Bluetooth.
- the device being controlled 106 may host a wireless protocol stack, such as the Bluetooth stack which may use the Service Discovery Protocol (SDP) to discover HIDs, such as the MEMS sensing and processing module 104.
- the user interface of one or more of the devices being controlled 106 may also be activated and modified based on button or key-activated function modes.
- the function modes may comprise portions of firmware embedded in the MEMS sensing and processing module 104 and one or more applications, drivers, and/or libraries installed on the device being controlled 106.
- the function modes may be activated from the MEMS sensing and processing module 104, and/or via one or more stimuli on the device being controlled 106.
- the one or more stimuli may comprise a puff of air, touch, audio, visual, gestures, and/or other stimuli.
- the device being controlled 106 may be operable to receive device information, such as descriptors for the class drivers and/or libraries 516-522, from the HID, such as the MEMS sensing and processing module 104, before the HID is activated.
- the drivers and/or libraries 516-522 may be operable to utilize the descriptors to determine device characteristics in order to enable controls on the device being controlled 106.
- the library, variable # 1 516 may be operable to detect the direction of expulsion of human breath onto the HID, such as the MEMS sensing and processing module 104 and accordingly convert the received signal into a directional signal that controls one or more components of the user interface 504.
- the library, momentum # 1 518 may be operable to detect a puff of air exhaled by the user 102, and accordingly utilize the corresponding received signal from the MEMS sensing and processing module 104 to scroll through one or more menus of the user interface 504 and slow down after a particular period of time.
- the library, momentum # 1 518 may be operable to detect a repeated number of puffs of human breath within a certain time, or a combination of directional puffs of human breath within a certain time, or a fast left-right-left sequence of a puff of human breath exhaled by the user 102, and generate a control signal to activate and/or switch through a user interface of the device being controlled 106. For example, by blowing a puff of air on the MEMS sensing and processing module 104, a direction and speed of scrolling may be determined based on the flow of air across the surface of the MEMS sensing and processing module 104 from left-bottom to right-top.
- the MEMS sensing and processing module 104 may generate a control signal that may result in a corresponding two-axis scrolling of a user interface, and the speed of scrolling may be determined based on the duration of sensing the flow of air or the intensity of pressure of the flow of air on the MEMS sensing and processing module 104.
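The two-axis scrolling described above, with airflow direction setting the scroll direction and duration or pressure setting the speed, could be sketched as follows. The speed = pressure x duration scaling and the angle convention are assumptions.

```python
import math

def scroll_vector(angle_deg, pressure, duration_s):
    """Sketch of mapping one sensed airflow event to a two-axis scroll:
    the direction of flow across the sensor surface (as an angle) sets the
    scroll direction, while pressure and duration set the magnitude."""
    speed = pressure * duration_s
    dx = speed * math.cos(math.radians(angle_deg))
    dy = speed * math.sin(math.radians(angle_deg))
    return round(dx, 3), round(dy, 3)
```

A left-bottom to right-top flow would land at roughly 45 degrees, producing equal horizontal and vertical scroll components.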
- one or more fixed puffs of air at the MEMS sensing and processing module 104 within a certain period of time after scrolling interaction may be processed as a zoom function mode that may enable zooming in into an area visible as a result of scrolling.
- the user 102 may be enabled to end the zoom function mode by puffing air again at the MEMS sensing and processing module 104 and returning back to the scroll function mode.
- the library, Boolean # 1 520 may be operable to utilize the received signal from the MEMS sensing and processing module 104 to select one or more menus and/or icons within the user interface 504.
- the library, Boolean # 2 522 may also be operable to utilize the received signal from the MEMS sensing and processing module 104 to select one or more menus and/or icons within the user interface 504.
- the library, Boolean # 2 522 may also be operable to determine a function mode based on a received sequence of puffs of human breath within a particular period of time. For example, a number of puffs of human breath received within a certain period of time may switch a function mode from a scrolling function mode to a magnifying function mode within the user interface 504. Notwithstanding, the invention may not be so limited and other driver and/or libraries may be utilized without limiting the scope of the invention.
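The mode-switching rule just described, a number of puffs received within a certain period triggering a change of function mode, can be sketched as a sliding-window detector. The window length and required puff count are assumed parameters.

```python
def detect_mode_switch(puff_times, window_s=1.0, required=3):
    """Hypothetical detector: return True if `required` puffs fall inside
    any sliding window of `window_s` seconds, which would switch the
    function mode (e.g. from scrolling to magnifying). `puff_times` are
    timestamps in seconds, sorted ascending."""
    for i in range(len(puff_times) - required + 1):
        if puff_times[i + required - 1] - puff_times[i] <= window_s:
            return True
    return False
```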
- the device being controlled 106 may be operable to interface with the detection device, such as the MEMS sensing and processing module 104 utilizing one or more applets 508-514.
- the applets 508-514 may comprise software components, code and/or programs that may be operable to run in context of another program, such as a web browser, for example.
- the applet, UI skin # 1 508 may comprise a software component, code and/or program that may function as a pinwheel, where a plurality of icons may cycle through the background of the user interface 504.
- the user 102 may be prompted to select one or more icons from the background of the user interface 504 of the device being controlled 106.
- the applet, UI skin # 2 510 may comprise a software component, code and/or program that may enable dissolving one or more icons on the user interface 504 into dust, for example, when a user 102 blows air at the icons being displayed on the GUI 504.
- one of the applets may comprise a software component, code and/or program that may switch between one or more components of the user interface 504 upon activation, for example.
- one of the applets may comprise a software component, code and/or program that may function as a 3-D flipbook, where a user 102 may be enabled to blow air at a book on the GUI 504 to turn one or more pages within the book.
- the applet, Faves # 1 512 may comprise a software component, code and/or program that may enable morphing two or more pictures of users or friends on the GUI 504 into a single picture, when a user 102 blows air onto the two or more pictures of users or friends on the GUI 504.
- the applet, Scroll Function 514 may comprise a software component, code and/or program that may enable scrolling through a plurality of menus, pages and/or icons on the GUI 504.
- the GUI 504 of the device being controlled 106 may be operable to interface with the MEMS sensing and processing module 104 based on one or more outputs generated by the applets 508-514.
- the host computer interface (HCI) 506 may comprise an interface to a display, other hardware and/or processors within the device being controlled 106 for controller management, link establishment, and/or maintenance, for example.
- a HCI transport layer may be operable to deliver HCI commands to the other hardware within the device being controlled 106.
- the MEMS sensing and processing module 104 may be utilized to enable a plurality of function modes on the device being controlled 106.
- the MEMS sensing and processing module 104 may be operable to enable a scroll function mode to enable scrolling through a document along a multi-dimensional axis; a zoom function mode to enable zooming in and zooming out of a document or directional magnification; a cursor displacement function mode to displace a cursor in a body of text; a point and click function mode, where a click may comprise puffing air one or more times on the MEMS sensing and processing module 104; and a drag and drop function mode to enable dragging an item on the user interface 128 by pausing a pointer on the item, dragging the item, and then dropping the item by puffing air again on the MEMS sensing and processing module 104.
- the MEMS sensing and processing module 104 may be operable to move a character and/or, avatar in games and other multimedia applications, in one or more dimensions, displace a background by controlling an auto-scroll speed or displace other superimposed elements in games and other applications, scroll and select through a list of options with no interruption of gameplay, swap weapons while shooting in games, and/or add customizable inputs to existing controllers in games based on mapping specific controls to be operated by breath to enable more simultaneous inputs.
- the MEMS sensing and processing module 104 may be operable to enable multimodal input with a keyboard, a mouse, or any other input device, enable multimodal input with a touchscreen, enable multimodal input with voice and/or speech for GUI-based interaction, such as, motion of virtual elements, and/or enable multimodal input with gesture and/or motion tracking.
- the MEMS sensing and processing module 104 may be operable to enable control of non-GUI variable functions, such as setting an audio level or volume, skipping tracks, forward or backward in an audio-video device, and/or skipping voicemail messages in a phone, browse through icons, applications, windows, or widgets while entering data, and/or interact with a navigation system, or in-vehicle dashboard and entertainment system with a user's hands on the steering wheel as illustrated in FIG. 2F.
- the MEMS sensing and processing module 104 may be operable to enable mimicking real-life interactions such as blowing fire, a candle, a pinwheel, soap bubbles, or a waft of dust in virtual reality environments and games, modify audio and/or video parameters while playing music, such as filters, pitch, or source.
- the MEMS sensing and processing module 104 may be operable to enable hands-free operation of wearable equipment in enterprise, law enforcement, homeland security, medical emergency, military operations by enabling function modes, such as scroll or zoom content in a head mounted display as disclosed in FIG. 2D, remotely interact with a large display or a video projector, and/or control a toy or electronic device by adjusting the direction of motion, and/or speed, set parameters such as line width in graphics design editing applications, while drawing or providing other input, for example.
- the invention may not be so limited and the MEMS sensing and processing module may be utilized in other applications without limiting the scope of the invention.
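As a rough illustration of the non-GUI variable functions described above, the sketch below (Python, with hypothetical event names and state not taken from the disclosure) maps detected breath gestures to volume and track-skip actions:

```python
# Hypothetical sketch: dispatching detected breath events to non-GUI
# variable functions such as volume control and track skipping.
# Event names ("puff_up" etc.) are illustrative, not part of the disclosure.

def make_dispatcher():
    state = {"volume": 5, "track": 0}

    def on_event(event):
        # Each breath gesture is mapped to one device function.
        if event == "puff_up":
            state["volume"] = min(10, state["volume"] + 1)
        elif event == "puff_down":
            state["volume"] = max(0, state["volume"] - 1)
        elif event == "puff_forward":
            state["track"] += 1          # skip forward in an audio device
        elif event == "puff_back":
            state["track"] = max(0, state["track"] - 1)
        return dict(state)

    return on_event

dispatch = make_dispatcher()
for e in ["puff_up", "puff_up", "puff_forward"]:
    result = dispatch(e)
# result == {"volume": 7, "track": 1}
```

The point of the sketch is only that each detected gesture resolves to a small, stateful device function rather than a GUI pointer movement.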
- the human or user 102 interfacing with the GUI 504 may be agnostic to any particular operating system (OS) platform on the device being controlled 106.
- the device being controlled 106 may run any one or more of a Windows OS, Symbian OS, Android OS, Palm OS, or other operating systems found on mobile phones such as the iPhone or a BlackBerry phone.
- FIG. 6 is a flowchart illustrating exemplary steps for processing signals that control a device using human breath. Referring to FIG. 6, exemplary steps may begin at step 602.
- one or more signals may be received from a detection device, operable to function as a human interface device (HID) such as the MEMS sensing and processing module 104.
- the detection device may comprise a micro-electro-mechanical system (MEMS) detector that may be embedded in a device to be controlled.
- the one or more signals may be generated in response to detection of movement of air caused by the expulsion of human breath.
- the device being controlled 106 may be operable to format the received one or more signals into a HID profile.
- the HID profile may comprise one or more drivers and/or libraries 516-522 that may enable interfacing with the GUI 504 of the device being controlled 106.
- the one or more drivers and/or libraries 516-522 may enable one or more of initiation, establishment and/or termination of communication by the device being controlled 106 with the MEMS sensing and processing module 104.
- one or more applets 508-514 within the device being controlled 106 may be operable to interface with the detection device, such as the MEMS sensing and processing module 104.
- the user 102 may be enabled to interface with a graphical user interface (GUI) 128 of the device being controlled 106 via the one or more received signals utilizing one or more applets 508-514. Control then passes to end step 514.
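The FIG. 6 flow above — receiving detector signals, formatting them into an HID-style profile, and letting an applet drive the GUI — can be sketched as follows; the report fields and threshold are illustrative assumptions of this sketch, not the disclosed HID profile:

```python
# Hypothetical sketch of the FIG. 6 flow: raw detector samples are
# received, formatted into a simplified HID-style report, and handed
# to an applet that updates the GUI. Field names are assumptions.

def format_hid_report(samples):
    # Collapse raw pressure samples into a minimal HID-like report.
    peak = max(samples)
    return {"report_id": 1, "peak": peak, "active": peak > 0.2}

def applet_handle(report, gui_state):
    # An applet consumes the report and drives the GUI (here, a scroll offset).
    if report["active"]:
        gui_state["scroll"] += 1
    return gui_state

gui = {"scroll": 0}
report = format_hid_report([0.05, 0.31, 0.12])
gui = applet_handle(report, gui)
# gui == {"scroll": 1}
```

Separating the formatting step from the applet mirrors the description: drivers/libraries produce the HID profile, while applets interface it to the GUI.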
- FIG. 7 A is a flow chart illustrating exemplary steps for controlling a user interface of a device using human breath, in accordance with an embodiment of the invention.
- exemplary steps may begin at step 702.
- the sensing module 110 in the MEMS sensing and processing module 104 may be operable to detect movement or a change in composition, such as ambient air composition, caused, for example, by the expulsion of human breath by the user 102.
- the sensing module 110 may be operable to generate one or more electrical, optical and/or magnetic signals in response to the detection of movement caused by the expulsion of human breath.
- the processor firmware 116 may be operable to process the received electrical, magnetic and/or optical signals from the sensing module 110 utilizing various algorithms.
- the processor firmware 116 may also be operable to incorporate artificial intelligence (AI) algorithms to adapt to a particular user's 102 breathing pattern.
- the processor firmware 116 may be operable to generate one or more control signals to the device being controlled 106 based on processing the received electrical, optical and/or magnetic signals from the sensing module 110.
- the generated one or more control signals may be operable to control a user interface 128 of the device being controlled 106, such as a user interface 107a of the multimedia device 106a, a user interface 107b of the cellphone/smartphone/dataphone 106b, a user interface 107c of the personal computer (PC), laptop or a notebook computer 106c, a user interface 107d of the display device 106d, a user interface 107e of the TV/game console/other platform 106e, and a user interface of a mobile multimedia player and/or a remote controller. Control then passes to end step 714.
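One way the processing and adaptation steps of FIG. 7A could work is an adaptive baseline that tracks a user's resting breathing and emits a control signal when a deliberate exhalation exceeds it. This is a stand-in for the disclosed processing/AI algorithms; the EMA scheme and all thresholds are assumptions of this sketch:

```python
# Hypothetical sketch of the FIG. 7A processing step: an adaptive
# threshold (a stand-in for the disclosed AI algorithms) tracks the
# user's baseline and flags deliberate exhalations as control signals.

class BreathProcessor:
    def __init__(self, alpha=0.1, margin=0.15):
        self.baseline = 0.0   # running estimate of ambient/resting signal
        self.alpha = alpha    # adaptation rate
        self.margin = margin  # how far above baseline counts as a puff

    def process(self, sample):
        control = "puff" if sample > self.baseline + self.margin else None
        # Adapt the baseline toward the current sample (simple EMA),
        # so the detector follows a particular user's breathing pattern.
        self.baseline += self.alpha * (sample - self.baseline)
        return control

proc = BreathProcessor()
signals = [proc.process(s) for s in [0.02, 0.03, 0.5, 0.04]]
# signals == [None, None, "puff", None]
```

Because the baseline adapts per user, the same margin can serve users with different resting breathing levels — the kind of adaptation the description attributes to the firmware's AI algorithms.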
- FIG. 7B is a flow chart illustrating exemplary steps for side loading of information, in accordance with an embodiment of the invention.
- exemplary steps may begin at step 752.
- the device being controlled 106 such as the mobile phone 130a may be operable to receive data and/or media content from another device 108, such as the PC, laptop, or a notebook computer 132.
- the device being controlled 106 such as the mobile phone 130a may be operable to retrieve data and/or media content from a network, such as the Internet 134.
- the retrieved data and/or media content may comprise an RSS feed, a URL and/or multimedia content.
- In step 758, it may be determined whether the laptop, PC and/or notebook 132 may perform association and/or mapping of the received data and/or media content and the retrieved data and/or media content. If the association or mapping is performed on the laptop, PC and/or notebook 132, control passes to step 760.
- one or more processors within the laptop, PC and/or notebook 132 may be operable to associate and/or map the received and retrieved data and/or media content into icons or groups.
- the laptop, PC and/or notebook 132 may be operable to associate and/or map an icon to a function so that the user 102 may enable or activate a function via the icon.
- Exemplary icons may enable functions such as hyperlinks, bookmarks, shortcuts, widgets, RSS feeds and/or favorite buddies.
- the laptop, PC and/or notebook 132 may be operable to communicate the associated icons or groups to the device being controlled 106, such as the mobile phone 130a. Control then passes to step 766.
- If the association or mapping is not performed on the laptop, PC and/or notebook 132, control passes to step 764.
- one or more processors within the device being controlled 106 such as the mobile phone 130a may be operable to associate and/or map the received and retrieved data and/or media content into icons or groups.
- the mobile phone 130a may be operable to associate and/or map an icon to a function so that the user 102 may enable or activate a function via the icon.
- the device being controlled 106 such as the mobile phone 130a may be operable to customize the associated icons or groups so that content associated with the received data and/or media content may become an integral part of the user interface 131a of the device being controlled, such as the mobile phone 130a.
- the user interface 131a may be modified and/or organized by the user 102.
- the user 102 may choose, create, arrange and/or organize content to be utilized for the user interface 131a and/or one or more content components.
- the user 102 may organize the content components on a screen and may choose content such as personal photographs for background and/or icon images.
- the user 102 may create and/or modify the way content components are activated or presented to the user 102.
- the user 102 may make, import and/or edit icons and/or backgrounds for the user interface 128. Control then passes to end step 768.
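The association/mapping step of FIG. 7B can be sketched as a simple content-to-icon mapping, whether it runs on the laptop/PC or on the controlled device itself; the content kinds and action names below are hypothetical:

```python
# Hypothetical sketch of the FIG. 7B association step: received/retrieved
# content items are mapped into icon records, each bound to a function the
# user can activate from the customized user interface.

def associate(content_items):
    # Map each content kind to an activation function (names are assumed).
    kind_to_action = {
        "rss": "open_feed",
        "url": "open_browser",
        "media": "play",
    }
    icons = []
    for item in content_items:
        icons.append({
            "label": item["name"],
            "action": kind_to_action.get(item["kind"], "open"),
        })
    return icons

icons = associate([
    {"name": "News", "kind": "rss"},
    {"name": "Clip", "kind": "media"},
])
# icons[0]["action"] == "open_feed"
```

The resulting icon records are what would be communicated to (or built on) the controlled device and folded into its user interface 131a.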
- a method and system for controlling a user interface of a device using human breath may comprise a device 106 (FIG. 1A) comprising an embedded micro-electro-mechanical system (MEMS) sensing and processing module 104 (FIG. 1A).
- the MEMS sensing and processing module 104 may be operable to detect movement caused by the expulsion of human breath by the user 102.
- the MEMS sensing and processing module 104 may be operable to generate one or more control signals.
- the generated one or more control signals may be utilized to control a user interface 128 of the device 106, such as a multimedia device 106a, a cellphone/smartphone/dataphone 106b, a PC, laptop or a notebook computer 106c, a display device 106d, a TV/game console/other platform 106e, a mobile multimedia player and/or a remote controller.
- the detection of the movement caused by the expulsion of human breath may occur without use of a channel.
- the detection of the movement caused by expulsion of human breath may be responsive to the human breath being exhaled into open space and onto one or more detectors in the MEMS sensing and processing module 104.
- the MEMS sensing and processing module 104 may be operable to navigate within the user interface of one or more of the devices being controlled 106 via the generated one or more control signals.
- the MEMS sensing and processing module 104 may be operable to select one or more components within the user interface 128 of the devices being controlled 106 via the generated one or more control signals.
- the generated one or more control signals may be communicated to the device being controlled via one or more of an external memory interface, a universal asynchronous receiver transmitter (UART) interface, an enhanced serial peripheral interface (eSPI), a general purpose input/output (GPIO) interface, a pulse-code modulation (PCM) and/or an inter-IC sound (I²S) interface, an inter-integrated circuit (I²C) bus interface, a universal serial bus (USB) interface, a Bluetooth interface, a ZigBee interface, an IrDA interface, and/or a wireless USB (W-USB) interface.
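To illustrate communicating a generated control signal over one of the byte-oriented links listed above (e.g., a UART), the sketch below packs a command into a small frame. The frame layout (sync byte, command, value, checksum) and the command codes are assumptions of this example, not part of the disclosure:

```python
# Hypothetical sketch: packing a generated control signal into a small
# frame for a byte-oriented link such as a UART. Layout is illustrative.

import struct

SYNC = 0xA5
COMMANDS = {"scroll": 0x01, "zoom": 0x02, "click": 0x03}

def pack_control(command, value):
    cmd = COMMANDS[command]
    checksum = (SYNC + cmd + value) & 0xFF  # simple additive checksum
    return struct.pack("BBBB", SYNC, cmd, value, checksum)

def unpack_control(frame):
    sync, cmd, value, checksum = struct.unpack("BBBB", frame)
    if sync != SYNC or checksum != (sync + cmd + value) & 0xFF:
        raise ValueError("corrupt frame")
    name = {v: k for k, v in COMMANDS.items()}[cmd]
    return name, value

frame = pack_control("scroll", 5)
# unpack_control(frame) == ("scroll", 5)
```

The same 4-byte payload could equally travel over I²C, SPI, or a wireless transport; only the link layer changes.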
- the MEMS sensing and processing module 104 may be operable to enable one or more of initiation, establishment and/or termination of communication by the device 106.
- the MEMS sensing and processing module 104 may be operable to enable interaction within the user interface 128 of the device being controlled 106 based on one or more of the expulsion of human breath or expulsion of a fluid such as air, tactual inputs such as button presses, audio inputs such as voice commands, and/or movements of the device being controlled 106 such as those detected by an accelerometer and/or a gyroscope.
- the MEMS sensing and processing module 104 may be operable to generate control signals to control one or more analog and/or digital functions within the user interface 128 of one or more of the devices being controlled 106.
- the MEMS sensing and processing module 104 may be operable to omni-directionally detect puffs of air at ultra low pressure.
- the MEMS sensing and processing module 104 may be operable to allow intuitive function modes, such as scroll, pan, zoom, and/or click function modes, prevent unintentional selection of content, and/or minimize occlusion of content.
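Preventing unintentional selection might, for example, require a puff to be sustained above a pressure threshold for several consecutive samples before a click is issued; the threshold and run length below are illustrative assumptions:

```python
# Hypothetical sketch of guarding against unintentional selection: a puff
# must stay above a pressure threshold for a minimum number of consecutive
# samples before a click function mode is triggered.

def detect_click(samples, threshold=0.1, min_run=3):
    run = 0
    for s in samples:
        run = run + 1 if s > threshold else 0
        if run >= min_run:
            return True  # sustained puff: intentional selection
    return False         # brief spike: ignored

clicked = detect_click([0.0, 0.3, 0.3, 0.3, 0.0])
# clicked == True; a lone spike like [0.0, 0.3, 0.0] returns False
```

This kind of debouncing is one plausible way an omni-directional, ultra-low-pressure detector could separate deliberate clicks from stray air movement.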
- One or more of the plurality of devices such as a handheld device, for example, a multimedia device 106a and/or a cellphone/smartphone/dataphone 106b and/or a PC, laptop or a notebook computer 106c may be operable to receive one or more inputs defining the user interface 128 from another device 108.
- the other device 108 may be one or more of a PC, laptop or a notebook computer 106c and/or a handheld device, for example, a multimedia device 106a and/or a cellphone/smartphone/dataphone 106b.
- data may be transferred from the other device 108 to the cellphone/smartphone/dataphone 106b and this data may be associated or mapped to media content that may be remotely accessed by the cellphone/smartphone/dataphone 106b via a service provider such as a cellular or PCS service provider.
- the transferred data that is associated or mapped to media content may be utilized to customize the user interface of the cellphone/smartphone/dataphone 106b.
- media content associated with one or more received inputs may become an integral part of the user interface 128 of the device being controlled 106.
- the invention is not limited to the expulsion of breath. Accordingly, in various exemplary embodiments of the invention, the MEMS may be operable to detect the expulsion of any type of fluid such as air, and the source of the fluid may be an animal, a machine and/or a device.
- FIG. 1 may depict a non-transitory computer readable medium and/or storage medium, and/or a non-transitory machine readable medium and/or storage medium, having stored thereon, a machine code and/or a computer program having at least one code section executable by a machine and/or a computer, thereby causing the machine and/or computer to perform the steps as described herein for controlling a user interface of a device using human breath.
- aspects of the invention may be realized in hardware, software, firmware or a combination thereof.
- the invention may be realized in a centralized fashion in at least one computer system or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited.
- a typical combination of hardware, software and firmware may be a general-purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
- One embodiment of the invention may be implemented as a board level product, as a single chip, application specific integrated circuit (ASIC), or with varying levels integrated on a single chip with other portions of the system as separate components.
- the degree of integration of the system will primarily be determined by speed and cost considerations. Because of the sophisticated nature of modern processors, it is possible to utilize a commercially available processor, which may be implemented external to an ASIC implementation of the present system. Alternatively, if the processor is available as an ASIC core or logic block, then the commercially available processor may be implemented as part of an ASIC device with various functions implemented as firmware.
- the present invention may also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods.
- Computer program in the present context may mean, for example, any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
- other meanings of computer program within the understanding of those skilled in the art are also contemplated by the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Fluid Mechanics (AREA)
- Computer Hardware Design (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Health & Medical Sciences (AREA)
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US24137609P | 2009-09-11 | 2009-09-11 | |
| US24220109P | 2009-09-14 | 2009-09-14 | |
| US12/813,292 US20110010112A1 (en) | 1999-02-12 | 2010-06-10 | Method and System for Controlling a User Interface of a Device Using Human Breath |
| PCT/US2010/048646 WO2011032096A2 (en) | 2009-09-11 | 2010-09-13 | Method and system for controlling a user interface of a device using human breath |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| EP2475969A2 true EP2475969A2 (de) | 2012-07-18 |
| EP2475969A4 EP2475969A4 (de) | 2016-11-02 |
Family
ID=43733125
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP10816236.3A Ceased EP2475969A4 (de) | 2009-09-11 | 2010-09-13 | Verfahren und system zur steuerung einer benutzeroberfläche einer vorrichtung mithilfe des menschliches atems |
Country Status (5)
| Country | Link |
|---|---|
| EP (1) | EP2475969A4 (de) |
| JP (1) | JP2013542470A (de) |
| KR (1) | KR20130022401A (de) |
| CN (1) | CN102782459A (de) |
| WO (1) | WO2011032096A2 (de) |
Families Citing this family (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9164997B2 (en) * | 2012-01-19 | 2015-10-20 | Microsoft Technology Licensing, Llc | Recognizing cloud content |
| JP5870841B2 (ja) * | 2012-05-17 | 2016-03-01 | 株式会社デンソー | 車両用表示装置 |
| JP2014063344A (ja) * | 2012-09-21 | 2014-04-10 | Sharp Corp | 携帯端末装置、表示プログラムおよび記録媒体 |
| CN104000719B (zh) * | 2013-02-22 | 2016-08-17 | 陈青越 | 一种基于网络的气流感应远程控制情趣用具系统 |
| JP6488556B2 (ja) * | 2014-05-14 | 2019-03-27 | 凸版印刷株式会社 | 端末装置、表示制御方法及びプログラム |
| CN104536556B (zh) * | 2014-09-15 | 2021-01-15 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
| CN107481491B (zh) * | 2016-07-20 | 2019-09-20 | 宝沃汽车(中国)有限公司 | 电器的控制系统及移动终端 |
| KR102814684B1 (ko) | 2016-08-26 | 2025-05-29 | 삼성전자주식회사 | 외부 기기를 제어하는 휴대 기기 및 이의 오디오 신호 처리 방법 |
| CN106354504B (zh) * | 2016-08-29 | 2020-08-11 | 北京小米移动软件有限公司 | 消息显示方法及装置 |
| CN112153269B (zh) * | 2019-06-27 | 2022-04-29 | 京东方科技集团股份有限公司 | 应用于电子设备的图片显示方法、装置、介质与电子设备 |
| KR102266426B1 (ko) * | 2020-01-10 | 2021-06-16 | 이종민 | 바람을 이용한 스마트폰 제어장치 및 그 제어방법 |
| US20230045458A1 (en) * | 2020-01-31 | 2023-02-09 | Sony Group Corporation | Information processing apparatus and information processing method |
| CN111588955A (zh) * | 2020-05-27 | 2020-08-28 | 北京无线电测量研究所 | 一种呼吸机显控终端 |
| CN111625146B (zh) * | 2020-05-27 | 2023-08-04 | 北京无线电测量研究所 | 一种带有双触摸屏的电子产品 |
| CN111627371A (zh) * | 2020-05-27 | 2020-09-04 | 北京无线电测量研究所 | 一种带有双显示屏的电子产品 |
| US20240370098A1 (en) * | 2021-09-07 | 2024-11-07 | PI-A Creative Systems Ltd | Method for detecting user input to a breath input configured user interface |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6421617B2 (en) * | 1998-07-18 | 2002-07-16 | Interval Research Corporation | Interface including fluid flow measurement for use in determining an intention of, or an effect produced by, an animate object |
| US6213955B1 (en) * | 1998-10-08 | 2001-04-10 | Sleep Solutions, Inc. | Apparatus and method for breath monitoring |
| US6449496B1 (en) * | 1999-02-08 | 2002-09-10 | Qualcomm Incorporated | Voice recognition user interface for telephone handsets |
| US7739061B2 (en) * | 1999-02-12 | 2010-06-15 | Pierre Bonnat | Method and system for controlling a user interface of a device using human breath |
| WO2000048066A1 (fr) * | 1999-02-12 | 2000-08-17 | Pierre Bonnat | Procede et dispositif de commande d'un systeme electronique ou informatique au moyen d'un flux de fluide |
| US9116544B2 (en) * | 2008-03-26 | 2015-08-25 | Pierre Bonnat | Method and system for interfacing with an electronic device via respiratory and/or tactual input |
| US20110178613A9 (en) * | 2000-02-14 | 2011-07-21 | Pierre Bonnat | Method And System For Processing Signals For A MEMS Detector That Enables Control Of A Device Using Human Breath |
| JP2004177992A (ja) * | 2002-11-22 | 2004-06-24 | Panasonic Mobile Communications Co Ltd | 風圧センサ付き携帯端末及び風圧センサ付き携帯端末により実行可能なプログラム |
| US7580540B2 (en) * | 2004-12-29 | 2009-08-25 | Motorola, Inc. | Apparatus and method for receiving inputs from a user |
| US7587277B1 (en) * | 2005-11-21 | 2009-09-08 | Miltec Corporation | Inertial/magnetic measurement device |
2010
- 2010-09-13 EP EP10816236.3A patent/EP2475969A4/de not_active Ceased
- 2010-09-13 WO PCT/US2010/048646 patent/WO2011032096A2/en not_active Ceased
- 2010-09-13 CN CN2010800511238A patent/CN102782459A/zh active Pending
- 2010-09-13 JP JP2012528957A patent/JP2013542470A/ja active Pending
- 2010-09-13 KR KR1020127009299A patent/KR20130022401A/ko not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| EP2475969A4 (de) | 2016-11-02 |
| CN102782459A (zh) | 2012-11-14 |
| KR20130022401A (ko) | 2013-03-06 |
| WO2011032096A2 (en) | 2011-03-17 |
| JP2013542470A (ja) | 2013-11-21 |
| WO2011032096A3 (en) | 2014-03-20 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9753533B2 (en) | Method and system for controlling a user interface of a device using human breath | |
| WO2011032096A2 (en) | Method and system for controlling a user interface of a device using human breath | |
| US7739061B2 (en) | Method and system for controlling a user interface of a device using human breath | |
| US9116544B2 (en) | Method and system for interfacing with an electronic device via respiratory and/or tactual input | |
| US10134358B2 (en) | Head mounted display device and method for controlling the same | |
| US9250443B2 (en) | Head mounted display apparatus and contents display method | |
| JP6083072B2 (ja) | スマートエアマウス | |
| CN120266083A (zh) | 用于基于注意力与用户界面交互的方法 | |
| JP2025507385A (ja) | 三次元環境を表示しながらコンピュータシステムのシステム機能にアクセスするためのデバイス、方法、及びグラフィカルユーザインタフェース | |
| US9594435B2 (en) | Display apparatus and contents display method | |
| US12461638B2 (en) | Customized user interfaces | |
| US20190294236A1 (en) | Method and System for Processing Signals that Control a Device Using Human Breath | |
| EP2538308A2 (de) | Bewegungsbasierte Steuerung von einer gesteuerten Vorrichtung | |
| CN119790364A (zh) | 用于改善与三维环境的交互的可访问性的设备、方法和图形用户界面 | |
| WO2013164351A1 (en) | Device and method for processing user input | |
| KR101687552B1 (ko) | 휴대 단말기 및 그 동작 방법 | |
| EP2660695B1 (de) | Vorrichtung und verfahren zur verarbeitung von benutzereingaben | |
| CN117042997A (zh) | 具有可变外观的用户界面 | |
| US20260034882A1 (en) | Customized user interfaces | |
| KR101687550B1 (ko) | 휴대 단말기 및 그 동작 방법 | |
| CN118715499A (zh) | 用于在显示三维环境时访问计算机系统的系统功能的设备、方法和图形用户界面 | |
| WO2013058545A1 (en) | Electronic book apparatus and user interface providing method of the same |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | 17P | Request for examination filed | Effective date: 20120315 |
| | AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR |
| | DAX | Request for extension of the european patent (deleted) | |
| | R17D | Deferred search report published (corrected) | Effective date: 20140320 |
| | A4 | Supplementary search report drawn up and despatched | Effective date: 20161005 |
| | RIC1 | Information provided on ipc code assigned before grant | Ipc: G06F 3/0482 20130101ALI20160928BHEP; Ipc: G06F 3/01 20060101AFI20160928BHEP; Ipc: G06F 3/0481 20130101ALI20160928BHEP; Ipc: G06F 3/0485 20130101ALI20160928BHEP |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| | 17Q | First examination report despatched | Effective date: 20180604 |
| | REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| | 18R | Application refused | Effective date: 20191108 |