US20120249461A1 - Dedicated user interface controller for feedback responses - Google Patents
- Publication number
- US20120249461A1 (application US 13/433,069)
- Authority
- US
- United States
- Prior art keywords
- user interface
- sensor
- input
- user
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
Definitions
- the present invention relates to user interface control.
- a single host processor controls robust operating functions for a consumer electronic device.
- One function generally controlled by the host processor is “haptics,” which refers to generating tactile feedback to a user of consumer electronics products, for example, when using a touch screen.
- a haptic system produces a mechanical vibration that simulates a “click” of a mechanical actuator.
- the haptic response should follow closely in time with the user action.
- prolonged latency in the haptic response, which is the delay between the moment of user contact and the corresponding haptics response, causes a disconnect between the touch and the haptic response.
- Bundling all the operating control for a device increases latency in haptic responses as well as other UI feedback responses. This latency is due to the time the device incurs to sense a user interaction, register and decode the interaction, process it through the operating system and/or an active application, select a response to the interaction, and drive the corresponding output device. When the latency exceeds about 250 ms, the latency becomes noticeable to the user and it can be perceived as device error rather than an event that was triggered by the user's input. For example, a user may touch a first button on a touch screen and move onto another function of the device before feeling the haptic response to the first button. This temporal disconnect results in low user acceptance of haptics leading to a poor user experience.
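As a rough illustration of the pipeline described above, the latency of a host-routed response can be summed stage by stage. Every figure below except the approximately 250 ms perceptibility threshold is an assumed number for illustration, not one taken from the specification:

```python
# Hypothetical latency budget for a haptic response routed through the
# host processor. Stage timings are illustrative assumptions; only the
# ~250 ms perceptibility threshold comes from the description.
HOST_PATH_MS = {
    "sense_touch": 20,
    "register_and_decode": 30,
    "os_and_app_processing": 150,
    "select_response": 40,
    "drive_actuator": 30,
}

PERCEPTIBLE_THRESHOLD_MS = 250

total = sum(HOST_PATH_MS.values())
print(total)                             # 270
print(total > PERCEPTIBLE_THRESHOLD_MS)  # True: latency is user-noticeable
```

Under these assumed numbers the host path overshoots the threshold; a dedicated UI controller shortens the middle stages by bypassing the OS and application.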
- bundling all the operating control for a device leads to inefficient power consumption.
- the host processor when in sleep mode generally wakes regularly to check the various bundled functions. Since the host processor typically is one of the larger power consumers in a device, waking the host processor regularly to check each bundled function on the device significantly drains power.
- the inventors recognized a need in the art for user feedback responses with low latency and low power consumption.
- FIG. 1 is a simplified block diagram of a display device according to an embodiment of the present invention.
- FIG. 2( a ) is a simplified block diagram of a user interface (UI) controller according to an embodiment of the present invention.
- FIG. 2( b ) illustrates a two-dimensional workspace according to an embodiment of the present invention.
- FIG. 3( a ) illustrates a simplified flow diagram for generating a UI effect according to an embodiment of the present invention.
- FIG. 3( b ) illustrates a simplified flow diagram for generating a UI effect according to an embodiment of the present invention.
- FIG. 4 illustrates a simplified flow diagram for operating in sleep mode and to generate a UI effect according to an embodiment of the present invention.
- Embodiments of the present invention provide a user interface processing system for a device that may include at least one sensor, at least one output device, and a controller.
- the controller may include a memory, which may store instructional information, and a processor.
- the processor may be configured to receive sensor data from the sensor(s) and to interpret sensor data according to the instructional information.
- the processor may also generate a user interface feedback command and transmit the command to the at least one output device.
- the processor may report the sensor data to a host system of the device.
- the user interface controller may decrease latency in providing the feedback response to the user.
- FIG. 1 is a simplified block diagram of a haptic-enabled display device 100 according to an embodiment of the present invention.
- the device 100 may include a User Interface (UI) controller 110 , UI sensor(s) 120 with corresponding input device(s) 130 , environmental sensor(s) 140 , output device(s) 150 , and a host system 160 .
- the UI controller 110 may be coupled to the UI sensors 120 to receive user inputs and to the environmental sensors 140 to receive environmental conditions.
- the UI controller 110 also may be coupled to the output devices 150 to generate user feedback in response to the detected user inputs and environmental conditions.
- the UI controller 110 may be coupled to the host system 160 of the device.
- the UI controller 110 may receive instructions from the host system 160 and may transmit processed data from the UI sensors 120 and environmental sensors 140 to the host system 160 .
- the structure of the UI controller 110 will be described in further detail below.
- the UI sensors 120 may detect user input from their corresponding input devices 130 .
- a touch screen 130 . 1 may be provided as an input device 130 .
- the touch screen 130 . 1 may be a capacitive touch screen, a stereoscopic capacitive touch screen, or a resistive touch screen.
- the input devices 130 may also be provided as an audio-pick device such as a microphone 130 . 2 .
- the input devices 130 may be provided as an optical system including a light emitting and light pick-up device, and/or an infra-red light emitting and light pick-up device. Consequently, the UI sensors 120 may be provided as a corresponding touch sensor 120 . 1 , audio sensor 120 . 2 , optical sensor, and/or infra-red sensor.
- the UI sensor(s) 120 may identify proximity events. For example, the UI sensors 120 may detect user fingers approaching the corresponding input device(s) 130 of a touch screen such as a capacitive touch screen. The UI controller 110 may then calculate a proximity event from the data of the UI sensors 120 .
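A proximity event of this kind could be derived from a raw capacitive reading with simple thresholds. The units and threshold values below are assumptions for illustration; the patent does not specify how the calculation is performed:

```python
def classify_proximity(cap_counts, touch_threshold=800, prox_threshold=300):
    """Classify a raw capacitive reading (hypothetical units and thresholds).

    Readings at or above touch_threshold count as contact; readings between
    the two thresholds indicate a finger hovering near the screen.
    """
    if cap_counts >= touch_threshold:
        return "touch"
    if cap_counts >= prox_threshold:
        return "proximity"
    return "none"

print(classify_proximity(950))  # touch
print(classify_proximity(450))  # proximity
print(classify_proximity(100))  # none
```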
- the environmental sensors 140 may detect environmental conditions such as location, position, orientation, temperature, lighting, etc., of the device.
- the environmental sensors 140 may be provided as a temperature sensor 140 . 1 , a motion sensor 140 . 2 (e.g., digital compass sensor, GPS, accelerometer and/or gyroscope), and/or an ambient light sensor.
- the output devices 150 may generate sensory user feedback.
- the user feedback may be a haptics response to provide a vibro-tactile feedback, an audio response to provide an auditory feedback, and/or a lighting response to provide a visual feedback in response to a user input.
- the output devices may be provided as a haptics device 150 . 1 , a speaker 150 . 2 , a display screen 150 . 3 , etc.
- the haptics device 150 . 1 may be embodied as piezoelectric elements, linear resonant actuators (LRAs) and/or eccentric rotating mass actuators (ERMs). In another embodiment, multiple haptics actuators may be provided to provide plural haptic responses, for example at different parts of the device simultaneously.
- the speaker 150 . 2 may provide an auditory response.
- the display screen 150 . 3 may provide a visual response.
- the display screen 150 . 3 may be provided as a backlit LCD display with an LCD matrix, lenticular lenses, polarizers, etc.
- a touch screen may be overlaid on a face of the display.
- the host system 160 may include an operating system and application(s) that are being executed by the operating system (OS).
- the host system 160 may represent processing resources for the remainder of the device and may include central processing units, memory for storage of instructions representing an operating system and/or applications, and input/output devices such as display drivers (not shown), audio drivers, user input keys and the like.
- the host system 160 may include program instructions to govern operations of the device and manage device resources on behalf of various applications.
- the host system 160 may, for example, manage content of the display, providing icons and softkeys thereon to solicit user input through the output devices 150 .
- the host system 160 may also control the output devices 150 via the UI controller 110 or directly via the bypass route shown in FIG. 1 .
- FIG. 2( a ) is a functional block diagram of a UI controller 200 according to an embodiment of the present invention.
- the UI controller 200 may be implemented in the device 100 of FIG. 1 .
- the UI controller 200 may include input driver(s) 210 , a processor 220 , a memory 230 , and output driver(s) 240 .
- the input driver(s) 210 may receive sensor inputs (environmental and/or user interface sensors) and may generate a corresponding input signal.
- the sensor inputs may be coupled to the input driver(s) 210 via a serial interface such as a high speed I2C interface.
- the input driver(s) 210 may also control the coupled sensor operations such as when to power on, read data, etc.
- the processor 220 may control the operations of the UI controller 110 according to instructions saved in the memory 230 .
- the memory 230 may be provided as a non-volatile memory, a volatile memory such as random access memory (RAM), or a combination thereof.
- the processor 220 may include a gesture classification module 222 , a UI search module 224 , and a response search module 226 .
- the memory 230 may include gesture definition data 232 , UI map data 234 , and response patterns data 236 .
- the data may be stored as look-up-tables (LUTs).
- the gesture definition data 232 may include a LUT with possible input value(s) and corresponding gesture(s).
- the UI map data 234 may include a LUT with possible input value(s) and corresponding icon(s).
- the response patterns 236 may include a LUT with possible gesture and icon value(s), and their corresponding response drive pattern(s).
- the data may be written into the memory 230 by the host system (e.g., OS and/or applications) or may be pre-programmed.
- the gesture classification module 222 may receive the input signal from the input driver(s) 210 and may calculate a gesture from the input signal based on the gesture definition data 232 . For example, the gesture classification module 222 may compare the input signal to stored input value(s) in the gesture definition data 232 and may match the input signal to a corresponding stored gesture value. The gesture may represent a user action on the touch screen indicated by the input signal. The calculated gesture may be reported to the host system.
- the UI search module 224 may receive the input signal from the input driver(s) 210 and may calculate a UI interaction such as an icon selection from the input signal based on the UI map data 234 . For example, the UI search module 224 may compare the input signal to stored input value(s) in the UI map data 234 and may match the input signal to a corresponding UI interaction. The UI interaction may represent a user action on the touch screen indicated by the input signal. The calculated UI interaction may be reported to the host system.
- the response search module 226 may receive the calculated gesture and UI interaction, and may generate a response drive pattern based on the response patterns data 236 . For example, the response search module 226 may compare the stored gesture and UI interaction to stored gesture and UI interaction values, and may match them to a corresponding response drive pattern.
- the response drive pattern may be received by output driver(s) 240 , which, in turn, may generate corresponding drive signals that are outputted to respective output device(s) (i.e., haptic device, speaker, and/or display screen).
- the drive pattern may correspond to a haptic effect, audio effect, and/or visual effect in response to a user action to provide quick feedback to the user because the gesture definition data 232, the UI map data 234, and the response patterns data 236 are available in the UI controller.
- the device can output a response faster than if the OS and application were involved.
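The look-up chain through the gesture classification module 222, UI search module 224, and response search module 226 can be sketched as plain dictionary look-ups. All table contents below (gesture names, screen regions, drive-pattern names) are illustrative assumptions; the patent only specifies that the data may be stored as LUTs:

```python
# Hypothetical LUT contents mirroring gesture definition data 232,
# UI map data 234, and response patterns data 236.
GESTURE_DEFINITIONS = {"short_contact": "tap", "long_contact": "press"}
UI_MAP = {(0, 0, 100, 100): "icon_mail", (120, 0, 220, 100): "icon_phone"}
RESPONSE_PATTERNS = {("tap", "icon_mail"): "short_click_vibration"}

def lookup_response(x, y, contact_type):
    """Return the response drive pattern for a touch, or None."""
    gesture = GESTURE_DEFINITIONS.get(contact_type)      # module 222
    icon = None
    for (x0, y0, x1, y1), name in UI_MAP.items():        # module 224
        if x0 <= x <= x1 and y0 <= y <= y1:
            icon = name
            break
    return RESPONSE_PATTERNS.get((gesture, icon))        # module 226

print(lookup_response(50, 50, "short_contact"))   # short_click_vibration
print(lookup_response(150, 50, "short_contact"))  # None: no registered pattern
```

Because each stage is a constant-time table look-up rather than a round trip through the OS and application, the response can be driven as soon as the match is found.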
- a haptic-enabled display device may establish interactive user interface elements and provide a haptic response only when user input spatially coincides with a registered element.
- a haptics enabled device may register specific haptics response patterns with each of the interactive elements and, when user input indicates interaction with an element, the device responds with a haptic effect that is registered with it.
- FIG. 2( b ) illustrates a two-dimensional workspace 250 (i.e., UI map) for use in accordance with embodiments of the present invention.
- the workspace 250 is illustrated as including a plurality of icons 260 and buttons 270 that identify interactive elements of the workspace 250 .
- the workspace 250 may include other areas that are not designated as interactive.
- icons 260 may be spaced apart from each other by a certain separation distance.
- other areas of the display may be unoccupied by content or occupied with display data that is non-interactive.
- non-interactive areas of the device may be designated as “dead zones” (DZs) for purposes of user interaction (shown in gray in the example of FIG. 2( b )).
- the device may output haptics responses when a touch is detected in a spatial area of the workspace that is occupied by an interactive user element.
- the device may be configured to avoid outputting a haptics response when a user interacts with a dead zone of the workspace, even though the device may register a touch at the position. By avoiding outputting of haptics responses for user touches that occur in dead zones, the device improves user interaction by simulating clicks only for properly registered user interactivity.
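The dead-zone behavior above amounts to a hit test over the registered interactive elements; touches that miss every element produce no drive pattern. The element names and rectangles here are hypothetical:

```python
# Hit test over a 2-D workspace like FIG. 2(b). Regions are assumed for
# illustration; anything outside a registered element is a dead zone (DZ).
INTERACTIVE_ELEMENTS = {
    "icon_a": (10, 10, 60, 60),        # (x0, y0, x1, y1)
    "button_ok": (10, 200, 110, 240),
}

def haptic_target(x, y):
    """Return the touched element name, or None for a dead-zone touch."""
    for name, (x0, y0, x1, y1) in INTERACTIVE_ELEMENTS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None  # touch is registered, but no haptic response is driven

print(haptic_target(30, 30))    # icon_a
print(haptic_target(150, 150))  # None -> dead zone, no click simulated
```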
- FIG. 3( a ) illustrates a method 300 of generating a UI effect according to an embodiment of the present invention.
- the UI controller 110 may receive sensor input(s).
- the sensor input(s) may be from UI sensor(s) or from environmental sensor(s) or a combination thereof.
- the UI controller 110 may process the sensor data by decoding the data according to instructions stored in its memory.
- the instructions may be sent from the host system 160 and may include gesture definitions, UI map information, and response patterns corresponding to an application currently running on the device 100 .
- the UI map information may relate to a specific display level/stage in the running application.
- the UI map may identify spatial areas of the touch screen that are displaying interactive user interface elements, such as icons, buttons, menu items and the like.
- the UI controller may calculate a gesture and/or user interaction representing the sensor data.
- the instructions may also include user feedback profiles corresponding to the current display level/stage of the running application.
- the user feedback profiles may define different UI effects such as haptic effects, sound effects, and/or visual effects associated with various sensor inputs.
- the UI controller 110 may generate a UI effect drive pattern, which may be based on the processed sensor data and the stored instructions.
- the UI controller 110 may transmit the drive pattern to one or more of the output devices 150 , which, in turn, may generate the desired UI effect.
- the UI effect may be a sensory feedback to the user such as a haptic effect, sound effect, and/or visual effect.
- the UI controller 110 may generate a vibrating haptic effect accompanied with a clicking sound to provide the user confirmation of the specific user input event.
- the UI controller 110 may generate user feedback response in the form of a UI effect such as a haptic response without the need to involve the host system 160 .
- the UI controller 110 may also report the processed sensor data to the host system 160 in step 308 .
- the host system 160 may update the running application on the device according to the processed sensor data.
- the host system 160 may then send updated gesture definitions, UI maps, and/or response patterns to the UI controller 110 if the display level/stage of the running application has changed or the running application has ended in response to the processed sensor data.
- all instruction data may be sent to the UI controller 110 at the initiation of an application.
- Having a direct sensor-to-output communication path in a device advantageously reduces latency of feedback responses such as haptic events.
- delays of 250 ms between a touch and a haptics response can interfere with satisfactory user experience.
- Such delays can be incurred in systems that require a host system 160 to decode a user touch and generate a haptics event in response.
- a dedicated UI controller may reduce feedback response latency to improve user experience satisfaction.
- FIG. 3( b ) illustrates a method 350 of generating a UI effect according to another embodiment of the present invention.
- the UI controller 110 may receive UI sensor input(s).
- the UI sensor input(s) may correspond to a user input event relating to the device 100 .
- the UI sensor input(s) may come from a capacitive touch sensor, resistive touch sensor, audio sensor, optical sensor, and/or infra-red sensor.
- the user input event may identify a proximity event such as when the user's finger(s) approach a touch screen.
- the UI controller 110 may generate location coordinates for the user event and may process the UI sensor data based on the location coordinates and instructions stored in the memory 230 .
- location coordinates may be resolved as X,Y coordinates of touch along a surface of the touch screen.
- location coordinates may also include a Z coordinate corresponding to the distance from the touch screen, for example in relation to a proximity event.
- the instructions may be sent from the host system 160 and may include UI map information corresponding to an application currently running on the device 100 .
- the UI map information may relate to a specific display level/stage in the running application.
- the UI map may identify spatial areas of the touch screen that are displaying interactive user interface elements, such as icons, buttons, menu items and the like.
- the instructions may also include user feedback profiles corresponding to the current display level/stage of the running application.
- the user feedback profiles may define different UI effects such as haptic effects, sound effects, and/or visual effects associated with various sensor inputs.
- the UI controller 110 may read environmental sensor input(s) in step 356 .
- the environmental sensor input(s) may be indicative of device environmental conditions such as location, position, orientation, temperature, lighting, etc.
- the environmental sensor input(s) may be provided by an ambient light sensor, digital compass sensor, accelerometer and/or gyroscope.
- the environmental sensor input(s) may be processed based on instructions stored in the memory 230 .
- the UI controller 110 may process the UI sensor data while reading and processing environmental sensor data. The parallel processing may further reduce latency issues.
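The parallel read-and-process step might be sketched with a small thread pool; the worker functions and their placeholder results below are assumptions standing in for real sensor reads:

```python
# Sketch of processing UI sensor data while reading and processing
# environmental sensor data in parallel, as the description suggests,
# to reduce end-to-end latency. Worker bodies are placeholders.
from concurrent.futures import ThreadPoolExecutor

def process_ui_data():
    return {"x": 42, "y": 17}          # placeholder UI result

def process_environmental_data():
    return {"accel_peak_g": 1.8}       # placeholder environmental result

with ThreadPoolExecutor(max_workers=2) as pool:
    ui_future = pool.submit(process_ui_data)
    env_future = pool.submit(process_environmental_data)
    # Combine the processed UI and environmental data (step described above).
    combined = {**ui_future.result(), **env_future.result()}

print(combined)  # {'x': 42, 'y': 17, 'accel_peak_g': 1.8}
```

On an embedded UI controller this concurrency would more likely be interrupt- or DMA-driven than thread-based; the thread pool only illustrates the overlap of the two sensor paths.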
- the processed UI data and environmental data may be combined.
- the UI controller 110 may process the combined data to interpret user actions such as a gesture.
- tap strengths may be distinguished by the UI controller if the application uses tap strength levels as different user input events.
- the UI sensor data may correspond to the location of the tap
- environmental data may correspond to force from an accelerometer measurement.
- a light tap may be identified by the touch screen as a normal touch while a hard tap may be identified by the accelerometer measurements over a certain threshold level.
- a light tap may be distinguished from a hard tap.
- different tap strengths as well as other input variances may designate different gestures.
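The tap-strength distinction described above can be sketched as a threshold on the accelerometer peak. The 1.5 g figure is an illustrative assumption, since the description only refers to measurements over "a certain threshold level":

```python
def classify_tap(touch_detected, accel_peak_g, hard_tap_threshold_g=1.5):
    """Combine touch-sensor and accelerometer data to grade tap strength.

    The 1.5 g threshold is an assumed value for illustration; the touch
    sensor supplies the location, the accelerometer supplies the force.
    """
    if not touch_detected:
        return None
    return "hard_tap" if accel_peak_g >= hard_tap_threshold_g else "light_tap"

print(classify_tap(True, 0.4))  # light_tap
print(classify_tap(True, 2.1))  # hard_tap
```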
- the UI controller 110 may generate a corresponding UI effect drive pattern in step 364 .
- the UI controller 110 may generate an effect command for the drive pattern based on the processed sensor data and the stored instructions.
- the UI controller 110 may transmit the drive pattern to one or more of the output devices 150 to produce the UI effect.
- the UI effect may be a sensory feedback to the user such as a haptic effect, sound effect, and/or visual effect.
- the UI controller 110 may also report the interpreted user action to the host system 160 in step 366 .
- the host system 160 may update the running application on device according to the interpreted user action.
- the host system 160 may then send updated gesture definitions, UI maps, and/or response patterns to the UI controller 110 if the display level/stage of the running application has changed or the running application has ended in response to the interpreted user action.
- all instruction data may be sent to the UI controller 110 at the initiation of an application.
- a dedicated UI controller separate from the host system may also advantageously reduce power consumption. Having the host system process UI sensor and environmental sensor inputs is inefficient especially during sleep cycles. Typically, a host system must wake from sleep mode on a regular basis to read the coupled sensor inputs. However, according to an embodiment of the present invention the UI controller may service the sensor inputs and allow the host system to remain in sleep mode. Allowing the host system, generally a large power consumer, to remain in sleep mode for longer periods of time may reduce the overall power consumption of the device.
- FIG. 4 illustrates a method 400 of sleep mode operations and generating a UI effect according to an embodiment of the present invention.
- the UI controller 110 may be in sleep mode.
- the host system 160 may also be in sleep mode at this time.
- the UI controller 110 may wake from sleep mode. For example, the UI controller 110 may wake based on a wake up timer trigger or the like. The host system 160 may remain in sleep mode at this time.
- the UI controller 110 may check if any UI sensor inputs are triggered. For example, the UI controller 110 may check if the user has interacted with a selected object to wake the device from sleep mode.
- the UI controller 110 may check if any environmental sensor inputs are triggered in step 408 . If no environmental sensor inputs are triggered either, the UI controller 110 may return to sleep mode. However, if an environmental sensor input is triggered, the UI controller 110 may read and process the environmental data in step 410 . If necessary, a feedback output may be generated based on the environmental data in step 412 . Also, if necessary, the environmental data may be reported to the host system in step 414 in turn waking the host system. Alternatively, after processing the environmental data, the UI controller 110 may return to sleep mode if a feedback output is not deemed necessary.
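The wake-cycle decision flow of method 400 can be sketched as a pure function over the two trigger flags; the step numbers in the comments refer to FIG. 4, and the returned action names are labels invented for this sketch:

```python
def ui_controller_wake_cycle(ui_triggered, env_triggered):
    """One wake cycle of method 400, as a decision sketch (no real I/O).

    The host system stays in sleep mode unless processed data must later
    be reported to it; only the UI controller wakes on the timer.
    """
    if ui_triggered:
        return "process_ui_data"             # read/process UI data (step 416+)
    if env_triggered:
        return "process_environmental_data"  # steps 410-414
    return "return_to_sleep"                 # neither sensor input triggered

print(ui_controller_wake_cycle(False, False))  # return_to_sleep
print(ui_controller_wake_cycle(False, True))   # process_environmental_data
print(ui_controller_wake_cycle(True, True))    # process_ui_data
```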
- the UI controller 110 may read and process the UI data.
- the UI sensor input(s) may correspond to a user event relating to the device 100 .
- the UI sensor input(s) may come from a capacitive touch sensor, resistive touch sensor, audio sensor, optical sensor, and/or infra-red sensor.
- the user event may identify a proximity event such as when the user's finger(s) approach a touch screen.
- the UI controller 110 may generate location coordinates for the user event and may process the UI sensor data based on the location coordinates and instructions stored in the memory 230 .
- location coordinates may be resolved as X,Y coordinates of touch along a surface of the touch screen. Additionally, according to embodiments of the present invention, location coordinates may also be resolved as a Z coordinate corresponding to the distance from the touch screen for a proximity event.
- the instructions may be sent from the host system 160 and may include UI map information corresponding to an application running concurrently on the device 100 , in particular to a current display level/stage in the running application.
- the UI map may identify spatial areas of the touch screen that are displaying interactive user interface elements, such as icons, buttons, menu items and the like.
- the instructions may also include user feedback profiles corresponding to the current display level/stage of the running application.
- the user feedback profiles may define different UI effects such as haptic effects, sound effects, and/or visual effects associated with various sensor inputs.
- the UI controller 110 may read environmental sensor input(s) in step 418 .
- the environmental sensor input(s) may be indicative of environmental conditions of the device such as location, position, orientation, temperature, lighting, etc.
- the environmental sensor input(s) may be provided by an ambient light sensor, digital compass sensor, accelerometer and/or gyroscope.
- the environmental sensor input(s) may be processed based on instructions stored in the memory 230 .
- the UI controller 110 may process the UI sensor data while reading and processing environmental sensor data. The parallel processing may further reduce latency issues.
- step 422 the processed UI data and environmental data may be combined.
- step 424 the UI controller 110 may process the combined data to interpret user actions such as gesture(s) as described above.
- the UI controller 110 may generate a corresponding UI effect drive pattern in step 426 .
- the UI controller 110 may generate an effect command for the drive pattern based on the processed sensor data and the stored instructions.
- the UI controller 110 may transmit the drive pattern to one or more of the output devices 150 to produce the UI effect.
- the UI effect may be a sensory feedback to the user such as a haptic effect, sound effect, and/or visual effect.
- the UI controller 110 may also report the interpreted user action to the host system 160 in step 414 in turn waking the host system.
- the host system 160 may update the running application on the device according to the interpreted user action.
- the host system 160 may send updated gesture definitions, UI maps, and/or response patterns to the UI controller 110 if the display level/stage of the running application has changed or the running application has ended in response to the interpreted user action.
- all instruction data may be sent to the UI controller 110 at the initiation of an application.
- Various embodiments may be implemented using hardware elements, software elements, or a combination of both.
- hardware elements may include processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth.
- Examples of software may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints.
- Some embodiments may be implemented, for example, using a computer-readable medium or article which may store an instruction or a set of instructions that, if executed by a machine, may cause the machine to perform a method and/or operations in accordance with the embodiments.
- a machine may include, for example, any suitable processing platform, computing platform, computing device, processing device, computing system, processing system, computer, processor, or the like, and may be implemented using any suitable combination of hardware and/or software.
- the computer-readable medium or article may include, for example, any suitable type of memory unit, memory device, memory article, memory medium, storage device, storage article, storage medium and/or storage unit, for example, memory, removable or non-removable media, erasable or non-erasable media, writeable or re-writeable media, digital or analog media, hard disk, floppy disk, Compact Disc Read Only Memory (CD-ROM), Compact Disc Recordable (CD-R), Compact Disc Rewriteable (CD-RW), optical disk, magnetic media, magneto-optical media, removable memory cards or disks, various types of Digital Versatile Disc (DVD), a tape, a cassette, or the like.
- any suitable type of memory unit for example, any suitable type of memory unit, memory device, memory article, memory medium, storage device, storage article, storage medium and/or storage unit, for example, memory, removable or non-removable media, erasable or non-erasable media, writeable or re-writeable media, digital or analog media, hard disk
- the instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, encrypted code, and the like, implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
- Programmable Controllers (AREA)
Abstract
Embodiments of the present invention provide a user interface processing system for a device that may include at least one sensor, at least one output device, and a controller. The controller may include a memory, which may store instructional information, and a processor. The processor may be configured to receive sensor data from the at least one sensor and to interpret sensor data according to the instructional information. The processor may also generate a user interface feedback command and transmit the command to the at least one output device. Furthermore, the processor may report the sensor data to a host system of the device. By processing the sensor data and generating a corresponding feedback response, for example a haptic response, without the need for host system processing, the user interface controller may decrease latency in providing the feedback response to the user.
Description
- This application claims priority to provisional U.S. Patent Application Ser. No. 61/470,764, entitled “Touch Screen and Haptic Control” filed on Apr. 1, 2011, the content of which is incorporated herein in its entirety.
- The present invention relates to user interface control.
- Typically, a single host processor controls robust operating functions for a consumer electronic device. One function generally controlled by the host processor is “haptics,” which refers to generating tactile feedback to a user of consumer electronics products, for example, when using a touch screen. When a user interacts with a user interface (UI) such as a touch screen, a haptic system produces a mechanical vibration that simulates a “click” of a mechanical actuator. For a user to accept haptics, the haptic response should follow closely in time with the user action. Thus, prolonged latency in the haptic response, which is the delay between the moment of user contact and the corresponding haptics response, causes a disconnect between the touch and the haptic response.
- Bundling all the operating control for a device increases latency in haptic responses as well as other UI feedback responses. This latency is due to the time the device requires to sense a user interaction, register and decode the interaction, process it through the operating system and/or an active application, select a response to the interaction, and drive the corresponding output device. When the latency exceeds about 250 ms, it becomes noticeable to the user and can be perceived as a device error rather than an event triggered by the user's input. For example, a user may touch a first button on a touch screen and move on to another function of the device before feeling the haptic response to the first button. This temporal disconnect results in low user acceptance of haptics, leading to a poor user experience.
- Furthermore, bundling all the operating control for a device leads to inefficient power consumption. For example, the host processor when in sleep mode generally wakes regularly to check the various bundled functions. Since the host processor typically is one of the larger power consumers in a device, waking the host processor regularly to check each bundled function on the device significantly drains power.
- Hence, the inventors recognized a need in the art for user feedback responses with low latency and low power consumption.
-
FIG. 1 is a simplified block diagram of a display device according to an embodiment of the present invention. -
FIG. 2(a) is a simplified block diagram of a user interface (UI) controller according to an embodiment of the present invention. -
FIG. 2(b) illustrates a two-dimensional workspace according to an embodiment of the present invention. -
FIG. 3( a) illustrates a simplified flow diagram for generating a UI effect according to an embodiment of the present invention. -
FIG. 3( b) illustrates a simplified flow diagram for generating a UI effect according to an embodiment of the present invention. -
FIG. 4 illustrates a simplified flow diagram for operating in sleep mode and generating a UI effect according to an embodiment of the present invention. - Embodiments of the present invention provide a user interface processing system for a device that may include at least one sensor, at least one output device, and a controller. The controller may include a memory, which may store instructional information, and a processor. The processor may be configured to receive sensor data from the sensor(s) and to interpret sensor data according to the instructional information. The processor may also generate a user interface feedback command and transmit the command to the at least one output device. Furthermore, the processor may report the sensor data to a host system of the device. By processing the sensor data and generating a corresponding feedback response, for example a haptic response, without the need for host system processing, the user interface controller may decrease latency in providing the feedback response to the user.
-
FIG. 1 is a simplified block diagram of a haptic-enabled display device 100 according to an embodiment of the present invention. The device 100 may include a User Interface (UI) controller 110, UI sensor(s) 120 with corresponding input device(s) 130, environmental sensor(s) 140, output device(s) 150, and a host system 160.
- The UI controller 110 may be coupled to the UI sensors 120 to receive user inputs and to the environmental sensors 140 to receive environmental conditions. The UI controller 110 also may be coupled to the output devices 150 to generate user feedback in response to the detected user inputs and environmental conditions. Moreover, the UI controller 110 may be coupled to the host system 160 of the device. The UI controller 110 may receive instructions from the host system 160 and may transmit processed data from the UI sensors 120 and environmental sensors 140 to the host system 160. The structure of the UI controller 110 will be described in further detail below.
- The UI sensors 120 may detect user input from their corresponding input devices 130. A touch screen 130.1 may be provided as an input device 130. The touch screen 130.1 may be a capacitive touch screen, a stereoscopic capacitive touch screen, or a resistive touch screen. The input devices 130 may also be provided as an audio pick-up device such as a microphone 130.2. Moreover, the input devices 130 may be provided as an optical system including a light emitting and light pick-up device, and/or an infra-red light emitting and light pick-up device. Consequently, the UI sensors 120 may be provided as a corresponding touch sensor 120.1, audio sensor 120.2, optical sensor, and/or infra-red sensor.
- In another embodiment, the UI sensor(s) 120 may identify proximity events. For example, the UI sensors 120 may detect user fingers approaching the corresponding input device(s) 130 of a touch screen such as a capacitive touch screen. The UI controller 110 may then calculate a proximity event from the UI sensor 120 data.
- The environmental sensors 140 may detect environmental conditions of the device, such as location, position, orientation, temperature, lighting, etc. For example, the environmental sensors 140 may be provided as a temperature sensor 140.1, a motion sensor 140.2 (e.g., digital compass sensor, GPS, accelerometer and/or gyroscope), and/or an ambient light sensor.
- The output devices 150 may generate sensory user feedback. The user feedback may be a haptics response to provide vibro-tactile feedback, an audio response to provide auditory feedback, and/or a lighting response to provide visual feedback in response to a user input. The output devices may be provided as a haptics device 150.1, a speaker 150.2, a display screen 150.3, etc. The haptics device 150.1 may be embodied as piezoelectric elements, linear resonant actuators (LRAs) and/or eccentric rotating mass actuators (ERMs). In another embodiment, multiple haptics actuators may be provided to generate plural haptic responses, for example at different parts of the device simultaneously. The speaker 150.2 may provide an audio response, and the display screen 150.3 may provide a visual response. The display screen 150.3 may be provided as a backlit LCD display with an LCD matrix, lenticular lenses, polarizers, etc. A touch screen may be overlaid on the face of the display.
- The host system 160 may include an operating system (OS) and application(s) that are being executed by the operating system. The host system 160 may represent processing resources for the remainder of the device and may include central processing units, memory for storage of instructions representing an operating system and/or applications, and input/output devices such as display drivers (not shown), audio drivers, user input keys and the like. The host system 160 may include program instructions to govern operations of the device and manage device resources on behalf of various applications. The host system 160 may, for example, manage content of the display, providing icons and softkeys thereon to solicit user input through the output devices 150. The host system 160 may also control the output devices 150 via the UI controller 110 or directly via the bypass route shown in FIG. 1.
-
FIG. 2(a) is a functional block diagram of a UI controller 200 according to an embodiment of the present invention. The UI controller 200 may be implemented in the device 100 of FIG. 1. The UI controller 200 may include input driver(s) 210, a processor 220, a memory 230, and output driver(s) 240. The input driver(s) 210 may receive sensor inputs (environmental and/or user interface sensors) and may generate a corresponding input signal. The sensor inputs may be coupled to the input driver(s) 210 via a serial interface such as a high speed I2C interface. The input driver(s) 210 may also control the coupled sensor operations, such as when to power on, read data, etc.
- The processor 220 may control the operations of the UI controller 200 according to instructions saved in the memory 230. The memory 230 may be provided as a non-volatile memory, a volatile memory such as random access memory (RAM), or a combination thereof. The processor 220 may include a gesture classification module 222, a UI search module 224, and a response search module 226. The memory 230 may include gesture definition data 232, UI map data 234, and response patterns data 236. The data may be stored as look-up tables (LUTs). For example, the gesture definition data 232 may include a LUT with possible input value(s) and corresponding gesture(s). The UI map data 234 may include a LUT with possible input value(s) and corresponding icon(s). Furthermore, the response patterns 236 may include a LUT with possible gesture and icon value(s) and their corresponding response drive pattern(s). Also, the data may be written into the memory 230 by the host system (e.g., OS and/or applications) or may be pre-programmed.
- The gesture classification module 222 may receive the input signal from the input driver(s) 210 and may calculate a gesture from the input signal based on the gesture definition data 232. For example, the gesture classification module 222 may compare the input signal to stored input value(s) in the gesture definition data 232 and may match the input signal to a corresponding stored gesture value. The gesture may represent a user action on the touch screen indicated by the input signal. The calculated gesture may be reported to the host system.
- The UI search module 224 may receive the input signal from the input driver(s) 210 and may calculate a UI interaction, such as an icon selection, from the input signal based on the UI map data 234. For example, the UI search module 224 may compare the input signal to stored input value(s) in the UI map data 234 and may match the input signal to a corresponding UI interaction. The UI interaction may represent a user action on the touch screen indicated by the input signal. The calculated UI interaction may be reported to the host system.
- Further, the response search module 226 may receive the calculated gesture and UI interaction, and may generate a response drive pattern based on the response patterns data 236. For example, the response search module 226 may compare the calculated gesture and UI interaction to stored gesture and UI interaction values, and may match them to a corresponding response drive pattern. The response drive pattern may be received by the output driver(s) 240, which, in turn, may generate corresponding drive signals that are output to the respective output device(s) (i.e., haptic device, speaker, and/or display screen). For example, the drive pattern may correspond to a haptic effect, audio effect, and/or visual effect in response to a user action, providing quick feedback to the user because the UI map data 234, the UI search module 224, and the response patterns data 236 are available in the UI controller. Thus, the device can output a response faster than if the OS and an application were involved.
- According to an embodiment of the present invention, a haptic-enabled display device may establish interactive user interface elements and provide a haptic response only when user input spatially coincides with a registered element. In another embodiment, a haptics-enabled device may register specific haptics response patterns with each of the interactive elements and, when user input indicates interaction with an element, the device responds with a haptic effect that is registered with it.
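The LUT-driven lookup chain performed by modules 222, 224 and 226 can be sketched as follows. This is a minimal illustration only; the table contents, key encodings, and function names are invented for the sketch and do not appear in the patent.

```python
# Sketch of the UI controller's LUT-driven processing chain:
# input signal -> gesture (222), input signal -> UI element (224),
# (gesture, element) -> response drive pattern (226).
# All table contents and names below are hypothetical.

GESTURE_DEFINITIONS = {"short_contact": "tap", "long_contact": "press_hold"}

UI_MAP = {  # screen region (x0, y0, x1, y1) -> interactive element
    (0, 0, 100, 100): "icon_mail",
    (100, 0, 200, 100): "icon_phone",
}

RESPONSE_PATTERNS = {
    ("tap", "icon_mail"): "haptic_click",
    ("press_hold", "icon_phone"): "haptic_buzz",
}

def classify_gesture(input_signal):
    """Gesture classification: match the input signal to a stored gesture value."""
    return GESTURE_DEFINITIONS.get(input_signal)

def search_ui(x, y):
    """UI search: match touch coordinates to the element occupying that region."""
    for (x0, y0, x1, y1), element in UI_MAP.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return element
    return None

def search_response(gesture, element):
    """Response search: look up the drive pattern for a gesture/element pair."""
    return RESPONSE_PATTERNS.get((gesture, element))

# A touch decoded as a short contact at (50, 50) resolves to a haptic click
# without any host-system involvement.
pattern = search_response(classify_gesture("short_contact"), search_ui(50, 50))
```

Because all three lookups are local table reads, the response path never leaves the controller; the host system is only notified afterward.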
-
FIG. 2(b) illustrates a two-dimensional workspace 250 (i.e., a UI map) for use in accordance with embodiments of the present invention. The workspace 250 is illustrated as including a plurality of icons 260 and buttons 270 that identify interactive elements of the workspace 250. The workspace 250 may include other areas that are not designated as interactive. For example, icons 260 may be spaced apart from each other by a certain separation distance. Further, other areas of the display may be unoccupied by content or occupied with display data that is non-interactive. Thus, non-interactive areas of the device may be designated as “dead zones” (DZs) for purposes of user interaction (shown in gray in the example of FIG. 2(b)).
- In an embodiment, the device may output haptics responses when a touch is detected in a spatial area of the workspace that is occupied by an interactive user element. In an embodiment, the device may be configured to avoid outputting a haptics response when a user interacts with a dead zone of the workspace, even though the device may register a touch at the position. By avoiding outputting haptics responses for user touches that occur in dead zones, the device improves user interaction by simulating clicks only for properly registered user interactivity.
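The dead-zone behavior described above can be illustrated with a simple hit test. The element rectangles and names below are hypothetical and not taken from FIG. 2(b):

```python
# Sketch: suppress haptic output for touches that land in dead zones.
# Element rectangles are (x0, y0, x1, y1); values are illustrative only.

INTERACTIVE_ELEMENTS = {
    "icon_a": (10, 10, 60, 60),
    "button_ok": (10, 200, 110, 240),
}

def hit_test(x, y):
    """Return the interactive element under (x, y), or None for a dead zone."""
    for name, (x0, y0, x1, y1) in INTERACTIVE_ELEMENTS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def haptic_for_touch(x, y):
    """Drive a haptic effect only when the touch coincides with a registered element."""
    element = hit_test(x, y)
    if element is None:
        return None  # touch is still registered, but no haptic is produced
    return "haptic:" + element  # each element may register its own pattern
```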
-
FIG. 3(a) illustrates a method 300 of generating a UI effect according to an embodiment of the present invention. In step 302, the UI controller 110 may receive sensor input(s). The sensor input(s) may be from UI sensor(s), from environmental sensor(s), or a combination thereof.
- In step 304, the UI controller 110 may process the sensor data by decoding the data according to instructions stored in its memory. The instructions may be sent from the host system 160 and may include gesture definitions, UI map information, and response patterns corresponding to an application currently running on the device 100. For example, the UI map information may relate to a specific display level/stage in the running application. The UI map may identify spatial areas of the touch screen that are displaying interactive user interface elements, such as icons, buttons, menu items and the like. The UI controller may calculate a gesture and/or user interaction representing the sensor data.
- The instructions may also include user feedback profiles corresponding to the current display level/stage of the running application. For example, the user feedback profiles may define different UI effects such as haptic effects, sound effects, and/or visual effects associated with various sensor inputs.
- In step 306, the UI controller 110 may generate a UI effect drive pattern, which may be based on the processed sensor data and the stored instructions. The UI controller 110 may transmit the drive pattern to one or more of the output devices 150, which, in turn, may generate the desired UI effect. As described above, the UI effect may be a sensory feedback to the user such as a haptic effect, sound effect, and/or visual effect. For example, in response to a sensed user input event of touching an icon, the UI controller 110 may generate a vibrating haptic effect accompanied by a clicking sound to provide the user confirmation of the specific user input event. Thus, the UI controller 110 may generate a user feedback response in the form of a UI effect, such as a haptic response, without the need to involve the host system 160.
- The UI controller 110 may also report the processed sensor data to the host system 160 in step 308. The host system 160 may update the running application on the device according to the processed sensor data. The host system 160 may then send updated gesture definitions, UI maps, and/or response patterns to the UI controller 110 if the display level/stage of the running application has changed or the running application has ended in response to the processed sensor data. In another embodiment, all instruction data may be sent to the UI controller 110 at the initiation of an application.
- Having a direct sensor-to-output communication path in a device advantageously reduces latency of feedback responses such as haptic events. As noted, during operation, delays of 250 ms between a touch and a haptics response can interfere with satisfactory user experience. Such delays can be incurred in systems that require a host system 160 to decode a user touch and generate a haptics event in response. During high volume data entry, such as typing, texting or cursor navigation, users enter data so quickly that their fingers may have touched and departed a given touch screen location before a 250 ms latency haptics event is generated. Thus, a dedicated UI controller according to embodiments of the present invention as described herein may reduce feedback response latency to improve user experience satisfaction.
-
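The latency advantage can be made concrete with a back-of-the-envelope budget. The stage timings below are hypothetical illustrations; only the roughly 250 ms perceptibility threshold comes from the text:

```python
# Illustrative latency budgets for the two response paths. All stage timings
# are invented for this sketch; only the ~250 ms threshold is from the text.

HOST_PATH_MS = {
    "sense_touch": 20,
    "register_decode": 30,
    "os_dispatch": 120,
    "app_select_response": 80,
    "drive_output": 30,
}

DEDICATED_PATH_MS = {
    "sense_touch": 20,
    "lut_lookup": 5,       # gesture/UI-map/response-pattern lookups in the controller
    "drive_output": 30,
}

PERCEPTIBLE_MS = 250  # delay beyond which the lag becomes noticeable to the user

host_latency = sum(HOST_PATH_MS.values())            # 280 ms: above the threshold
dedicated_latency = sum(DEDICATED_PATH_MS.values())  # 55 ms: well under it
```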
FIG. 3(b) illustrates a method 350 of generating a UI effect according to another embodiment of the present invention. In step 352, the UI controller 110 may receive UI sensor input(s). The UI sensor input(s) may correspond to a user input event relating to the device 100. For example, the UI sensor input(s) may come from a capacitive touch sensor, resistive touch sensor, audio sensor, optical sensor, and/or infra-red sensor. In one embodiment, the user input event may identify a proximity event, such as when the user's finger(s) approach a touch screen.
- In step 354, the UI controller 110 may generate location coordinates for the user event and may process the UI sensor data based on the location coordinates and instructions stored in the memory 230. Typically, location coordinates may be resolved as X,Y coordinates of touch along a surface of the touch screen. Additionally, according to an embodiment of the present invention, location coordinates may also include a Z coordinate corresponding to the distance from the touch screen, for example in relation to a proximity event.
- As described above, the instructions may be sent from the host system 160 and may include UI map information corresponding to an application currently running on the device 100. In particular, the UI map information may relate to a specific display level/stage in the running application. The UI map may identify spatial areas of the touch screen that are displaying interactive user interface elements, such as icons, buttons, menu items and the like. The instructions may also include user feedback profiles corresponding to the current display level/stage of the running application. For example, the user feedback profiles may define different UI effects such as haptic effects, sound effects, and/or visual effects associated with various sensor inputs.
- Further, in response to receiving UI sensor input(s), the UI controller 110 may read environmental sensor input(s) in step 356. The environmental sensor input(s) may be indicative of device environmental conditions such as location, position, orientation, temperature, lighting, etc. For example, the environmental sensor input(s) may be provided by an ambient light sensor, digital compass sensor, accelerometer and/or gyroscope.
- In step 358, the environmental sensor input(s) may be processed based on instructions stored in the memory 230. As shown in FIG. 3(b), the UI controller 110 may process the UI sensor data while reading and processing environmental sensor data. The parallel processing may further reduce latency issues.
- In step 360, the processed UI data and environmental data may be combined. In step 362, the UI controller 110 may process the combined data to interpret user actions such as a gesture. For example, tap strengths may be distinguished by the UI controller if the application uses tap strength levels as different user input events. The UI sensor data may correspond to the location of the tap, and the environmental data may correspond to force from an accelerometer measurement. For example, a light tap may be identified by the touch screen as a normal touch, while a hard tap may be identified by accelerometer measurements over a certain threshold level. Thus, a light tap may be distinguished from a hard tap. Moreover, different tap strengths as well as other input variances may designate different gestures.
- Based on the interpreted user action, the UI controller 110 may generate a corresponding UI effect drive pattern in step 364. The UI controller 110 may generate an effect command for the drive pattern based on the processed sensor data and the stored instructions. The UI controller 110 may transmit the drive pattern to one or more of the output devices 150 to produce the UI effect. As described above, the UI effect may be a sensory feedback to the user such as a haptic effect, sound effect, and/or visual effect.
- Furthermore, the UI controller 110 may also report the interpreted user action to the host system 160 in step 366. The host system 160 may update the running application on the device according to the interpreted user action. The host system 160 may then send updated gesture definitions, UI maps, and/or response patterns to the UI controller 110 if the display level/stage of the running application has changed or the running application has ended in response to the interpreted user action. In another embodiment, all instruction data may be sent to the UI controller 110 at the initiation of an application.
- A dedicated UI controller separate from the host system, according to embodiments of the present invention described herein, may also advantageously reduce power consumption. Having the host system process UI sensor and environmental sensor inputs is inefficient, especially during sleep cycles. Typically, a host system must wake from sleep mode on a regular basis to read the coupled sensor inputs. However, according to an embodiment of the present invention, the UI controller may service the sensor inputs and allow the host system to remain in sleep mode. Allowing the host system, generally a large power consumer, to remain in sleep mode for longer periods of time may reduce the overall power consumption of the device.
-
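The tap-strength interpretation described in method 350 can be sketched by fusing the two sensor streams. The threshold value and names below are hypothetical; the patent only describes accelerometer measurements being compared against a certain threshold level:

```python
# Sketch: distinguishing a light tap from a hard tap by combining UI sensor
# data (tap location) with environmental sensor data (accelerometer force).
# HARD_TAP_THRESHOLD is an invented value for illustration.

HARD_TAP_THRESHOLD = 2.5  # accelerometer magnitude, arbitrary units

def interpret_tap(touch_xy, accel_magnitude):
    """Combine touch location and accelerometer force into a gesture."""
    gesture = "hard_tap" if accel_magnitude > HARD_TAP_THRESHOLD else "light_tap"
    return {"gesture": gesture, "location": touch_xy}
```

Because different tap strengths resolve to different gestures, each can be matched to its own entry in the response pattern LUT.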
FIG. 4 illustrates a method 400 of sleep mode operations and generating a UI effect according to an embodiment of the present invention. In step 402, the UI controller 110 may be in sleep mode. The host system 160 may also be in sleep mode at this time.
- At step 404, the UI controller 110 may wake from sleep mode. For example, the UI controller 110 may wake based on a wake-up timer trigger or the like. The host system 160 may remain in sleep mode at this time.
- In step 406, the UI controller 110 may check if any UI sensor inputs are triggered. For example, the UI controller 110 may check if the user has interacted with a selected object to wake the device from sleep mode.
- If no UI sensor inputs are triggered in step 406, the UI controller 110 may check if any environmental sensor inputs are triggered in step 408. If no environmental sensor inputs are triggered either, the UI controller 110 may return to sleep mode. However, if an environmental sensor input is triggered, the UI controller 110 may read and process the environmental data in step 410. If necessary, a feedback output may be generated based on the environmental data in step 412. Also, if necessary, the environmental data may be reported to the host system in step 414, in turn waking the host system. Alternatively, after processing the environmental data, the UI controller 110 may return to sleep mode if a feedback output is not deemed necessary.
- If a UI sensor input(s) is triggered in step 406, the UI controller 110 may read and process the UI data. The UI sensor input(s) may correspond to a user event relating to the device 100. For example, the UI sensor input(s) may come from a capacitive touch sensor, resistive touch sensor, audio sensor, optical sensor, and/or infra-red sensor. In one embodiment, the user event may identify a proximity event, such as when the user's finger(s) approach a touch screen.
- In step 416, the UI controller 110 may generate location coordinates for the user event and may process the UI sensor data based on the location coordinates and instructions stored in the memory 230. Typically, location coordinates may be resolved as X,Y coordinates of touch along a surface of the touch screen. Additionally, according to embodiments of the present invention, location coordinates may also be resolved as a Z coordinate corresponding to the distance from the touch screen for a proximity event.
- As described above, the instructions may be sent from the host system 160 and may include UI map information corresponding to an application running concurrently on the device 100, in particular to a current display level/stage in the running application. The UI map may identify spatial areas of the touch screen that are displaying interactive user interface elements, such as icons, buttons, menu items and the like. The instructions may also include user feedback profiles corresponding to the current display level/stage of the running application. For example, the user feedback profiles may define different UI effects such as haptic effects, sound effects, and/or visual effects associated with various sensor inputs.
- Further, in response to receiving UI sensor input(s), the UI controller 110 may read environmental sensor input(s) in step 418. The environmental sensor input(s) may be indicative of environmental conditions of the device such as location, position, orientation, temperature, lighting, etc. For example, the environmental sensor input(s) may be provided by an ambient light sensor, digital compass sensor, accelerometer and/or gyroscope.
- In step 420, the environmental sensor input(s) may be processed based on instructions stored in the memory 230. As shown, the UI controller 110 may process the UI sensor data while reading and processing environmental sensor data. The parallel processing may further reduce latency issues.
- In step 422, the processed UI data and environmental data may be combined. In step 424, the UI controller 110 may process the combined data to interpret user actions such as gesture(s) as described above.
- Based on the interpreted user action, the UI controller 110 may generate a corresponding UI effect drive pattern in step 426. The UI controller 110 may generate an effect command for the drive pattern based on the processed sensor data and the stored instructions. The UI controller 110 may transmit the drive pattern to one or more of the output devices 150 to produce the UI effect. As described above, the UI effect may be a sensory feedback to the user such as a haptic effect, sound effect, and/or visual effect.
- Furthermore, the UI controller 110 may also report the interpreted user action to the host system 160 in step 414, in turn waking the host system. The host system 160 may update the running application on the device according to the interpreted user action. The host system 160 may send updated gesture definitions, UI maps, and/or response patterns to the UI controller 110 if the display level/stage of the running application has changed or the running application has ended in response to the interpreted user action. In another embodiment, all instruction data may be sent to the UI controller 110 at the initiation of an application.
- Those skilled in the art may appreciate from the foregoing description that the present invention may be implemented in a variety of forms, and that the various embodiments may be implemented alone or in combination. Therefore, while the embodiments of the present invention have been described in connection with particular examples thereof, the true scope of the embodiments and/or methods of the present invention should not be so limited since other modifications will become apparent to the skilled practitioner upon a study of the drawings, specification, and following claims.
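The wake-and-check flow of method 400 can be summarized in a short dispatch sketch. Function and return-value names are illustrative, not from the patent:

```python
# Sketch of one wake-up timer tick in method 400: the UI controller checks its
# sensors and returns to sleep when nothing is triggered, so the host system
# (a larger power consumer) can stay asleep.

def wake_cycle(ui_triggered, env_triggered, env_needs_feedback=False):
    """Return the action taken by the UI controller for one wake-up tick."""
    if ui_triggered:
        return "process_ui_input"        # steps 416-426: full UI processing path
    if env_triggered:
        # steps 410-412: handle environmental data, with feedback only if needed
        return "generate_feedback" if env_needs_feedback else "return_to_sleep"
    return "return_to_sleep"             # nothing triggered: host stays asleep
```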
- Various embodiments may be implemented using hardware elements, software elements, or a combination of both. Examples of hardware elements may include processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth. Examples of software may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints.
- Some embodiments may be implemented, for example, using a computer-readable medium or article which may store an instruction or a set of instructions that, if executed by a machine, may cause the machine to perform a method and/or operations in accordance with the embodiments. Such a machine may include, for example, any suitable processing platform, computing platform, computing device, processing device, computing system, processing system, computer, processor, or the like, and may be implemented using any suitable combination of hardware and/or software. The computer-readable medium or article may include, for example, any suitable type of memory unit, memory device, memory article, memory medium, storage device, storage article, storage medium and/or storage unit, for example, memory, removable or non-removable media, erasable or non-erasable media, writeable or re-writeable media, digital or analog media, hard disk, floppy disk, Compact Disc Read Only Memory (CD-ROM), Compact Disc Recordable (CD-R), Compact Disc Rewriteable (CD-RW), optical disk, magnetic media, magneto-optical media, removable memory cards or disks, various types of Digital Versatile Disc (DVD), a tape, a cassette, or the like. The instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, encrypted code, and the like, implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.
Claims (25)
1. A user interface processing system for a device, comprising:
at least one sensor;
at least one output device; and
a controller, including
a memory to store instructional information, and
a processor to receive sensor data from the at least one sensor, interpret the sensor data according to the instructional information, generate a user interface feedback command, transmit the command to the at least one output device, and report the sensor data to a host system of the device.
2. The user interface processing system of claim 1 , wherein the instructional information includes gesture definitions and a UI map, and wherein to interpret the sensor data includes calculating a user interaction from the sensor data based on the UI map and calculating a gesture from the sensor data based on the gesture definitions.
3. The user interface processing system of claim 2 , wherein to report the sensor data includes transmitting the calculated user interaction and calculated gesture to the host system.
4. The user interface processing system of claim 2, wherein the UI map includes an active area and a deadzone, wherein the user interface feedback command is generated if the user interaction is with the active area and no user interface feedback command is generated if the user interaction is with the deadzone.
5. The user interface processing system of claim 1 , wherein the at least one sensor is a touch screen sensor.
6. The user interface processing system of claim 5 , further comprising an input device coupled to the touch screen sensor.
7. The user interface processing system of claim 5 , wherein the touch screen sensor is a proximity sensor.
8. The user interface processing system of claim 5 , wherein the touch screen sensor is a force sensor.
9. The user interface processing system of claim 1 , wherein the at least one sensor is an environmental sensor.
10. The user interface processing system of claim 9 , wherein the environmental sensor is an ambient light sensor.
11. The user interface processing system of claim 9 , wherein the environmental sensor is a digital compass sensor.
12. The user interface processing system of claim 9 , wherein the environmental sensor is an accelerometer or gyroscope.
13. The user interface processing system of claim 1 , wherein the at least one output device is a haptics driver.
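The interpretation step recited in claims 2 and 3 resolves the same raw sensor data two ways: into a user interaction against the UI map, and into a gesture against the gesture definitions. A minimal sketch, in which the gesture thresholds, gesture names, and rectangle-based UI map are all assumptions for illustration rather than details from the patent:

```python
def calculate_gesture(points, min_travel=30):
    """Classify a touch trace as 'tap', 'swipe_right', or 'swipe_left'.

    `points` is a list of (x, y) samples; `min_travel` is an assumed
    horizontal-distance threshold separating taps from swipes.
    """
    dx = points[-1][0] - points[0][0]
    if abs(dx) < min_travel:
        return "tap"
    return "swipe_right" if dx > 0 else "swipe_left"


def calculate_interaction(point, ui_map):
    """Map a coordinate onto a UI element, or None for the claimed deadzone.

    `ui_map` maps element names to (x0, y0, x1, y1) bounding rectangles.
    """
    for name, (x0, y0, x1, y1) in ui_map.items():
        if x0 <= point[0] <= x1 and y0 <= point[1] <= y1:
            return name
    return None
```

Per claim 3, both results (the calculated interaction and the calculated gesture) would then be transmitted to the host system; per claim 4, a `None` interaction suppresses the feedback command.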
14. A display controller for a device, comprising:
an interface;
a memory storing data processing instructions; and
a processor configured to:
receive a sensor input from the interface;
process the sensor input based on the stored data processing instructions;
generate a user interface output command;
transmit the user interface output command to an effect generating device; and
transmit the processed sensor input data to a host system of the device.
15. The display controller of claim 14, wherein the interface includes a plurality of I2C connectors that are configured to be read in parallel.
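The parallel-read arrangement of claim 15 can be modeled as polling several sensor connections concurrently rather than one after another. In this hypothetical sketch each "connector" is simply a callable returning a sample; a real implementation would go through an I2C bus driver, which is outside the scope of the claim language:

```python
from concurrent.futures import ThreadPoolExecutor


def read_connectors_in_parallel(connectors):
    """Read every connector concurrently; return samples in connector order.

    `connectors` is a list of zero-argument callables standing in for
    per-connector read operations.
    """
    with ThreadPoolExecutor(max_workers=len(connectors)) as pool:
        # map() preserves input order even though the reads overlap in time.
        return list(pool.map(lambda read: read(), connectors))
```

The benefit over sequential polling is latency: the slowest single read, rather than the sum of all reads, bounds the sampling time.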
16. A method of generating user interface effects, comprising:
receiving at least one sensor input;
processing the at least one sensor input;
generating a user interface effect based on the processed at least one sensor input and stored instructions; and
reporting the processed at least one sensor input to a host processor.
17. The method of claim 16 , wherein the at least one sensor input includes a user interaction sensor input and an environmental sensor input.
18. The method of claim 17 , wherein the processing includes separately processing the user interaction sensor input based on location coordinates and a user interface map received from the host processor, and the environmental sensor input.
19. The method of claim 16, further comprising processing a gesture based on the at least one sensor input.
20. The method of claim 16 , wherein the user interface effect is a haptics effect.
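Claims 17 and 18 recite processing a user-interaction input and an environmental input separately before the effect is generated. One way to picture this is an effect whose pattern comes from the touch branch and whose strength comes from the environmental branch; the ambient-light policy, lux threshold, and names below are assumptions for the sketch, not elements of the claims:

```python
def generate_effect(touch_xy, ambient_lux, ui_map):
    """Return a (pattern, strength) haptic effect, or None outside any element.

    `ui_map` maps element names to (x0, y0, x1, y1) rectangles; `ambient_lux`
    is an environmental sensor reading processed separately from the touch.
    """
    # User-interaction branch: resolve the touch against the UI map.
    element = None
    for name, (x0, y0, x1, y1) in ui_map.items():
        if x0 <= touch_xy[0] <= x1 and y0 <= touch_xy[1] <= y1:
            element = name
            break
    if element is None:
        return None
    # Environmental branch: an assumed policy where a brighter room gets a
    # stronger pulse so the feedback is still noticed.
    strength = 0.5 if ambient_lux < 100 else 1.0
    return (element + "_pulse", strength)
```

Per claim 16, the processed inputs (here, the resolved element and the light reading) would also be reported to the host processor alongside generating the effect.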
21. A method of operating a user interface system in an electronic device, comprising:
placing a user interface controller in sleep mode;
after a predetermined time, waking the user interface controller from sleep mode;
checking a user interface sensor input trigger;
if triggered, reading an environmental sensor input, generating a user interface effect output, and reporting to a host system of the electronic device;
if not triggered, returning to sleep mode unless the environmental sensor input is triggered.
22. The method of claim 21, further comprising, if the environmental sensor input is triggered, generating a user interface effect output and reporting to the host system.
23. The method of claim 21, wherein the user interface effect output is a haptics effect.
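The branch structure of one wake cycle in claims 21 and 22 can be sketched as a single decision function. The function name, the string tokens, and the list-based output/report channels are all illustrative stand-ins; the sleep timer itself is omitted:

```python
def run_cycle(ui_triggered, env_triggered, effects, reports):
    """One wake cycle: return 'handled' or 'sleep' per the claimed branches.

    `effects` and `reports` stand in for the effect output and the report
    to the host system of the electronic device.
    """
    if ui_triggered:
        effects.append("ui_effect")   # generate user interface effect output
        reports.append("ui_event")    # report to the host system
        return "handled"
    if env_triggered:
        effects.append("env_effect")  # claim 22 branch: environmental trigger
        reports.append("env_event")
        return "handled"
    return "sleep"                    # neither trigger: return to sleep mode
```

A surrounding loop would call `run_cycle` each time the predetermined timer wakes the controller, going back to sleep whenever it returns `'sleep'`.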
24. A haptics driver, comprising:
an input interface for connection to a touch screen sensor;
an output interface for connection to a haptic device;
a host device interface for connection to a host device;
a memory for storage of:
a UI map representing displayable user interface element(s) associated with the touch screen sensor,
response pattern(s) associated with the user interface element(s), the user interface element(s) and response pattern(s) to be received via the host device interface and stored in the memory; and
a processor to interpret sensor input data received at the input interface, identify whether a user interface element is indicated by the sensor input data and, when a user interface element is so indicated, output an associated response pattern to the output interface.
25. The haptics driver of claim 24, further comprising, when a user interface element is so indicated, outputting data identifying the indicated user interface element via the host device interface.
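The three interfaces of the claimed haptics driver can be modeled end to end: the UI map and response patterns arrive over the host device interface, each touch is resolved to a response pattern on the output interface, and the indicated element's identity goes back over the host device interface per claim 25. Every name in this sketch is an assumption made for illustration:

```python
class HapticsDriver:
    """A toy model of the claimed driver; lists stand in for its interfaces."""

    def __init__(self):
        self.ui_map = {}     # element name -> (x0, y0, x1, y1) rectangle
        self.responses = {}  # element name -> response pattern
        self.output = []     # stands in for the haptic output interface
        self.host = []       # stands in for the host device interface

    def load_from_host(self, ui_map, responses):
        # Claim 24: UI map and response patterns received via the host
        # device interface and stored in memory.
        self.ui_map, self.responses = ui_map, responses

    def on_touch(self, x, y):
        # Interpret sensor input data and identify whether a user interface
        # element is indicated.
        for name, (x0, y0, x1, y1) in self.ui_map.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                self.output.append(self.responses[name])  # response pattern out
                self.host.append(name)                    # claim 25: identify element
                return name
        return None  # no element indicated: no response pattern output
```

Because the map and patterns are pushed from the host up front, the per-touch path involves only local lookups, which is what lets such a driver respond without waking the host.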
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/433,069 US20120249461A1 (en) | 2011-04-01 | 2012-03-28 | Dedicated user interface controller for feedback responses |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201161470764P | 2011-04-01 | 2011-04-01 | |
| US13/433,069 US20120249461A1 (en) | 2011-04-01 | 2012-03-28 | Dedicated user interface controller for feedback responses |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20120249461A1 true US20120249461A1 (en) | 2012-10-04 |
Family
ID=46926543
Family Applications (4)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/433,105 Active 2032-08-25 US8937603B2 (en) | 2011-04-01 | 2012-03-28 | Method and apparatus for haptic vibration response profiling and feedback |
| US13/433,069 Abandoned US20120249461A1 (en) | 2011-04-01 | 2012-03-28 | Dedicated user interface controller for feedback responses |
| US13/434,677 Abandoned US20120249475A1 (en) | 2011-04-01 | 2012-03-29 | 3d user interface control |
| US13/434,623 Abandoned US20120249474A1 (en) | 2011-04-01 | 2012-03-29 | Proximity and force detection for haptic effect generation |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/433,105 Active 2032-08-25 US8937603B2 (en) | 2011-04-01 | 2012-03-28 | Method and apparatus for haptic vibration response profiling and feedback |
Family Applications After (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/434,677 Abandoned US20120249475A1 (en) | 2011-04-01 | 2012-03-29 | 3d user interface control |
| US13/434,623 Abandoned US20120249474A1 (en) | 2011-04-01 | 2012-03-29 | Proximity and force detection for haptic effect generation |
Country Status (2)
| Country | Link |
|---|---|
| US (4) | US8937603B2 (en) |
| WO (4) | WO2012135373A2 (en) |
Cited By (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130009915A1 (en) * | 2011-07-08 | 2013-01-10 | Nokia Corporation | Controlling responsiveness to user inputs on a touch-sensitive display |
| US20130117267A1 (en) * | 2011-11-03 | 2013-05-09 | Kirill Buryak | Customer support solution recommendation system |
| US20130304240A1 (en) * | 2012-05-09 | 2013-11-14 | Bristol, Inc. d/b/a Remote Automated Solutions | Methods and apparatus to display information via a process control device |
| US20130318437A1 (en) * | 2012-05-22 | 2013-11-28 | Samsung Electronics Co., Ltd. | Method for providing ui and portable apparatus applying the same |
| US20140092003A1 (en) * | 2012-09-28 | 2014-04-03 | Min Liu | Direct haptic feedback |
| US20140191984A1 (en) * | 2013-01-04 | 2014-07-10 | Samsung Electronics Co., Ltd. | Display system with concurrent mult-mode control mechanism and method of operation thereof |
| US20150346821A1 (en) * | 2012-09-11 | 2015-12-03 | Nec Casio Mobile Communications, Ltd. | Electronic device, method for controlling electronic device, and recording medium |
| US9400570B2 (en) | 2014-11-14 | 2016-07-26 | Apple Inc. | Stylus with inertial sensor |
| US9575573B2 (en) | 2014-12-18 | 2017-02-21 | Apple Inc. | Stylus with touch sensor |
| EP3019943A4 (en) * | 2013-07-12 | 2017-05-31 | Tactual Labs Co. | Reducing control response latency with defined cross-control behavior |
| US9678571B1 (en) | 2016-09-06 | 2017-06-13 | Apple Inc. | Devices, methods, and graphical user interfaces for generating tactile outputs |
| US9817489B2 (en) | 2014-01-27 | 2017-11-14 | Apple Inc. | Texture capture stylus and method |
| US9830784B2 (en) | 2014-09-02 | 2017-11-28 | Apple Inc. | Semantic framework for variable haptic output |
| US9864432B1 (en) | 2016-09-06 | 2018-01-09 | Apple Inc. | Devices, methods, and graphical user interfaces for haptic mixing |
| DK201670725A1 (en) * | 2016-09-06 | 2018-03-19 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs |
| US9984539B2 (en) | 2016-06-12 | 2018-05-29 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US9996157B2 (en) | 2016-06-12 | 2018-06-12 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US11314330B2 (en) | 2017-05-16 | 2022-04-26 | Apple Inc. | Tactile feedback for locked device user interfaces |
Families Citing this family (128)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5648207B2 (en) * | 2009-09-04 | 2015-01-07 | 現代自動車株式会社 | Vehicle control device |
| JP2012256214A (en) * | 2011-06-09 | 2012-12-27 | Sony Corp | Information processing device, information processing method, and program |
| US10191546B2 (en) * | 2011-06-20 | 2019-01-29 | Immersion Corporation | Haptic theme framework |
| US9195351B1 (en) * | 2011-09-28 | 2015-11-24 | Amazon Technologies, Inc. | Capacitive stylus |
| US9591250B2 (en) * | 2011-10-19 | 2017-03-07 | Thomson Licensing | Remote control with feedback for blind navigation |
| US8633911B2 (en) * | 2011-12-14 | 2014-01-21 | Synaptics Incorporated | Force sensing input device and method for determining force information |
| US8791799B2 (en) | 2012-02-01 | 2014-07-29 | Immersion Corporation | Eccentric rotating mass actuator optimization for haptic effects |
| KR101873759B1 (en) * | 2012-04-10 | 2018-08-02 | 엘지전자 주식회사 | Display apparatus and method for controlling thereof |
| US10281986B2 (en) * | 2012-05-03 | 2019-05-07 | Georgia Tech Research Corporation | Methods, controllers and computer program products for accessibility to computing devices |
| WO2013169299A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Haptic feedback based on input progression |
| WO2013170099A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Calibration of haptic feedback systems for input devices |
| WO2013188307A2 (en) | 2012-06-12 | 2013-12-19 | Yknots Industries Llc | Haptic electromagnetic actuator |
| US9158405B2 (en) * | 2012-06-15 | 2015-10-13 | Blackberry Limited | Electronic device including touch-sensitive display and method of controlling same |
| US9886116B2 (en) | 2012-07-26 | 2018-02-06 | Apple Inc. | Gesture and touch input detection through force sensing |
| US9874972B2 (en) * | 2012-09-25 | 2018-01-23 | Synaptics Incorporated | Systems and methods for decoupling image generation rate from reporting rate in capacitive sensing |
| KR20140047897A (en) * | 2012-10-15 | 2014-04-23 | 삼성전자주식회사 | Method for providing for touch effect and an electronic device thereof |
| US9589538B2 (en) * | 2012-10-17 | 2017-03-07 | Perceptive Pixel, Inc. | Controlling virtual objects |
| JP6498863B2 (en) | 2012-12-13 | 2019-04-10 | イマージョン コーポレーションImmersion Corporation | Haptic system with increased LRA bandwidth |
| US9202350B2 (en) * | 2012-12-19 | 2015-12-01 | Nokia Technologies Oy | User interfaces and associated methods |
| KR102044826B1 (en) * | 2013-01-02 | 2019-11-14 | 삼성전자 주식회사 | Method for providing function of mouse and terminal implementing the same |
| CN103970291B (en) * | 2013-01-31 | 2018-08-14 | 索尼公司 | Mobile terminal |
| US9304587B2 (en) | 2013-02-13 | 2016-04-05 | Apple Inc. | Force sensing mouse |
| US10578499B2 (en) * | 2013-02-17 | 2020-03-03 | Microsoft Technology Licensing, Llc | Piezo-actuated virtual buttons for touch surfaces |
| GB2513884B (en) | 2013-05-08 | 2015-06-17 | Univ Bristol | Method and apparatus for producing an acoustic field |
| WO2014190018A1 (en) * | 2013-05-21 | 2014-11-27 | Stanley Innovation, Inc. | A system and method for a human machine interface utilizing near-field quasi-state electrical field sensing technology |
| US10591992B2 (en) * | 2013-06-17 | 2020-03-17 | Lenovo (Singapore) Pte. Ltd. | Simulation of control areas on touch surface using haptic feedback |
| WO2014207842A1 (en) * | 2013-06-26 | 2014-12-31 | 富士通株式会社 | Drive device, electronic apparatus, and drive control program |
| JP6032364B2 (en) | 2013-06-26 | 2016-11-24 | 富士通株式会社 | DRIVE DEVICE, ELECTRONIC DEVICE, AND DRIVE CONTROL PROGRAM |
| US11229239B2 (en) * | 2013-07-19 | 2022-01-25 | Rai Strategic Holdings, Inc. | Electronic smoking article with haptic feedback |
| US9520036B1 (en) * | 2013-09-18 | 2016-12-13 | Amazon Technologies, Inc. | Haptic output generation with dynamic feedback control |
| US9213408B2 (en) * | 2013-10-08 | 2015-12-15 | Immersion Corporation | Generating haptic effects while minimizing cascading |
| TWI606386B (en) * | 2013-10-31 | 2017-11-21 | 富智康(香港)有限公司 | Page switching system, touch device and page switching method |
| JP2015121983A (en) * | 2013-12-24 | 2015-07-02 | 京セラ株式会社 | Tactile sensation presentation device |
| US9448631B2 (en) | 2013-12-31 | 2016-09-20 | Microsoft Technology Licensing, Llc | Input device haptics and pressure sensing |
| US20150242037A1 (en) | 2014-01-13 | 2015-08-27 | Apple Inc. | Transparent force sensor with strain relief |
| US9471143B2 (en) * | 2014-01-20 | 2016-10-18 | Lenovo (Singapore) Pte. Ltd | Using haptic feedback on a touch device to provide element location indications |
| US9182823B2 (en) | 2014-01-21 | 2015-11-10 | Lenovo (Singapore) Pte. Ltd. | Actuating haptic element of a touch-sensitive device |
| US20150323994A1 (en) * | 2014-05-07 | 2015-11-12 | Immersion Corporation | Dynamic haptic effect modification |
| US9323331B2 (en) * | 2014-05-21 | 2016-04-26 | International Business Machines Corporation | Evaluation of digital content using intentional user feedback obtained through haptic interface |
| US10146318B2 (en) | 2014-06-13 | 2018-12-04 | Thomas Malzbender | Techniques for using gesture recognition to effectuate character selection |
| KR102294193B1 (en) | 2014-07-16 | 2021-08-26 | 삼성전자주식회사 | Apparatus and method for supporting computer aided diagonosis based on probe speed |
| WO2016018305A1 (en) * | 2014-07-30 | 2016-02-04 | Hewlett-Packard Development Company, L.P. | Detector for a display |
| US10297119B1 (en) | 2014-09-02 | 2019-05-21 | Apple Inc. | Feedback device in an electronic device |
| GB2530036A (en) * | 2014-09-09 | 2016-03-16 | Ultrahaptics Ltd | Method and apparatus for modulating haptic feedback |
| US9939901B2 (en) | 2014-09-30 | 2018-04-10 | Apple Inc. | Haptic feedback assembly |
| EP3002666A1 (en) | 2014-10-02 | 2016-04-06 | Huawei Technologies Co., Ltd. | Interaction method for user interfaces |
| US9846484B2 (en) * | 2014-12-04 | 2017-12-19 | Immersion Corporation | Systems and methods for controlling haptic signals |
| JP6771473B2 (en) | 2015-02-20 | 2020-10-21 | ウルトラハプティクス アイピー リミテッドUltrahaptics Ip Ltd | Improved algorithm in the tactile system |
| JP2018507485A (en) | 2015-02-20 | 2018-03-15 | ウルトラハプティクス アイピー リミテッドUltrahaptics Ip Ltd | Perception in the tactile system |
| US9798409B1 (en) | 2015-03-04 | 2017-10-24 | Apple Inc. | Multi-force input device |
| US9645647B2 (en) | 2015-05-13 | 2017-05-09 | Immersion Corporation | Systems and methods for haptic feedback for modular devices |
| CN107850941A (en) | 2015-06-26 | 2018-03-27 | 沙特基础工业全球技术公司 | Electromechanical actuator for the touch feedback in electronic equipment |
| US10818162B2 (en) | 2015-07-16 | 2020-10-27 | Ultrahaptics Ip Ltd | Calibration techniques in haptic systems |
| US10109161B2 (en) * | 2015-08-21 | 2018-10-23 | Immersion Corporation | Haptic driver with attenuation |
| KR20180048629A (en) * | 2015-09-25 | 2018-05-10 | 임머숀 코퍼레이션 | Haptic effect design system |
| US10516348B2 (en) * | 2015-11-05 | 2019-12-24 | Mems Drive Inc. | MEMS actuator package architecture |
| JP2017111462A (en) * | 2015-11-27 | 2017-06-22 | 京セラ株式会社 | Feeling presentation device and feeling presentation method |
| WO2017111928A1 (en) * | 2015-12-22 | 2017-06-29 | Intel Corporation | Reduction of touchscreen bounce |
| US10976819B2 (en) * | 2015-12-28 | 2021-04-13 | Microsoft Technology Licensing, Llc | Haptic feedback for non-touch surface interaction |
| US11189140B2 (en) | 2016-01-05 | 2021-11-30 | Ultrahaptics Ip Ltd | Calibration and detection techniques in haptic systems |
| US10061385B2 (en) | 2016-01-22 | 2018-08-28 | Microsoft Technology Licensing, Llc | Haptic feedback for a touch input device |
| CN109070138B (en) * | 2016-04-19 | 2021-02-09 | 日本电信电话株式会社 | Simulated force sense generating device |
| US10585480B1 (en) | 2016-05-10 | 2020-03-10 | Apple Inc. | Electronic device with an input device having a haptic engine |
| FR3053489A1 (en) * | 2016-06-29 | 2018-01-05 | Dav | CONTROL METHOD AND CONTROL INTERFACE FOR MOTOR VEHICLE |
| FR3053488A1 (en) * | 2016-06-29 | 2018-01-05 | Dav | CONTROL METHOD AND CONTROL INTERFACE FOR MOTOR VEHICLE |
| US10268275B2 (en) | 2016-08-03 | 2019-04-23 | Ultrahaptics Ip Ltd | Three-dimensional perceptions in haptic systems |
| US10671167B2 (en) | 2016-09-01 | 2020-06-02 | Apple Inc. | Electronic device including sensed location based driving of haptic actuators and related methods |
| US10606355B1 (en) * | 2016-09-06 | 2020-03-31 | Apple Inc. | Haptic architecture in a portable electronic device |
| CN106980362A (en) | 2016-10-09 | 2017-07-25 | 阿里巴巴集团控股有限公司 | Input method and device based on virtual reality scenario |
| KR102629409B1 (en) * | 2016-11-11 | 2024-01-26 | 삼성전자주식회사 | Method for providing object information and electronic device thereof |
| US10943578B2 (en) | 2016-12-13 | 2021-03-09 | Ultrahaptics Ip Ltd | Driving techniques for phased-array systems |
| US10936067B1 (en) | 2017-02-13 | 2021-03-02 | Snap, Inc. | Generating a response that depicts haptic characteristics |
| US11494986B2 (en) * | 2017-04-20 | 2022-11-08 | Samsung Electronics Co., Ltd. | System and method for two dimensional application usage in three dimensional virtual reality environment |
| US10732714B2 (en) | 2017-05-08 | 2020-08-04 | Cirrus Logic, Inc. | Integrated haptic system |
| US10712930B2 (en) | 2017-05-28 | 2020-07-14 | International Business Machines Corporation | 3D touch based user interface value pickers |
| US11054932B2 (en) | 2017-09-06 | 2021-07-06 | Apple Inc. | Electronic device having a touch sensor, force sensor, and haptic actuator in an integrated module |
| US11531395B2 (en) | 2017-11-26 | 2022-12-20 | Ultrahaptics Ip Ltd | Haptic effects from focused acoustic fields |
| US10871829B2 (en) | 2017-12-05 | 2020-12-22 | Tactai, Inc. | Touch enabling process, haptic accessory, and core haptic engine to enable creation and delivery of tactile-enabled experiences with virtual objects |
| JP7483610B2 (en) | 2017-12-22 | 2024-05-15 | ウルトラハプティクス アイピー リミテッド | Minimizing unwanted responses in haptic systems |
| US11360546B2 (en) | 2017-12-22 | 2022-06-14 | Ultrahaptics Ip Ltd | Tracking in haptic systems |
| US10832537B2 (en) | 2018-04-04 | 2020-11-10 | Cirrus Logic, Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer |
| KR20250060311A (en) | 2018-05-02 | 2025-05-07 | 울트라햅틱스 아이피 엘티디 | Blocking plate structure for improved acoustic transmission efficiency |
| US10599221B2 (en) | 2018-06-15 | 2020-03-24 | Immersion Corporation | Systems, devices, and methods for providing limited duration haptic effects |
| US11269415B2 (en) | 2018-08-14 | 2022-03-08 | Cirrus Logic, Inc. | Haptic output systems |
| US10936071B2 (en) | 2018-08-30 | 2021-03-02 | Apple Inc. | Wearable electronic device with haptic rotatable input |
| US11733956B2 (en) * | 2018-09-04 | 2023-08-22 | Apple Inc. | Display device sharing and interactivity |
| US10831276B2 (en) | 2018-09-07 | 2020-11-10 | Apple Inc. | Tungsten frame of a haptic feedback module for a portable electronic device |
| US11098951B2 (en) | 2018-09-09 | 2021-08-24 | Ultrahaptics Ip Ltd | Ultrasonic-assisted liquid manipulation |
| US10852830B2 (en) * | 2018-09-11 | 2020-12-01 | Apple Inc. | Power efficient, dynamic management of haptic module mechanical offset |
| US10966007B1 (en) | 2018-09-25 | 2021-03-30 | Apple Inc. | Haptic output system |
| US11378997B2 (en) | 2018-10-12 | 2022-07-05 | Ultrahaptics Ip Ltd | Variable phase and frequency pulse-width modulation technique |
| GB201817495D0 (en) | 2018-10-26 | 2018-12-12 | Cirrus Logic Int Semiconductor Ltd | A force sensing system and method |
| WO2020141330A2 (en) | 2019-01-04 | 2020-07-09 | Ultrahaptics Ip Ltd | Mid-air haptic textures |
| US12373033B2 (en) | 2019-01-04 | 2025-07-29 | Ultrahaptics Ip Ltd | Mid-air haptic textures |
| US11509292B2 (en) | 2019-03-29 | 2022-11-22 | Cirrus Logic, Inc. | Identifying mechanical impedance of an electromagnetic load using least-mean-squares filter |
| US10828672B2 (en) | 2019-03-29 | 2020-11-10 | Cirrus Logic, Inc. | Driver circuitry |
| US11644370B2 (en) * | 2019-03-29 | 2023-05-09 | Cirrus Logic, Inc. | Force sensing with an electromagnetic load |
| US11283337B2 (en) | 2019-03-29 | 2022-03-22 | Cirrus Logic, Inc. | Methods and systems for improving transducer dynamics |
| US12035445B2 (en) | 2019-03-29 | 2024-07-09 | Cirrus Logic Inc. | Resonant tracking of an electromagnetic load |
| US10955955B2 (en) | 2019-03-29 | 2021-03-23 | Cirrus Logic, Inc. | Controller for use in a device comprising force sensors |
| US11842517B2 (en) | 2019-04-12 | 2023-12-12 | Ultrahaptics Ip Ltd | Using iterative 3D-model fitting for domain adaptation of a hand-pose-estimation neural network |
| US10976825B2 (en) | 2019-06-07 | 2021-04-13 | Cirrus Logic, Inc. | Methods and apparatuses for controlling operation of a vibrational output system and/or operation of an input sensor system |
| GB2604215B (en) | 2019-06-21 | 2024-01-31 | Cirrus Logic Int Semiconductor Ltd | A method and apparatus for configuring a plurality of virtual buttons on a device |
| US11921923B2 (en) * | 2019-07-30 | 2024-03-05 | Maxim Integrated Products, Inc. | Oscillation reduction in haptic vibrators by minimization of feedback acceleration |
| US11374586B2 (en) | 2019-10-13 | 2022-06-28 | Ultraleap Limited | Reducing harmonic distortion by dithering |
| CN114631139A (en) | 2019-10-13 | 2022-06-14 | 超飞跃有限公司 | Dynamic capping with virtual microphones |
| US11408787B2 (en) | 2019-10-15 | 2022-08-09 | Cirrus Logic, Inc. | Control methods for a force sensor system |
| US11380175B2 (en) | 2019-10-24 | 2022-07-05 | Cirrus Logic, Inc. | Reproducibility of haptic waveform |
| US11169610B2 (en) | 2019-11-08 | 2021-11-09 | Ultraleap Limited | Tracking techniques in haptic systems |
| US12276687B2 (en) | 2019-12-05 | 2025-04-15 | Cirrus Logic Inc. | Methods and systems for estimating coil impedance of an electromagnetic transducer |
| US11545951B2 (en) | 2019-12-06 | 2023-01-03 | Cirrus Logic, Inc. | Methods and systems for detecting and managing amplifier instability |
| US11715453B2 (en) | 2019-12-25 | 2023-08-01 | Ultraleap Limited | Acoustic transducer structures |
| WO2021141936A1 (en) * | 2020-01-06 | 2021-07-15 | Tactai, Inc. | Haptic waveform generation and rendering at interface device |
| CN115136085B (en) * | 2020-02-20 | 2025-06-27 | 发那科株式会社 | Numerical control device |
| US11662821B2 (en) | 2020-04-16 | 2023-05-30 | Cirrus Logic, Inc. | In-situ monitoring, calibration, and testing of a haptic actuator |
| US12244253B2 (en) | 2020-04-16 | 2025-03-04 | Cirrus Logic Inc. | Restricting undesired movement of a haptic actuator |
| US11024135B1 (en) | 2020-06-17 | 2021-06-01 | Apple Inc. | Portable electronic device having a haptic button assembly |
| US11816267B2 (en) | 2020-06-23 | 2023-11-14 | Ultraleap Limited | Features of airborne ultrasonic fields |
| US11886639B2 (en) | 2020-09-17 | 2024-01-30 | Ultraleap Limited | Ultrahapticons |
| NL2027963B1 (en) | 2021-04-14 | 2022-10-25 | Microsoft Technology Licensing Llc | Touch-sensitive input device |
| US11775084B2 (en) * | 2021-04-20 | 2023-10-03 | Microsoft Technology Licensing, Llc | Stylus haptic component arming and power consumption |
| US11567575B2 (en) * | 2021-06-14 | 2023-01-31 | Microsoft Technology Licensing, Llc | Haptic response control |
| US11933822B2 (en) | 2021-06-16 | 2024-03-19 | Cirrus Logic Inc. | Methods and systems for in-system estimation of actuator parameters |
| US11908310B2 (en) | 2021-06-22 | 2024-02-20 | Cirrus Logic Inc. | Methods and systems for detecting and managing unexpected spectral content in an amplifier system |
| US11765499B2 (en) | 2021-06-22 | 2023-09-19 | Cirrus Logic Inc. | Methods and systems for managing mixed mode electromechanical actuator drive |
| US11552649B1 (en) | 2021-12-03 | 2023-01-10 | Cirrus Logic, Inc. | Analog-to-digital converter-embedded fixed-phase variable gain amplifier stages for dual monitoring paths |
| US20250181167A1 (en) * | 2022-03-07 | 2025-06-05 | Junji Sone | Skin stimulation device and method for driving skin stimulation device |
| WO2025147540A1 (en) * | 2024-01-03 | 2025-07-10 | Apple Inc. | Three-dimensional user interfaces |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20060038793A1 (en) * | 2003-10-08 | 2006-02-23 | Harald Philipp | Touch Sensitive Control Panel |
| US20060197753A1 (en) * | 2005-03-04 | 2006-09-07 | Hotelling Steven P | Multi-functional hand-held device |
| US20090066660A1 (en) * | 2007-09-06 | 2009-03-12 | Ure Michael J | Interface with and communication between mobile electronic devices |
| US20090271797A1 (en) * | 2008-04-24 | 2009-10-29 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and medium storing information processing program stored thereon |
| US20110075835A1 (en) * | 2009-09-30 | 2011-03-31 | Apple Inc. | Self adapting haptic device |
Family Cites Families (39)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5825308A (en) | 1996-11-26 | 1998-10-20 | Immersion Human Interface Corporation | Force feedback interface having isotonic and isometric functionality |
| US6411276B1 (en) | 1996-11-13 | 2002-06-25 | Immersion Corporation | Hybrid control of haptic feedback for host computer and interface device |
| DE20080209U1 (en) * | 1999-09-28 | 2001-08-09 | Immersion Corp | Control of haptic sensations for interface devices with vibrotactile feedback |
| US7730401B2 (en) | 2001-05-16 | 2010-06-01 | Synaptics Incorporated | Touch screen with user interface enhancement |
| US7295015B2 (en) * | 2004-02-19 | 2007-11-13 | Brooks Automation, Inc. | Ionization gauge |
| US7956846B2 (en) | 2006-01-05 | 2011-06-07 | Apple Inc. | Portable electronic device with content-dependent touch sensitivity |
| US8681098B2 (en) | 2008-04-24 | 2014-03-25 | Oblong Industries, Inc. | Detecting, representing, and interpreting three-space input: gestural continuum subsuming freespace, proximal, and surface-contact modes |
| US8612024B2 (en) | 2006-02-24 | 2013-12-17 | Medtronic, Inc. | User interface with 3D environment for configuring stimulation therapy |
| US7890863B2 (en) | 2006-10-04 | 2011-02-15 | Immersion Corporation | Haptic effects with proximity sensing |
| US8103109B2 (en) | 2007-06-19 | 2012-01-24 | Microsoft Corporation | Recognizing hand poses and/or object classes |
| JP2010533336A (en) | 2007-07-11 | 2010-10-21 | ユイ・ジン・オ | Data input device using finger motion sensing and input conversion method using the same |
| KR101424259B1 (en) | 2007-08-22 | 2014-07-31 | 삼성전자주식회사 | Method and apparatus for providing input feedback in portable terminal |
| US8917247B2 (en) | 2007-11-20 | 2014-12-23 | Samsung Electronics Co., Ltd. | External device identification method and apparatus in a device including a touch spot, and computer-readable recording mediums having recorded thereon programs for executing the external device identification method in a device including a touch spot |
| KR20090066368A (en) | 2007-12-20 | 2009-06-24 | 삼성전자주식회사 | A mobile terminal having a touch screen and a method of controlling the function thereof |
| US9285459B2 (en) | 2008-05-09 | 2016-03-15 | Analog Devices, Inc. | Method of locating an object in 3D |
| US20090279107A1 (en) | 2008-05-09 | 2009-11-12 | Analog Devices, Inc. | Optical distance measurement by triangulation of an active transponder |
| US8099332B2 (en) | 2008-06-06 | 2012-01-17 | Apple Inc. | User interface for application management for a mobile device |
| US20090309825A1 (en) | 2008-06-13 | 2009-12-17 | Sony Ericsson Mobile Communications Ab | User interface, method, and computer program for controlling apparatus, and apparatus |
| US8174372B2 (en) | 2008-06-26 | 2012-05-08 | Immersion Corporation | Providing haptic feedback on a touch surface |
| KR101014263B1 (en) | 2008-09-04 | 2011-02-16 | 삼성전기주식회사 | Tactile sensor |
| KR20100036850A (en) | 2008-09-30 | 2010-04-08 | 삼성전기주식회사 | Touch panel apparatus using tactile sensor |
| KR101021440B1 (en) | 2008-11-14 | 2011-03-15 | 한국표준과학연구원 | Touch input device, mobile device using same and control method thereof |
| US20100134409A1 (en) | 2008-11-30 | 2010-06-03 | Lenovo (Singapore) Pte. Ltd. | Three-dimensional user interface |
| US9746544B2 (en) | 2008-12-03 | 2017-08-29 | Analog Devices, Inc. | Position measurement systems using position sensitive detectors |
| US8823518B2 (en) | 2008-12-08 | 2014-09-02 | Motorola Solutions, Inc. | Method of sensor cluster processing for a communication device |
| US7843277B2 (en) * | 2008-12-16 | 2010-11-30 | Immersion Corporation | Haptic feedback generation based on resonant frequency |
| US8686952B2 (en) | 2008-12-23 | 2014-04-01 | Apple Inc. | Multi touch with multi haptics |
| US8291348B2 (en) | 2008-12-31 | 2012-10-16 | Hewlett-Packard Development Company, L.P. | Computing device and method for selecting display regions responsive to non-discrete directional input actions and intelligent content analysis |
| US8760413B2 (en) | 2009-01-08 | 2014-06-24 | Synaptics Incorporated | Tactile surface |
| JP5343871B2 (en) | 2009-03-12 | 2013-11-13 | Ricoh Co., Ltd. | Touch panel device, display device with touch panel including the same, and control method for touch panel device |
| EP2434945B1 (en) | 2009-05-27 | 2018-12-19 | Analog Devices, Inc. | Multiuse optical sensor |
| US8279197B2 (en) | 2009-08-25 | 2012-10-02 | Pixart Imaging Inc. | Method and apparatus for detecting defective traces in a mutual capacitance touch sensing device |
| KR20110031797A (en) * | 2009-09-21 | 2011-03-29 | Samsung Electronics Co., Ltd. | Input device and method of mobile terminal |
| KR101120894B1 (en) | 2009-10-20 | 2012-02-27 | Samsung Electro-Mechanics Co., Ltd. | Haptic feedback device and electronic device |
| US9104275B2 (en) | 2009-10-20 | 2015-08-11 | Lg Electronics Inc. | Mobile terminal to display an object on a perceived 3D space |
| KR101802520B1 (en) | 2010-03-16 | 2017-11-28 | Immersion Corporation | Systems and methods for pre-touch and true touch |
| US9043732B2 (en) | 2010-10-21 | 2015-05-26 | Nokia Corporation | Apparatus and method for user input for controlling displayed information |
| US9019230B2 (en) | 2010-10-31 | 2015-04-28 | Pixart Imaging Inc. | Capacitive touchscreen system with reduced power consumption using modal focused scanning |
| US9164586B2 (en) * | 2012-11-21 | 2015-10-20 | Novasentis, Inc. | Haptic system with localized response |
Worldwide Applications (2012)
- 2012-03-28 US US13/433,105 patent/US8937603B2/en active Active
- 2012-03-28 WO PCT/US2012/030994 patent/WO2012135373A2/en not_active Ceased
- 2012-03-28 WO PCT/US2012/031003 patent/WO2012135378A1/en not_active Ceased
- 2012-03-28 US US13/433,069 patent/US20120249461A1/en not_active Abandoned
- 2012-03-29 US US13/434,677 patent/US20120249475A1/en not_active Abandoned
- 2012-03-29 WO PCT/US2012/031272 patent/WO2012135532A1/en not_active Ceased
- 2012-03-29 WO PCT/US2012/031279 patent/WO2012135534A1/en not_active Ceased
- 2012-03-29 US US13/434,623 patent/US20120249474A1/en not_active Abandoned
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20060038793A1 (en) * | 2003-10-08 | 2006-02-23 | Harald Philipp | Touch Sensitive Control Panel |
| US20060197753A1 (en) * | 2005-03-04 | 2006-09-07 | Hotelling Steven P | Multi-functional hand-held device |
| US20090066660A1 (en) * | 2007-09-06 | 2009-03-12 | Ure Michael J | Interface with and communication between mobile electronic devices |
| US20090271797A1 (en) * | 2008-04-24 | 2009-10-29 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and medium storing information processing program stored thereon |
| US20110075835A1 (en) * | 2009-09-30 | 2011-03-31 | Apple Inc. | Self adapting haptic device |
Cited By (54)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8717327B2 (en) | 2011-07-08 | 2014-05-06 | Nokia Corporation | Controlling responsiveness to user inputs on a touch-sensitive display |
| US20130009915A1 (en) * | 2011-07-08 | 2013-01-10 | Nokia Corporation | Controlling responsiveness to user inputs on a touch-sensitive display |
| US20130117267A1 (en) * | 2011-11-03 | 2013-05-09 | Kirill Buryak | Customer support solution recommendation system |
| US10445351B2 (en) | 2011-11-03 | 2019-10-15 | Google Llc | Customer support solution recommendation system |
| US8819013B2 (en) * | 2011-11-03 | 2014-08-26 | Google Inc. | Customer support solution recommendation system |
| US9779159B2 (en) | 2011-11-03 | 2017-10-03 | Google Inc. | Customer support solution recommendation system |
| US20130304240A1 (en) * | 2012-05-09 | 2013-11-14 | Bristol, Inc. d/b/a Remote Automated Solutions | Methods and apparatus to display information via a process control device |
| US9563187B2 (en) * | 2012-05-09 | 2017-02-07 | Bristol, Inc. | Methods and apparatus to display information via a process control device |
| US20130318437A1 (en) * | 2012-05-22 | 2013-11-28 | Samsung Electronics Co., Ltd. | Method for providing ui and portable apparatus applying the same |
| US9746924B2 (en) * | 2012-09-11 | 2017-08-29 | Nec Corporation | Electronic device, method for controlling electronic device, and recording medium |
| US20150346821A1 (en) * | 2012-09-11 | 2015-12-03 | Nec Casio Mobile Communications, Ltd. | Electronic device, method for controlling electronic device, and recording medium |
| US20140092003A1 (en) * | 2012-09-28 | 2014-04-03 | Min Liu | Direct haptic feedback |
| US20140191984A1 (en) * | 2013-01-04 | 2014-07-10 | Samsung Electronics Co., Ltd. | Display system with concurrent multi-mode control mechanism and method of operation thereof |
| US10175874B2 (en) * | 2013-01-04 | 2019-01-08 | Samsung Electronics Co., Ltd. | Display system with concurrent multi-mode control mechanism and method of operation thereof |
| EP3019943A4 (en) * | 2013-07-12 | 2017-05-31 | Tactual Labs Co. | Reducing control response latency with defined cross-control behavior |
| US9817489B2 (en) | 2014-01-27 | 2017-11-14 | Apple Inc. | Texture capture stylus and method |
| US10089840B2 (en) | 2014-09-02 | 2018-10-02 | Apple Inc. | Semantic framework for variable haptic output |
| US12300095B2 (en) | 2014-09-02 | 2025-05-13 | Apple Inc. | Semantic framework for variable haptic output |
| US10504340B2 (en) | 2014-09-02 | 2019-12-10 | Apple Inc. | Semantic framework for variable haptic output |
| US9830784B2 (en) | 2014-09-02 | 2017-11-28 | Apple Inc. | Semantic framework for variable haptic output |
| US10417879B2 (en) | 2014-09-02 | 2019-09-17 | Apple Inc. | Semantic framework for variable haptic output |
| US11790739B2 (en) | 2014-09-02 | 2023-10-17 | Apple Inc. | Semantic framework for variable haptic output |
| US9928699B2 (en) | 2014-09-02 | 2018-03-27 | Apple Inc. | Semantic framework for variable haptic output |
| US10977911B2 (en) | 2014-09-02 | 2021-04-13 | Apple Inc. | Semantic framework for variable haptic output |
| US9400570B2 (en) | 2014-11-14 | 2016-07-26 | Apple Inc. | Stylus with inertial sensor |
| US9575573B2 (en) | 2014-12-18 | 2017-02-21 | Apple Inc. | Stylus with touch sensor |
| US10276000B2 (en) | 2016-06-12 | 2019-04-30 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US10156903B2 (en) | 2016-06-12 | 2018-12-18 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US11735014B2 (en) | 2016-06-12 | 2023-08-22 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US10175759B2 (en) | 2016-06-12 | 2019-01-08 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US9984539B2 (en) | 2016-06-12 | 2018-05-29 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US9996157B2 (en) | 2016-06-12 | 2018-06-12 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US11468749B2 (en) | 2016-06-12 | 2022-10-11 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US12190714B2 (en) | 2016-06-12 | 2025-01-07 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US12353631B2 (en) | 2016-06-12 | 2025-07-08 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US10139909B2 (en) | 2016-06-12 | 2018-11-27 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US11379041B2 (en) | 2016-06-12 | 2022-07-05 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US11037413B2 (en) | 2016-06-12 | 2021-06-15 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US10692333B2 (en) | 2016-06-12 | 2020-06-23 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
| US9678571B1 (en) | 2016-09-06 | 2017-06-13 | Apple Inc. | Devices, methods, and graphical user interfaces for generating tactile outputs |
| EP4047455A1 (en) * | 2016-09-06 | 2022-08-24 | Apple Inc. | Devices, methods, and graphical user interfaces for generating tactile outputs |
| US10901513B2 (en) | 2016-09-06 | 2021-01-26 | Apple Inc. | Devices, methods, and graphical user interfaces for haptic mixing |
| US10620708B2 (en) | 2016-09-06 | 2020-04-14 | Apple Inc. | Devices, methods, and graphical user interfaces for generating tactile outputs |
| US11221679B2 (en) | 2016-09-06 | 2022-01-11 | Apple Inc. | Devices, methods, and graphical user interfaces for generating tactile outputs |
| US9690383B1 (en) | 2016-09-06 | 2017-06-27 | Apple Inc. | Devices, methods, and graphical user interfaces for generating tactile outputs |
| US10528139B2 (en) | 2016-09-06 | 2020-01-07 | Apple Inc. | Devices, methods, and graphical user interfaces for haptic mixing |
| US10901514B2 (en) | 2016-09-06 | 2021-01-26 | Apple Inc. | Devices, methods, and graphical user interfaces for generating tactile outputs |
| US10372221B2 (en) | 2016-09-06 | 2019-08-06 | Apple Inc. | Devices, methods, and graphical user interfaces for generating tactile outputs |
| US11662824B2 (en) | 2016-09-06 | 2023-05-30 | Apple Inc. | Devices, methods, and graphical user interfaces for generating tactile outputs |
| US10175762B2 (en) | 2016-09-06 | 2019-01-08 | Apple Inc. | Devices, methods, and graphical user interfaces for generating tactile outputs |
| DK201670725A1 (en) * | 2016-09-06 | 2018-03-19 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs |
| US9864432B1 (en) | 2016-09-06 | 2018-01-09 | Apple Inc. | Devices, methods, and graphical user interfaces for haptic mixing |
| US9753541B1 (en) | 2016-09-06 | 2017-09-05 | Apple Inc. | Devices, methods, and graphical user interfaces for generating tactile outputs |
| US11314330B2 (en) | 2017-05-16 | 2022-04-26 | Apple Inc. | Tactile feedback for locked device user interfaces |
Also Published As
| Publication number | Publication date |
|---|---|
| US8937603B2 (en) | 2015-01-20 |
| WO2012135373A2 (en) | 2012-10-04 |
| WO2012135373A3 (en) | 2014-05-01 |
| US20120249462A1 (en) | 2012-10-04 |
| WO2012135378A1 (en) | 2012-10-04 |
| US20120249475A1 (en) | 2012-10-04 |
| WO2012135534A1 (en) | 2012-10-04 |
| US20120249474A1 (en) | 2012-10-04 |
| WO2012135532A1 (en) | 2012-10-04 |
Similar Documents
| Publication | Title |
|---|---|
| US20120249461A1 (en) | Dedicated user interface controller for feedback responses |
| US11429244B2 (en) | Method and apparatus for displaying application |
| US10296136B2 (en) | Touch-sensitive button with two levels |
| US9721365B2 (en) | Low latency modification of display frames |
| US20130265243A1 (en) | Adaptive power adjustment for a touchscreen |
| US9798512B1 (en) | Context-based volume adjustment |
| US20160299632A1 (en) | Method and device for implementing a touch interface |
| US20200004416A1 (en) | Virtual Keyboard Animation |
| US20120007826A1 (en) | Touch-controlled electric apparatus and control method thereof |
| CN107861653A (en) | Display control apparatus, display control program and display control method |
| KR101438231B1 (en) | Terminal device having a hybrid touch screen and control method thereof |
| KR101682527B1 (en) | Touch keypad combined mouse using thin type haptic module |
| US20140152601A1 (en) | Touch display device and control method thereof |
| US12079393B2 (en) | Tactile feedback |
| KR20190125322A (en) | Multi-speed processor for haptic feedback |
| US20150138102A1 (en) | Inputting mode switching method and system utilizing the same |
| KR20150044348A (en) | Method for recognizing touch data and apparatus thereof |
| CN103869937A (en) | Touch feedback method and electronic device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: ANALOG DEVICES, INC., MASSACHUSETTS; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FLANAGAN, ADRIAN;FEEN, KENNETH M.;MURPHY, MARK J.;AND OTHERS;SIGNING DATES FROM 20120411 TO 20120423;REEL/FRAME:028361/0936 |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |