US20220261090A1 - System and method for multi-mode command input - Google Patents

System and method for multi-mode command input

Info

Publication number
US20220261090A1
Authority
US
United States
Prior art keywords
controlling device
touch sensitive
sensitive surface
switches
appliance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/737,664
Inventor
Arsham Hatambeiki
Jeffrey Kohanek
Pamela Eichler Keiles
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Universal Electronics Inc
Original Assignee
Universal Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Universal Electronics Inc filed Critical Universal Electronics Inc
Priority to US17/737,664
Assigned to UNIVERSAL ELECTRONICS INC. Assignment of assignors interest (see document for details). Assignors: KEILES, PAMELA EICHLER; HATAMBEIKI, ARSHAM; KOHANEK, JEFFREY
Publication of US20220261090A1
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/0202Constructional details or processes of manufacture of the input device
    • G06F3/0219Special purpose keyboards
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0414Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01HELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H25/00Switches with compound movement of handle or other operating part
    • H01H25/04Operating part movable angularly in more than one plane, e.g. joystick
    • H01H25/041Operating part movable angularly in more than one plane, e.g. joystick having a generally flat operating member depressible at different locations to operate different controls
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01HELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H9/00Details of switching devices, not covered by groups H01H1/00 - H01H7/00
    • H01H9/02Bases, casings, or covers
    • H01H9/0214Hand-held casings
    • H01H9/0235Hand-held casings specially adapted for remote control, e.g. of audio or video apparatus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04106Multi-sensing digitiser, i.e. digitiser using at least two different sensing technologies simultaneously or alternatively, e.g. for detecting pen and finger, for saving power or for improving position detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G06F3/0447Position sensing using the local deformation of sensor cells

Abstract

A controlling device has a moveable touch sensitive panel positioned above a plurality of switches. When the controlling device senses an activation of at least one of the plurality of switches caused by a movement of the touch sensitive panel resulting from an input at an input location upon the touch sensitive surface, the controlling device responds by transmitting a signal to an appliance, wherein the signal is reflective of the input location upon the touch sensitive surface.

Description

    RELATED APPLICATION INFORMATION
  • This application claims the benefit of and is a continuation of U.S. application Ser. No. 16/279,095, filed on Feb. 19, 2019, which application claims the benefit of and is a continuation of U.S. application Ser. No. 15/902,007, filed on Feb. 22, 2018, which application claims the benefit of and is a continuation of U.S. application Ser. No. 12/645,037, filed on Dec. 22, 2009, the disclosures of which are incorporated herein by reference in their entirety.
  • BACKGROUND
  • Controlling devices for use in issuing commands to entertainment and other appliances, for example remote controls, and the features and functionality provided by such controlling devices are well known in the art. Traditionally, user input means on such controlling devices has comprised a series of buttons each of which may result in the transmission of a specific command when activated. Increasingly in today's environment, such controlling devices must be used to interact with displayed menu systems, browse web pages, manipulate pointers, and perform other similar activities which may require directional control input, e.g., to scroll displayed information on a screen, to move a pointer, to control a game activity or avatar, to zoom in or out, to control functions such as fast forward or slow motion, or the like (such activities collectively referred to hereinafter as “navigation”). Although certain navigation functions may be performed using conventional controlling device input mechanisms, such as a group of up, down, left, and right arrow keys, in many instances the user experience may be improved by the provision of an input mechanism which is better suited to this type of activity. Additionally, multi-functional use of this input mechanism may further improve user experience by reducing the number of keys or buttons on a controlling device.
  • SUMMARY
  • The following generally describes a system and method for providing improved user input functionality on a controlling device. To this end, in addition to a conventional key matrix for receiving button inputs as is well known in the art, a controlling device may be provided with input means such as for example a resistive or capacitive touch sensor, etc., whereby motion and/or pressure by a user's finger may be translated into navigation commands to be transmitted to a target controlled device. These commands may be applied at the target device to control operations such as scrolling a menu, movement of a cursor on the screen, motion of a game object, etc., as appropriate for a particular application. Furthermore, in addition to, or when not required for, the performance of navigation functions, the touch sensitive input means may be adapted to provide for conventional keypress input operations, such as for example without limitation a numeric keypad in an illustrative embodiment.
  • A better understanding of the objects, advantages, features, properties and relationships of the invention will be obtained from the following detailed description and accompanying drawings which set forth illustrative embodiments and which are indicative of the various ways in which the principles of the invention may be employed.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • For a better understanding of the various aspects of the invention, reference may be had to preferred embodiments shown in the attached drawings in which:
  • FIG. 1 illustrates an exemplary system in which an exemplary controlling device according to the instant invention may be used;
  • FIG. 2 illustrates a block diagram of exemplary components of the exemplary controlling device of FIG. 1;
  • FIG. 3 illustrates the structure and operation of an exemplary touch sensitive input area of the exemplary controlling device of FIG. 1;
  • FIG. 4 illustrates multiple modes of operation of the exemplary controlling device of FIG. 1;
  • FIG. 5 illustrates exemplary interpretations of user input interactions with a touch sensitive area of the exemplary controlling device of FIG. 4;
  • FIG. 6 illustrates in flow chart form an exemplary method for performing the interpretations illustrated in FIG. 5; and
  • FIG. 7 illustrates an alternate embodiment of a controlling device and system in which the teachings of the instant invention may be used.
  • DETAILED DESCRIPTION
  • Turning now to FIG. 1, there is illustrated an exemplary system in which a controlling device 100 is configured to control various controllable appliances, such as for example a television 102 and a set top box (“STB”) 104. As is known in the art, the controlling device 100 may be capable of transmitting commands to the appliances, using any convenient IR, RF, Point-to-Point, or networked protocol, to cause the appliances to perform operational functions. While illustrated in the context of a television 102 and STB 104, it is to be understood that controllable appliances may include, but need not be limited to, televisions, VCRs, DVRs, DVD players, cable or satellite converter set-top boxes (“STBs”), amplifiers, CD players, game consoles, home lighting, drapery, fans, HVAC systems, thermostats, personal computers, etc. In a particular illustrative embodiment, in addition to conventional control functionality as is well known in the art, controlling device 100 may further include an input area 106 for generation of navigation commands for transmission from the controlling device 100 to one or more appliances in response to user interaction with that area, used for example to scroll a program guide menu display 108 on TV 102 by issuing a series of commands to set top box 104. Additionally, in the exemplary embodiment, input area 106 may be further adapted to offer keypad-like functionality during certain modes of operation, all as will be described in further detail hereafter.
  • With reference to FIG. 2, for use in commanding the functional operations of one or more appliances, the controlling device 100 may include, as needed for a particular application, a processor 200 coupled to a ROM memory 204; a RAM memory 202; a key matrix 216 (e.g., hard keys, soft keys such as a touch sensitive surface overlaid on a liquid crystal (LCD), and/or an electroluminescent (EL) display); a scrolling and/or navigation function input means 218 such as a capacitive or resistive touch sensor; transmission circuit(s) and/or transceiver circuit(s) 210 (e.g., IR and/or RF); a non-volatile read/write memory 206; a means 220 to provide visual feedback to the user (e.g., one or more LEDs, display, and/or the like); a means 222 to provide audible feedback to a user (e.g., a speaker, piezoelectric buzzer, etc.); a power source 208; an input/output port 224 such as a serial interface, USB port, modem, Zigbee, WiFi , or Bluetooth transceiver, etc.; one or more means 226 for backlighting areas of touchpad 218 and/or key matrix 216; and clock and timer logic 212 with associated crystal or resonator 214.
  • As will be understood by those skilled in the art, some or all of the memories 202, 204, 206 may include executable instructions (collectively, the program memory) that are intended to be executed by the processor 200 to control the operation of the remote control 100, as well as data which serves to define to the operational software the necessary control protocols and command values for use in transmitting command signals to controllable appliances (collectively, the command data). In this manner, the processor 200 may be programmed to control the various electronic components within the remote control 100, e.g., to monitor the key matrix 216, to cause the transmission of signals, etc. The non-volatile read/write memory 206, for example an EEPROM, battery-backed up RAM, FLASH, Smart Card, memory stick, or the like, may additionally be provided to store setup data and parameters as necessary. While the memory 204 is illustrated and described as a ROM memory, memory 204 can also be comprised of any type of readable media, such as ROM, FLASH, EEPROM, or the like. Preferably, the memories 204 and 206 are non-volatile or battery-backed such that data is not required to be reloaded after battery changes. In addition, the memories 202, 204 and 206 may take the form of a chip, a hard disk, a magnetic disk, an optical disk, and/or the like. Still further, it will be appreciated that some or all of the illustrated memory devices may be physically combined (for example, a single FLASH memory may be logically partitioned into different portions to support the functionality of memories 204 and 206 respectively), and/or may be physically incorporated within the same IC chip as the microprocessor 200 (a so called “microcontroller”) and, as such, they are shown separately in FIG. 2 only for the sake of clarity.
  • To cause the controlling device 100 to perform an action, the controlling device 100 may be adapted to be responsive to events, such as a sensed user interaction with the key matrix 216, touchpad 218, etc. In response to an event, appropriate instructions within the program memory (hereafter the “operating program”) may be executed. For example, when a function key is actuated on the controlling device 100, the controlling device 100 may retrieve from the command data stored in memory 202, 204, 206 a command value and control protocol corresponding to the actuated function key and, where necessary, current device mode, and will use the retrieved command data to transmit to an intended target appliance, e.g., STB 104, a command in a format recognizable by that appliance to thereby control one or more functional operations of that appliance. It will be appreciated that the operating program can be used not only to cause the transmission of commands and/or data to the appliances, but also to perform local operations. While not limiting, local operations that may be performed by the controlling device 100 may include displaying information/data, favorite channel setup, macro key setup, function key relocation, etc. Examples of local operations can be found in U.S. Pat. Nos. 5,481,256, 5,959,751, and 6,014,092.
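  • By way of a non-limiting illustration only, the retrieve-and-transmit flow just described might be organized along the lines of the following C sketch. The type and function names used here (key_event_t, lookup_command, transmit_command) are hypothetical placeholders and do not represent the actual operating program of controlling device 100.

```c
#include <stddef.h>

/* Hypothetical sketch of the key-event handling described above: a key
 * actuation is resolved, via the stored command data, into a command value
 * and protocol, which is then transmitted to the intended target appliance. */
typedef struct {
    int key_code;            /* which key in key matrix 216 was actuated      */
} key_event_t;

typedef struct {
    unsigned int value;      /* command value from the command data           */
    int protocol;            /* control protocol (e.g., an IR or RF variant)  */
} command_entry_t;

/* Look up command data for a key, taking the current device mode into account. */
extern const command_entry_t *lookup_command(int key_code, int device_mode);
/* Emit the command via transmission circuit(s) 210 in a format the appliance recognizes. */
extern void transmit_command(const command_entry_t *cmd);

void on_key_event(const key_event_t *evt, int device_mode)
{
    const command_entry_t *cmd = lookup_command(evt->key_code, device_mode);
    if (cmd != NULL)
        transmit_command(cmd);
}
```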
  • In some embodiments, controlling device 100 may be of the universal type, that is, provisioned with a library comprising a multiplicity of command codes and protocols suitable for controlling various appliances. In such cases, for selecting sets of command data to be associated with the specific appliances to be controlled (hereafter referred to as a setup procedure), data may be entered into the controlling device 100 that serves to identify each intended target appliance by its make, and/or model, and/or type. The data may typically be entered via activation of those keys that are also used to cause the transmission of commands to an appliance, preferably the keys that are labeled with numerals. Such data allows the controlling device 100 to identify the appropriate command data set within the library of command data that is to be used to transmit recognizable commands in formats appropriate for such identified appliances. The library of command data may represent a plurality of controllable appliances of different types and manufacture, a plurality of controllable appliances of the same type but different manufacture, a plurality of appliances of the same manufacture but different type or model, etc., or any combination thereof as appropriate for a given embodiment. In conventional practice as is well known in the art, such data used to identify an appropriate command data set may take the form of a numeric setup code (obtained, for example, from a printed list of manufacturer names and/or models with corresponding code numbers, from a support Web site, etc.). Alternative setup procedures known in the art include scanning bar codes, sequentially transmitting a predetermined command in different formats until a target appliance response is detected, interaction with a Web site culminating in downloading of command data and/or setup codes to the controlling device, etc. Since such methods for setting up a controlling device to command the operation of specific home appliances are well known, these will not be described in greater detail herein. Nevertheless, for additional information pertaining to setup procedures, the reader may turn, for example, to U.S. Pat. Nos. 4,959,810, 5,614,906, or 6,225,938, all of like assignee and incorporated herein by reference in their entirety.
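  • Purely for illustration, the selection of a command data set by numeric setup code can be pictured as a simple lookup into the stored library, as in the C sketch below. The structure and function names are assumptions and do not describe the actual library format.

```c
#include <stddef.h>

/* Hypothetical sketch: selecting a command data set from the library using a
 * user-entered numeric setup code (e.g., obtained from a printed code list). */
typedef struct {
    unsigned int setup_code;        /* identifies a make/model/type grouping     */
    const void  *command_data_set;  /* command values and protocol for that set  */
} library_entry_t;

const void *select_command_data(const library_entry_t *library, size_t count,
                                unsigned int entered_code)
{
    for (size_t i = 0; i < count; i++) {
        if (library[i].setup_code == entered_code)
            return library[i].command_data_set;
    }
    return NULL;   /* not found: setup may fall back to another method */
}
```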
  • In keeping with the teachings of this invention, controlling device 100 may include input means for accepting user touch input to be translated into navigation commands. In an exemplary embodiment, input means 218 may take the form of a multiple-electrode capacitive touch sensor. In this form, input means 218 may accept finger sliding gestures on either axis for translation into navigation step commands in an X or Y direction, as well as finger pressure at, for example, the cardinal points and center area for translation into discrete commands, for example equivalent to a conventional keypad's four arrow keys and a select key, all as will be described in further detail hereafter.
  • Turning to FIG. 3, the construction of an exemplary navigation input means 218, which may comprise area 106 of exemplary controlling device 100, will now be discussed in detail. Such an input means may comprise the before-mentioned multiple-electrode capacitive touch sensor 302 and an associated acrylic keycap 304, positioned upon a group of conventional silicon rubber keypad buttons 310, 311, 312, 313 (hereafter a “floating touch sensor”). Silicon rubber keypad 306 and buttons 310 through 313, which may comprise a portion of key matrix 216 as well known in the art, may be supported by printed circuit board 308 and may serve to hold touch input assembly 302,304 elevated and flush with an associated opening formed in the upper casing 316 of controlling device 100. In an exemplary embodiment the surface of acrylic keycap 304 covering touch sensor 302 may include indicia which provide cues to the functionality of input means 106, which indicia may be embossed or engraved 320 or printed 318 upon the keycap surface. In certain embodiments additional indicia may also be present on acrylic keycap 304, which additional indicia may be illuminated or otherwise brought into prominence during certain modes of operation, as will be described in further detail hereafter.
  • In a first input mode, a user may slide a finger across the surface of the touch surface, e.g., keycap 304, to cause navigation command output, for example as described in co-pending U.S. patent application Ser. No. 12/552,761, of like assignee and incorporated herein by reference in its entirety. Such navigation step commands resulting from finger sliding gestures may be reported to a target appliance using any convenient transmission protocol, IR or RF, as known in the art. In general, such reports may include information representative of both direction and speed of the input gesture. Since exemplary gesture interpretation and reporting techniques are presented in the above referenced '761 application, for the sake of brevity these will not be repeated herein.
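  • For illustration only, a navigation step report conveying both direction and speed might take a form similar to the C sketch below; the actual report format of the referenced '761 application is not reproduced here, and all names are assumptions.

```c
/* Hypothetical navigation step report carrying direction and speed. */
typedef enum { NAV_UP, NAV_DOWN, NAV_LEFT, NAV_RIGHT } nav_direction_t;

typedef struct {
    nav_direction_t direction;   /* axis and sign of the finger slide gesture  */
    unsigned char   speed;       /* magnitude derived from the gesture speed   */
} nav_report_t;

/* Transmit the report over IR or RF, whichever protocol is convenient. */
extern void transmit_report(const nav_report_t *report);

void report_slide(nav_direction_t direction, unsigned char speed)
{
    nav_report_t report = { direction, speed };
    transmit_report(&report);
}
```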
  • In a second input mode, which may be used in conjunction with or separately from finger slide input, a user may press downwards 322 anywhere upon the touch surface, e.g., acrylic keycap 304. As illustrated, this will result in compression of one or more of the underlying silicon rubber buttons 310 through 313, for example button 310′ as shown in FIG. 3. As in a conventional keypad, compression of such a button may cause a conductive contact area on the underside of said button to complete an electrical circuit provided for that purpose on printed circuit board 308, i.e., cause a key press event to be detected by the operating program of controlling device 100. In this instance however, the actuation of any one or more of silicon rubber buttons 310 through 313 may be interpreted by the operating program of controlling device 100 simply as a general signal that the touch pad input area 106 has received a finger press. The actual significance of the event and the command to be issued may then be determined by the operating program of controlling device 100 based on the position of the user's finger as reported by touch sensor 302 at the time the electrical circuit was completed.
  • By way of further example, if conventional keypress decoding based only on the status of silicon rubber buttons 310 through 313 were to be employed in this example and user finger pressure was applied at location 324, it will be appreciated that the circuits associated with either or both of buttons 310 and 313 may be completed individually or collectively in either order and within a short time of one another, which may lead to uncertainty as to the exact location of the actuating finger. Likewise, considering for a moment an alternate embodiment in which the silicon buttons are dispensed with and the touch input pad fixedly mounted in the controlling device casing, the decoding function of the controlling device operating program may in this instance be required to distinguish between a finger tap action and the commencement or termination of a finger slide action. Accordingly, it will be appreciated that in the exemplary embodiment presented, advantageously finger press detection and finger position detection are performed separately in the manner described above, which may result in a more robust and reliable overall detection mechanism. Further, the provision of keypad elements as part of such a floating touch sensor may also result in improved user tactile feedback.
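  • The separation of press detection (via buttons 310 through 313) from position detection (via touch sensor 302) described above may be sketched in C as follows; the function names are hypothetical and shown only to illustrate the decode sequence.

```c
/* Sketch of the "floating touch sensor" decode: the underlying rubber buttons
 * only signal THAT a press occurred; WHERE it occurred comes from the
 * capacitive touch sensor at the moment the key press event is detected. */
extern int  touch_sensor_read_x(void);   /* e.g., 0..15, origin at bottom left */
extern int  touch_sensor_read_y(void);
extern void issue_command_for_location(int x, int y);

void on_floating_pad_press(void)          /* any one or more of buttons 310-313 */
{
    /* Ignore which particular button closed its circuit; the finger position
     * reported by the touch sensor determines the command to be issued.       */
    int x = touch_sensor_read_x();
    int y = touch_sensor_read_y();
    issue_command_for_location(x, y);
}
```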
  • Certain embodiments of controlling device 100 may support multiple modes of operation of touch input area 106. By way of example, with reference to FIG. 4, in an exemplary embodiment the operation of touch input area 106 of controlling device 100 may be user switchable between navigation mode and digit entry mode, for example via a “numeric” toggle button 402, labeled “1-2-3” in the illustrative example. When in the navigation mode, user finger swipes and presses on touch input area 106 may be interpreted by the operating program of controlling device 100 as requests to issue navigation commands as described previously. However, when toggled into digit entry mode by activation of button 402, interpretation of touch area input by the operating program of controlling device 100 may change to represent a twelve-key numeric input pad, with only finger press input recognized. In some embodiments the appearance of touch input pad 106, in particular that of acrylic keycap 304, may be altered to signal this mode of operation to a user, as illustrated at 404. Such a change in appearance may be effected, for example, by illumination via backlight of digit indicia laser etched into the surface of acrylic keycap 304. Illumination may be achieved by one or more LEDs directed towards the edge of keycap 304, i.e., using the acrylic material as a light pipe; by conventional backlighting using one or more LEDs mounted on the surface of capacitive touch sensor 302; or any other means as appropriate for a particular embodiment. Without limitation, an example of such an illuminable interface is described in commonly assigned, published application no. 2006/0283697.
  • Turning now to FIG. 5, when exemplary controlling device 100 is functioning in the normal (i.e., navigation) mode, upon actuation of one or more of the keypad keys 310 through 313 associated with navigation pad 106 the operating program of controlling device 100 may retrieve the current finger position coordinates “X” 502 and “Y” 504 and translate these values into a command request based upon which one of five zones 506 the X,Y coordinates are determined to fall within. By way of example, a finger press at the indicated location 512 may be interpreted as occurring within zone 508 which corresponds in this example to the “left arrow” navigation indicia 514, and the corresponding navigation command issued to the target appliance. In contrast, in an illustrative embodiment, when exemplary controlling device 100 is functioning in a digit entry mode as a result of actuation of “1-2-3” button 402 the retrieved X,Y coordinates may be interpreted by the operating program of controlling device 100 according to a twelve zone schema 520, each zone now corresponding to one of the digits “0” through “9” together with an “Enter” and a “Separator” function. By way of further example, when functioning in this mode the operating program of controlling device 100 may interpret a finger press at location 512′ to correspond to the numeric digit “4”, and the corresponding numeric digit command issued to the target appliance.
  • By way of more detailed example, the flowchart of FIG. 6 in conjunction with Tables 1 and 2 presents an exemplary method for processing and interpreting user interactions which may be implemented by the operating program of controlling device 100. Turning to FIG. 6, upon detection of key matrix input 600 by the operating program of controlling device 100, it may be first determined at step 602 if the actuated key is the “1-2-3” digit entry toggle button 402. If so, at step 604 it may be next determined if the controlling device is already functioning in the numeric entry mode. If not, at step 606 numeric mode operation status is set to “true” and numeric indicia 404 illuminated as described earlier. If, however, the device is already functioning in the numeric entry mode, then actuation of button 402 may be interpreted as a request to exit this mode and return to the navigation mode of operation, which is performed at step 608.
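  • A minimal C sketch of the "1-2-3" toggle handling of steps 602 through 608 follows; the mode flag and helper names are illustrative assumptions rather than the actual operating program.

```c
#include <stdbool.h>

static bool numeric_mode = false;               /* digit entry mode status      */

extern void illuminate_digit_indicia(bool on);  /* show/hide indicia 404        */

void handle_toggle_button(void)                 /* actuation of button 402      */
{
    if (!numeric_mode) {
        numeric_mode = true;                    /* step 606: enter digit entry  */
        illuminate_digit_indicia(true);
    } else {
        numeric_mode = false;                   /* step 608: back to navigation */
        illuminate_digit_indicia(false);
    }
}
```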
  • If the actuated key is not the “1-2-3” button, at step 610 the operating program of controlling device 100 may next determine if the actuated key is one of the group 310 through 313 associated with touch sensor assembly 302, 304. If not, the key input may represent a conventional button, for example “volume up” 406, and is processed at step 612. Since such conventional key decoding and command output are well known in the art, for the sake of brevity this aspect of controlling device 100 and associated operating program will not be discussed further herein.
  • If however, the operating program of controlling device 100 determines that the actuated key is one or more of the group 310 through 313, at step 614 the “X” and “Y” coordinates of the user's actuating finger position may be ascertained from touch sensor 302. Next, in order to establish the interpretation to be applied to these values, at step 616 the operating program of controlling device 100 may determine if touch pad input is currently to be interpreted as digit entry or as navigation entry. If navigation entry is the current operational mode, then at step 618 the reported X,Y coordinates may be interpreted according to a five zone model 506 illustrated in FIG. 5. By way of example, without limitation, if the X and Y coordinates are each reported as a linear value in the range 0 to 15 with origin 0,0 at the bottom left corner of touchpad 302, then an exemplary algorithm as presented in Table 1 below may be applied to resolve the reported coordinate data into one of the five zones and thereby determine the requested appliance navigation command function.
  • TABLE 1
                                 Reported X value
                      0-4                 5-10     11-15
    Reported   11-15  (Y-11) > X: UP      UP       Y > X: UP
    Y value           (Y-11) < X: LEFT             Y < X: RIGHT
                5-10  LEFT                SEL      RIGHT
                 0-4  Y > X: LEFT         DOWN     Y > (X-11): RIGHT
                      Y < X: DOWN                  Y < (X-11): DOWN
  • For example, with reference to the bottom row of Table 1, i.e., when reported Y coordinate is in the range 0 through 4:
    If X is in the range 0 through 4, then
        if Y is greater than X, command equals “left arrow”
        else if Y is less than X, command equals “down arrow”
    else if X is in the range 5 through 10, then command equals “down arrow”
    else if X is in the range 11 through 15, then
        if Y is greater than (X-11), command equals “right arrow”
        else if Y is less than (X-11), command equals “down arrow”.
  • As will be evident from an examination of Table 1, similar algorithms may be symmetrically applied to the other possible ranges of X and Y to resolve these values as locations within the five zone pattern 506 of FIG. 5 and generate command transmissions accordingly.
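  • For illustration, the complete Table 1 resolution may be expressed in C as below, assuming X and Y are each reported in the range 0 through 15 with the origin at the bottom left. Table 1 leaves exact ties on the corner diagonals unspecified; this sketch resolves such ties arbitrarily.

```c
typedef enum { CMD_UP, CMD_DOWN, CMD_LEFT, CMD_RIGHT, CMD_SELECT } nav_cmd_t;

/* Resolve a reported (x, y) position into one of the five zones 506 of FIG. 5. */
nav_cmd_t resolve_five_zone(int x, int y)
{
    if (y <= 4) {                                    /* bottom row of Table 1   */
        if (x <= 4)  return (y > x) ? CMD_LEFT : CMD_DOWN;
        if (x <= 10) return CMD_DOWN;
        return (y > x - 11) ? CMD_RIGHT : CMD_DOWN;  /* x in 11..15             */
    }
    if (y <= 10) {                                   /* middle row              */
        if (x <= 4)  return CMD_LEFT;
        if (x <= 10) return CMD_SELECT;
        return CMD_RIGHT;
    }
    /* y in 11..15: top row */
    if (x <= 4)  return (y - 11 > x) ? CMD_UP : CMD_LEFT;
    if (x <= 10) return CMD_UP;
    return (y > x) ? CMD_UP : CMD_RIGHT;
}
```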
  • If however, the operating program of controlling device 100 determines at step 616 that digit, i.e., numeric key, entry is the current operational mode, then at step 620 the reported X,Y coordinates may be interpreted according to the twelve zone model 520 illustrated in FIG. 5. Assuming the same range of coordinate values as presented in the previous example, an algorithm as represented in Table 2 below may be applied to resolve the reported coordinate data into one of the twelve zones and thereby determine the requested appliance digit keypad command function.
  • TABLE 2
                                 Reported X value
                      0-5      5-10     11-15
    Reported   12-15  1        2        3
    Y value     8-11  4        5        6
                 4-7  7        8        9
                 0-3           0        Enter
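  • A corresponding C sketch of the Table 2 digit pad resolution is given below. The column boundaries and the placement of the “Separator” zone (shown here as '*', with '#' standing in for “Enter”) are assumptions made for illustration, since the table as extracted leaves them ambiguous.

```c
/* Resolve a reported (x, y) position into one of the twelve zones 520 of FIG. 5.
 * Row boundaries follow Table 2; the column split and the Separator position
 * in the bottom-left zone are assumed. */
char resolve_twelve_zone(int x, int y)
{
    static const char layout[4][3] = {
        { '1', '2', '3' },    /* Y 12-15 */
        { '4', '5', '6' },    /* Y  8-11 */
        { '7', '8', '9' },    /* Y  4-7  */
        { '*', '0', '#' },    /* Y  0-3: Separator, 0, Enter (assumed order) */
    };
    int row = (y >= 12) ? 0 : (y >= 8) ? 1 : (y >= 4) ? 2 : 3;
    int col = (x >= 11) ? 2 : (x >= 6) ? 1 : 0;   /* assumed column boundaries */
    return layout[row][col];
}
```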
  • After determining the requested appliance command function in the manner described above, at step 622 the operating program of controlling device 100 may transmit the indicated command to the target appliance. In certain embodiments, actuation of the numeric “Enter” key 408 may be defined to also cause controlling device 100 to exit the digit entry mode. In such embodiments, at step 624 it may be determined if the command just issued was “Enter” in which case processing continues at step 608 in order to clear the digit entry mode status, whereafter processing of the key matrix input is complete.
  • Turning now to FIG. 7, an alternative exemplary embodiment of a controlling device 100′ utilizing a floating touch sensor 106′ in accordance with the instant invention is presented. In this embodiment, upon actuation of one or more of silicon keypad keys 310 through 313, the operating program of controlling device 100′ may simply report the raw X,Y coordinates of the actuation point to an appliance, for example cable STB 104, for interpretation by that appliance. In such an embodiment cable STB 104 may for example then tailor its interpretation of the reported actuation location based upon STB 104′s current mode of operation. For example, when displaying program guide information 108 on TV 102, STB 104 may interpret reported floating touch pad data as navigation commands while, when in direct channel tuning mode, STB 104 may interpret reported floating touch pad data as digit keys, scan or skip functions, etc. as appropriate. To facilitate the user interface in this environment, the floating touch pad of controlling device 100′ may comprise markings 704 which serve to visually divide the touch surface into generic areas, and STB 104 may display on TV 102 a representation 702 of the current interpretation of those areas.
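  • By way of illustration, a raw coordinate report of the kind described for this alternative embodiment might be structured as in the C sketch below; the packet layout and function names are assumptions and do not represent a disclosed transmission format.

```c
/* Hypothetical raw touch report: the controlling device sends coordinates
 * only, and the receiving appliance (e.g., STB 104) applies its own zone
 * model according to its current mode of operation. */
typedef struct {
    unsigned char x;   /* raw X coordinate at the moment of actuation */
    unsigned char y;   /* raw Y coordinate at the moment of actuation */
} raw_touch_report_t;

extern void transmit_to_appliance(const void *payload, unsigned int length);

void report_raw_press(unsigned char x, unsigned char y)
{
    raw_touch_report_t report = { x, y };
    transmit_to_appliance(&report, (unsigned int)sizeof(report));
}
```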
  • While various concepts have been described in detail, it will be appreciated by those skilled in the art that various modifications and alternatives to those concepts could be developed in light of the overall teachings of the disclosure. For example, while the exemplary embodiment presented above utilizes a silicon rubber keypad as an actuation element for the floating touch sensor, it will be appreciated that various other mechanisms such as metallic dome switches, micro switches, flexible leaf contacts, etc. may be successfully utilized in other embodiments.
  • Further, while described in the context of functional modules and illustrated using block diagram format, it is to be understood that, unless otherwise stated to the contrary, one or more of the described functions and/or features may be integrated in a single physical device and/or a software module, or one or more functions and/or features may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion of the actual implementation of each module is not necessary for an enabling understanding of the invention. Rather, the actual implementation of such modules would be well within the routine skill of an engineer, given the disclosure herein of the attributes, functionality, and inter-relationship of the various functional modules in the system. Therefore, a person skilled in the art, applying ordinary skill, will be able to practice the invention set forth in the claims without undue experimentation. It will be additionally appreciated that the particular concepts disclosed are meant to be illustrative only and not limiting as to the scope of the invention which is to be given the full breadth of the appended claims and any equivalents thereof.
  • All publications cited within this document are hereby incorporated by reference in their entirety.

Claims (11)

What is claimed is:
1. A controlling device, comprising:
a casing having an opening; and
an input device disposed in the opening comprised of a moveable touch sensitive surface positioned above a plurality of switches;
wherein the controlling device is adapted to respond to an activation event indicative of an activation of at least one of the plurality of switches by:
a) using a determined touch location upon the touch sensitive surface to select a one of the plurality of switches;
b) using the selected one of the plurality of switches to retrieve from a library of command data stored in a memory of the controlling device a command data for use in controlling a functional operation of an appliance; and
c) using the retrieved command data to transmit a command signal to the appliance via use of a transmission protocol recognizable by the appliance.
2. The controlling device as recited in claim 1, wherein, in a navigation operating mode of the controlling device, the controlling device additionally responds to a moving input upon the touch sensitive surface by transmitting to the appliance a navigation signal that comprises data representative of coordinates for the moving input upon the touch sensitive surface.
3. The controlling device as recited in claim 1, wherein the touch sensitive surface comprises a keycap disposed over a multiple-electrode capacitive touch sensor.
4. The controlling device as recited in claim 3, wherein the plurality of switches comprise silicon rubber keypad buttons supported upon a printed circuit board.
5. The controlling device as recited in claim 4, wherein the keycap displays a plurality of user interface elements.
6. A method performed by a controlling device, comprising a casing having an opening and an input device disposed in the opening comprised of a moveable touch sensitive surface positioned above a plurality of switches, to transmit a signal to an appliance, the method comprising:
responding to an activation of at least one of the plurality of switches by:
a) using a determined touch location on the moveable touch sensitive surface to select a one of the plurality of switches;
b) using the selected one of the plurality of switches to retrieve from a library of command data stored in a memory of the controlling device a command data for use in controlling a functional operation of an appliance; and
c) using the retrieved command data to transmit a command signal to the appliance via use of a transmission protocol recognizable by the appliance.
7. The method as recited in claim 6, wherein a plurality of surface touch zones for the touch sensitive surface is defined.
8. The method as recited in claim 7, wherein the plurality of surface touch zones for the touch sensitive surface are defined as a function of an active one of a plurality of operational modes for the controlling device.
9. The method as recited in claim 6, wherein the touch sensitive surface comprises a keycap disposed over a multiple-electrode capacitive touch sensor and the method further comprises using the keycap to display a plurality of user interface elements.
10. The method as recited in claim 9, wherein the method further comprises defining a plurality of surface touch zones for the touch sensitive surface each corresponding to a respective one of the plurality of user interface elements.
11. The method as recited in claim 10, further comprising using an active one of a plurality of operational modes for the controlling device to determine the plurality of user interface elements to be displayed.
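The claimed behavior may be informally summarized by the following sketch; every identifier (zone_to_switch, command_library, navigation_mode, etc.) is hypothetical and is offered only as one illustrative reading of claims 1, 2, and 6, not as the claimed implementation.

```python
# Illustrative reading of the claimed input paths; all identifiers are hypothetical.

class ControllingDevice:
    def __init__(self, zone_to_switch, command_library, transmit, navigation_mode=False):
        self.zone_to_switch = zone_to_switch     # determined touch location -> selected switch (step a)
        self.command_library = command_library   # command data stored in device memory (step b)
        self.transmit = transmit                 # sends data in a protocol the appliance recognizes (step c)
        self.navigation_mode = navigation_mode   # one of the device's operational modes (claim 2)

    def on_switch_activated(self, touch_zone):
        """Switch activation: select a switch from the touch location, retrieve its command data, transmit."""
        switch = self.zone_to_switch[touch_zone]
        self.transmit(self.command_library[switch])

    def on_moving_input(self, coordinates):
        """Moving input on the touch surface: in navigation mode, report coordinate data instead."""
        if self.navigation_mode:
            self.transmit({"navigation": list(coordinates)})

# Example use with placeholder data:
remote = ControllingDevice({"center": "OK_SWITCH"}, {"OK_SWITCH": 0x20}, transmit=print)
remote.on_switch_activated("center")   # transmits the command data associated with the selected switch
```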
US17/737,664 2009-12-22 2022-05-05 System and method for multi-mode command input Pending US20220261090A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/737,664 US20220261090A1 (en) 2009-12-22 2022-05-05 System and method for multi-mode command input

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US12/645,037 US20110148762A1 (en) 2009-12-22 2009-12-22 System and method for multi-mode command input
US15/902,007 US20180181210A1 (en) 2009-12-22 2018-02-22 System and method for multi-mode command input
US16/279,095 US20190187808A1 (en) 2009-12-22 2019-02-19 System and method for multi-mode command input
US17/737,664 US20220261090A1 (en) 2009-12-22 2022-05-05 System and method for multi-mode command input

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/279,095 Continuation US20190187808A1 (en) 2009-12-22 2019-02-19 System and method for multi-mode command input

Publications (1)

Publication Number Publication Date
US20220261090A1 2022-08-18

Family

ID=44150311

Family Applications (6)

Application Number Title Priority Date Filing Date
US12/645,037 Pending US20110148762A1 (en) 2009-12-22 2009-12-22 System and method for multi-mode command input
US15/902,007 Abandoned US20180181210A1 (en) 2009-12-22 2018-02-22 System and method for multi-mode command input
US16/279,095 Pending US20190187808A1 (en) 2009-12-22 2019-02-19 System and method for multi-mode command input
US17/737,524 Pending US20220261089A1 (en) 2009-12-22 2022-05-05 System and method for multi-mode command input
US17/737,664 Pending US20220261090A1 (en) 2009-12-22 2022-05-05 System and method for multi-mode command input
US17/955,756 Pending US20230195239A1 (en) 2009-12-22 2022-09-29 System and method for multi-mode command input

Family Applications Before (4)

Application Number Title Priority Date Filing Date
US12/645,037 Pending US20110148762A1 (en) 2009-12-22 2009-12-22 System and method for multi-mode command input
US15/902,007 Abandoned US20180181210A1 (en) 2009-12-22 2018-02-22 System and method for multi-mode command input
US16/279,095 Pending US20190187808A1 (en) 2009-12-22 2019-02-19 System and method for multi-mode command input
US17/737,524 Pending US20220261089A1 (en) 2009-12-22 2022-05-05 System and method for multi-mode command input

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/955,756 Pending US20230195239A1 (en) 2009-12-22 2022-09-29 System and method for multi-mode command input

Country Status (5)

Country Link
US (6) US20110148762A1 (en)
EP (3) EP3667473B1 (en)
CN (1) CN102667686A (en)
BR (1) BR112012015405B1 (en)
WO (1) WO2011079097A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9520056B2 (en) 2010-05-11 2016-12-13 Universal Electronics Inc. System and methods for enhanced remote control functionality
US8907892B2 (en) 2010-11-22 2014-12-09 Hillcrest Laboratories, Inc. 3D pointing device with up-down-left-right mode switching and integrated swipe detector
TW201322294A (en) * 2011-11-29 2013-06-01 Darfon Electronics Corp Keyboard
CN102566913B (en) * 2011-12-16 2018-06-19 中兴通讯股份有限公司 The implementation method and remote controler of a kind of remote controler
CN102750811B (en) * 2012-03-09 2015-01-28 张伟明 Misoperation-preventing remote control component, intelligent system and method
CN103517109A (en) * 2012-06-29 2014-01-15 Tcl集团股份有限公司 Touch remote controller and touch remote control system
CN204009771U (en) * 2014-08-06 2014-12-10 胡竞韬 A kind of sense of touch type controller
KR20170029180A (en) * 2015-09-07 2017-03-15 현대자동차주식회사 Vehicle, and control method for the same
CN105894797A (en) * 2015-12-08 2016-08-24 乐视移动智能信息技术(北京)有限公司 Infrared remote control method and device and mobile terminal
CA3017635A1 (en) * 2016-03-22 2017-09-28 Spectrum Brands, Inc. Garage door opener with touch sensor authentication
US11301064B2 (en) * 2017-05-12 2022-04-12 Razer (Asia-Pacific) Pte. Ltd. Pointing devices and methods for providing and inhibiting user inputs to a computing device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090273573A1 (en) * 2006-07-06 2009-11-05 Apple Inc. Mutual capacitance touch sensing device

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6014092A (en) 1987-10-14 2000-01-11 Universal Electronics Inc. Key mover
US4959810A (en) 1987-10-14 1990-09-25 Universal Electronics, Inc. Universal remote control device
US5481256A (en) 1987-10-14 1996-01-02 Universal Electronics Inc. Direct entry remote control with channel scan
US5614906A (en) 1996-04-23 1997-03-25 Universal Electronics Inc. Method for selecting a remote control command set
US6429846B2 (en) * 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
US6225938B1 (en) 1999-01-14 2001-05-01 Universal Electronics Inc. Universal remote control system with bar code setup
US6507306B1 (en) * 1999-10-18 2003-01-14 Contec Corporation Universal remote control unit
US6390699B1 (en) * 1999-11-29 2002-05-21 Associate Technology Limited Keyboard with moveable base plate providing key travel
US8863184B2 (en) * 2001-07-13 2014-10-14 Universal Electronics Inc. System and method for presenting program guide information in an electronic portable device
CN102609088B (en) * 2001-11-01 2015-12-16 意美森公司 For providing the method and system of sense of touch
US7274303B2 (en) * 2002-03-01 2007-09-25 Universal Electronics Inc. Power strip with control and monitoring functionality
TWM240050U (en) * 2003-04-02 2004-08-01 Elan Microelectronics Corp Capacitor touch panel with integrated keyboard and handwriting function
KR100568227B1 (en) * 2003-04-21 2006-04-07 삼성전자주식회사 Remote control of providing navigation function and method thereof
US7499040B2 (en) * 2003-08-18 2009-03-03 Apple Inc. Movable touch pad with added functionality
US7460050B2 (en) * 2003-09-19 2008-12-02 Universal Electronics, Inc. Controlling device using cues to convey information
GB0403854D0 (en) * 2004-02-20 2004-03-24 Pelikon Ltd Switches
US7872642B2 (en) * 2004-03-12 2011-01-18 Universal Electronics Inc. Controlling device having multiple user interfaces
US8531392B2 (en) * 2004-08-04 2013-09-10 Interlink Electronics, Inc. Multifunctional scroll sensor
US7319426B2 (en) 2005-06-16 2008-01-15 Universal Electronics Controlling device with illuminated user interface
TWI316908B (en) * 2005-06-23 2009-11-11 Honda Motor Co Ltd Tank cap
US7652660B2 (en) * 2005-10-11 2010-01-26 Fish & Richardson P.C. Mobile device customizer
KR100718138B1 (en) * 2005-11-01 2007-05-14 삼성전자주식회사 Function input method and apparatus for inputting function in portable terminal thereof
JP4163713B2 (en) * 2005-12-07 2008-10-08 株式会社東芝 Information processing apparatus and touchpad control method
US20070152983A1 (en) * 2005-12-30 2007-07-05 Apple Computer, Inc. Touch pad with symbols based on mode
US8421602B2 (en) * 2006-09-13 2013-04-16 Savant Systems, Llc Remote control unit for a programmable multimedia controller
WO2008074066A1 (en) * 2006-12-19 2008-06-26 Gesswein, Andreas Klaus A set of components able to be coupled together
US8166558B2 (en) * 2007-03-23 2012-04-24 Universal Electronics Inc. System and method for upgrading the functionality of a controlling device in a secure manner
JP2008270023A (en) * 2007-04-23 2008-11-06 Tokai Rika Co Ltd Switch equipped with touch sensor
US8065624B2 (en) * 2007-06-28 2011-11-22 Panasonic Corporation Virtual keypad systems and methods
US8031175B2 (en) * 2008-04-21 2011-10-04 Panasonic Corporation Touch sensitive remote control system that detects hand size characteristics of user and adapts mapping to screen display
JP2009200542A (en) * 2008-02-19 2009-09-03 Panasonic Corp Remote control transmitter
US9454256B2 (en) * 2008-03-14 2016-09-27 Apple Inc. Sensor configurations of an input device that are switchable based on mode
US20100220065A1 (en) * 2009-02-27 2010-09-02 Research In Motion Limited Touch-sensitive display including a force-sensor and portable electronic device including same
US8395590B2 (en) * 2008-12-17 2013-03-12 Apple Inc. Integrated contact switch and touch sensor elements
US20110095988A1 (en) * 2009-10-24 2011-04-28 Tara Chand Singhal Integrated control mechanism for handheld electronic devices
US8330639B2 (en) * 2009-12-24 2012-12-11 Silverlit Limited Remote controller

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090273573A1 (en) * 2006-07-06 2009-11-05 Apple Inc. Mutual capacitance touch sensing device

Also Published As

Publication number Publication date
US20190187808A1 (en) 2019-06-20
EP3663898B1 (en) 2021-11-17
BR112012015405A2 (en) 2020-09-01
EP2517088A1 (en) 2012-10-31
BR112012015405B1 (en) 2021-03-02
EP3663898B8 (en) 2021-12-22
WO2011079097A1 (en) 2011-06-30
US20220261089A1 (en) 2022-08-18
EP3667473A1 (en) 2020-06-17
US20110148762A1 (en) 2011-06-23
US20180181210A1 (en) 2018-06-28
EP2517088A4 (en) 2016-03-09
CN102667686A (en) 2012-09-12
US20230195239A1 (en) 2023-06-22
EP2517088B1 (en) 2020-03-11
EP3667473B1 (en) 2021-07-14
EP3663898A1 (en) 2020-06-10

Similar Documents

Publication Publication Date Title
US20220261090A1 (en) System and method for multi-mode command input
CN101518059B (en) Method of generating key code in coordinate recognition device and video device controller using same
EP1183590B1 (en) Communication system and method
KR20050013578A (en) A graphic user interface having touch detectability
US20160018911A1 (en) Touch pen
CN102750811B (en) Misoperation-preventing remote control component, intelligent system and method
JP2008140182A (en) Input device, transmission/reception system, input processing method and control program
JP2013131087A (en) Display device
KR100514736B1 (en) method of controlling movement of pointer
CN101470575B (en) Electronic device and its input method
US9060153B2 (en) Remote control device, remote control system and remote control method thereof
WO2013106319A1 (en) Features for use with a multi-sided controlling device
US10409485B2 (en) Adaptive user input device
KR101451942B1 (en) Screen control method and system for changing category
US7626570B2 (en) Input device
TW201427401A (en) Television, remote controller and menu displaying method
WO2003071377A2 (en) Display device and pointing device
EP2924668A1 (en) Remote control for a remotely controlling a device
JP2011124718A (en) Remote control device
JP2005340869A (en) Image processor
KR20140077607A (en) Optical finger mouse

Legal Events

Date Code Title Description
AS Assignment

Owner name: UNIVERSAL ELECTRONICS INC., ARIZONA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HATAMBEIKI, ARSHAM;KOHANEK, JEFFREY;KEILES, PAMELA EICHLER;SIGNING DATES FROM 20091217 TO 20091221;REEL/FRAME:059832/0193

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: APPEAL READY FOR REVIEW

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS