US20040233158A1 - Systems and methods for identifying user input - Google Patents

Systems and methods for identifying user input

Info

Publication number
US20040233158A1
Authority
US
United States
Prior art keywords
force
input
user interface
input sensors
location
Prior art date
2003-05-21
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/442,838
Inventor
Donald Stavely
Wilfred Brake
Dan Dalton
James Dow
Amy Battles
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hewlett Packard Development Co LP
Original Assignee
Hewlett Packard Development Co LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2003-05-21
Filing date
2003-05-21
Publication date
2004-11-25
Application filed by Hewlett Packard Development Co LP
Priority to US10/442,838
Assigned to HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. Assignors: DALTON, DAN L.; STAVELY, DONALD J.; BATTLES, AMY E.; BRAKE, WILFRED F.; DOW, JAMES C.
Priority to TW092135387A (TW200426663A)
Priority to JP2004143381A (JP2004348725A)
Publication of US20040233158A1
Status: Abandoned

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0414 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • G06F 3/04142 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position, the force sensing means being located peripherally, e.g. disposed at the corners or at the side of a touch sensing plate
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/043 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves


Abstract

Disclosed are systems and methods for identifying user input. In one embodiment, a system and method pertain to detecting application of a force to a user interface using input sensors that are laterally spaced from the point at which the force is applied, and calculating the location at which the force is applied using information collected by the input sensors.

Description

    BACKGROUND
  • Many devices comprise user interfaces with which users can enter inputs (e.g., selections and/or data) into the device. Such interfaces typically comprise one or more of mechanical buttons and a touch-sensitive display. [0001]
  • With regard to mechanical buttons, physical switches are required for each button to register its selection by the user. Such switches are susceptible to errors, such as failure to register user selection. For example, dust and other debris can collect on the switches and interfere with the proper operation of the switches, and therefore the buttons that they serve. Moreover, liquids spilled on the switches may cause the switches to short-circuit, thereby requiring replacement of the user interface that comprises the switches, or even the entire device. [0002]
  • Although touch-sensitive displays are not as susceptible to the entry of dust, debris, or other contaminants in that such displays are normally sealed from the outside environment in which they are used, touch-sensitive displays are relatively fragile. In particular, the plastic layers that form typical touch-sensitive displays (e.g., resistive displays) may be easily scratched, gouged, or torn. In addition, the components (conductive films, protective layers, etc.) used to make a display touch-sensitive often reduce the brightness of the display. Therefore, the displayed information may be difficult to see, or the dimming effect of the touch-sensitive components must be overcome, thereby wasting power. [0003]
  • SUMMARY
  • Disclosed are systems and methods for identifying user input. In one embodiment, a system and method pertain to detecting application of a force to a user interface using input sensors that are laterally spaced from the point at which the force is applied, and calculating the location at which the force is applied using information collected by the input sensors.[0004]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The disclosed systems and methods can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale. [0005]
  • FIG. 1 illustrates an exemplary embodiment of a first device that is configured to identify user input. [0006]
  • FIG. 2 illustrates an exemplary embodiment of a second device that is configured to identify user input. [0007]
  • FIG. 3 is a block diagram of an exemplary embodiment of the architecture for either or both of the devices shown in FIGS. 1 and 2. [0008]
  • FIG. 4 is a flow diagram that illustrates an exemplary embodiment of a method for identifying user input.[0009]
  • DETAILED DESCRIPTION
  • An exemplary user interface comprises input sensors that are provided within the device housing, for instance inside a device control panel or display, that detect the application of force by a user. The information collected by the sensors is then analyzed to determine the location at which the force was applied, and thereby identify the user input. [0010]
  • Various embodiments of systems and methods that incorporate the disclosed user interfaces are described herein. Although particular embodiments are disclosed, these embodiments are provided for purposes of example only to facilitate description of the disclosed systems and methods. Accordingly, other embodiments are possible. [0011]
  • Referring now in more detail to the figures in which like numerals identify corresponding parts, FIG. 1 illustrates an embodiment of a first device 100 that is configured to identify user input. In this embodiment, the device 100 is a personal digital assistant (PDA). As indicated in FIG. 1, the device 100 comprises a housing 102 that supports a touch-sensitive display 104. Although capable of other configurations, the display 104 typically comprises a liquid crystal display (LCD) that receives user input entered using an appropriate input implement such as a stylus 106. [0012]
  • Provided within the housing 102, for instance underneath or within the display 104, are input sensors 108. In the embodiment of FIG. 1, three such input sensors 108 are used, one being provided at each of the top and bottom left corners of the display 104, and one sensor being provided near the center of the right edge of the display. Although three sensors 108 are shown and particular locations for these sensors are depicted, alternative arrangements may be used. For example, more than three sensors can be used and/or the sensors can be positioned beyond the edges of the display 104. [0013]
  • Due to the provision of the input sensors 108 (example configurations for which are described below), the display 104 need not comprise layers of flexible plastic as in conventional touch-sensitive displays. Accordingly, a harder material, such as glass or scratch-resistant plastic, may be used in the construction of the display 104. In such a case, the display 104 is more robust and requires less energy to provide bright images to the user. [0014]
  • FIG. 2 illustrates an embodiment of a second device 200 that is configured to identify user input. In this case, the device 200 is an imaging device, such as a photocopier, a scanner, a facsimile machine, a printer, or other electronic device configured to identify user input. The device 200 comprises a housing 202 that includes a control panel 204 that, in turn, comprises a display 206 and a “keypad” 208. As with the display 104 of the device 100 shown in FIG. 1, the display 206 typically comprises an LCD. The keypad 208 includes a plurality of mock buttons 210. Although the term “buttons” is used, the mock buttons 210 do not comprise actual, mechanical buttons. Instead, the mock buttons 210 merely represent such mechanical buttons. Accordingly, the mock buttons 210 comprise indicia that identify a given function (e.g., “print,” “copy,” etc.) that can be selected when the mock button is pressed, as well as an indication as to the boundaries of the mock button. In some cases, the boundary indication may comprise indentations and/or raised portions (e.g., raised edges) that communicate the bounds of the mock buttons 210. [0015]
  • Instead of a dedicated switch being provided for each of the mock buttons 210, a relatively small number of input sensors 212 are provided that are used to detect user selections. These input sensors 212 are mounted inside the control panel 204 adjacent the keypad 208. More particularly, in the embodiment of FIG. 2, four input sensors 212 are provided around the periphery of the keypad 208, one near each corner of the keypad. Although four input sensors 212 are shown in FIG. 2, an alternative number of sensors could be used. Moreover, the input sensors 212 can be provided at other locations within the device housing 202, if desired. [0016]
  • FIG. 3 is a block diagram illustrating an example architecture for one or both of the devices 100, 200 shown in FIGS. 1 and 2. As indicated in FIG. 3, the device 100, 200 generally comprises a processing device 300, memory 302, a user interface 304 (either the display 104 or the keypad 208), and input/output (I/O) devices 306, each of which is connected to a local interface 308. [0017]
  • The processing device 300 comprises any one of a general-purpose processor, a microprocessor, one or more application-specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, or other electrical configuration comprised of discrete elements that coordinate the overall operation of the device 100, 200. The memory 302 includes any one or a combination of volatile memory elements (e.g., random access memory (RAM)) and/or nonvolatile memory elements (e.g., Flash memory, hard disk, etc.). [0018]
  • The configuration of the user interface 304 depends upon the nature of the device in which it is used. In any case, however, the user interface 304 includes or is associated with a limited number of (e.g., three or four) input sensors 305 that are used to locate a point at which force was applied to the user interface. These input sensors 305 are laterally spaced from the point at which the force is applied. For example, the sensors 305 can be placed at or around the perimeter of the user interface 304 and therefore comprise perimeter input sensors. As used herein, the term “input sensor” designates any sensor that is configured to detect a force applied to the user interface. Therefore, the input sensors 305 can comprise a force sensor that is configured to measure force that is transmitted through the user interface 304, such as a strain gauge or force transducer. In an alternative arrangement, the input sensors 305 may be configured to measure deflection of the user interface 304. In such a case, the input sensors 305 may comprise an optical sensor (e.g., optical transducer) that measures deflection of a discrete portion of a device display or control panel. [0019]
  • In yet another arrangement, the input sensors 305 may be configured to detect the arrival of and/or measure the intensity of vibrations (e.g., sound waves) that propagate through the user interface 304. For instance, the input sensors 305 may comprise an accelerometer that, for example, includes a beam that deflects when vibrations are transmitted to it. To cite another example, the input sensors 305 may comprise a microphone. Mere detection of the vibrations can be used to identify the various times at which the vibrations arrived at the sensors so that the location of the applied force can be calculated. Similarly, the measured intensity of the vibrations may be used for the same purpose. As is described below, sensors that detect the arrival of vibrations (e.g., sound waves) are particularly well-suited to detect application of an impulsive force, such as a “tap” that is applied to the user interface (e.g., display or control panel) using a hard implement such as a stylus or pen. [0020]
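As a purely illustrative sketch (not part of the original disclosure), the following Python fragment shows one way a single sensor's sampled vibration signal could be reduced to an arrival time by simple threshold crossing. The sample rate, threshold, and function name are assumed, device-specific values.

```python
# Illustrative sketch only: estimate when a tap's vibration reaches one sensor
# by finding the first sample whose magnitude exceeds a noise threshold.
# The sample rate and threshold are assumed, device-specific values.
from typing import Optional, Sequence


def arrival_time(samples: Sequence[float], sample_rate_hz: float,
                 threshold: float) -> Optional[float]:
    """Return the time (in seconds) of the first sample above the threshold."""
    for i, value in enumerate(samples):
        if abs(value) >= threshold:
            return i / sample_rate_hz
    return None  # no tap detected at this sensor
```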
  • The I/O devices 306 comprise those devices that enable communication between the device 100, 200 and another device. Accordingly, these devices 306 can comprise, for example, a universal serial bus (USB) connector, a wireless (e.g., infrared (IR) or radio frequency (RF)) transceiver, or a network card. [0021]
  • The memory 302 comprises various programs (in software and/or firmware) including, among others, an operating system (O/S) 310 and an input analysis manager 312. The O/S 310 controls the execution of other programs and provides scheduling, input-output control, file and data management, memory management, and communication control and related services. The input analysis manager 312, for example using an applied force location algorithm 314, evaluates the information collected by the input sensors 305 (e.g., sensors 108 or 212) of the user interface 304 and calculates the exact location at which the user applied force to the device (e.g., to the display 104 or control panel 204). Examples of operation of the input analysis manager 312 are described in relation to FIG. 4. [0022]
  • Various programs have been described above. These programs can be stored on any computer-readable medium for use by or in connection with any computer-related system or method. In the context of this document, the computer-readable medium can be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples of the computer-readable medium include an electrical connection having one or more wires, a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory), an optical fiber, and a portable compact disc read-only memory (CDROM). The computer-readable medium can even be paper or another suitable medium upon which a program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory. [0023]
  • Example devices having been described above, device operation in identifying user input will now be discussed with reference to the flow diagram of FIG. 4. Any process steps or blocks in this flow diagram may represent modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps in the process. Although particular example process steps are described, alternative implementations are feasible. Moreover, steps may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved. [0024]
  • FIG. 4 illustrates an embodiment of a method for identifying user input into a device. Beginning with block 400, the input analysis manager 312 is activated. This activation occurs in response to an application of force to the user interface being sensed by at least one of the input sensors, presumably indicating that a user is attempting to enter an input (e.g., a selection or data). For example, the user can have pressed or tapped an onscreen “button” or written something using a stylus. In another example, the user can have pressed down on a mock button provided in a device control panel. [0025]
  • Once the input analysis manager 312 is activated, it identifies the information collected by the input sensors, as indicated in block 402. The nature of this information depends upon the nature of the input sensors that collected it. In cases in which actual force sensors are used, the information may comprise voltages that can be correlated to measured forces. In cases in which a deflection sensor is used, the information comprises a measurement of deflection of the user interface at discrete portions of the interface. In cases in which the input sensors are configured to detect and/or measure vibrations (e.g., sound waves) that propagate through the user interface, the information comprises identification of the arrival of the vibrations or indication of the intensity of the vibrations. [0026]
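For illustration only, the sketch below shows one assumed way a raw force-sensor voltage could be correlated to a measured force using a linear calibration. The gain and offset figures are invented placeholders rather than values taken from the disclosure.

```python
# Illustrative sketch only: map a raw sensor voltage to an estimated force
# with a linear calibration. Gain and offset are invented placeholder values;
# a real device would calibrate each sensor individually.
def voltage_to_force(voltage_v: float, gain_n_per_v: float = 2.5,
                     offset_v: float = 0.1) -> float:
    """Convert a measured voltage (V) to an applied force (N), clamped at zero."""
    return max(0.0, (voltage_v - offset_v) * gain_n_per_v)
```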
  • Irrespective of the nature of the collected information, the information is then evaluated by the input analysis manager 312, as indicated in block 404. During this evaluation, the information collected by each individual input sensor is analyzed, with knowledge of the sensor positions and the spacing between the sensors, to determine the strength of the applied force and the distance between each input sensor and the point at which the force was applied by the user. From these distances, the exact location at which the force was applied can be calculated, as indicated in block 406. If the information comprises the forces measured by each sensor, the differential forces are used, via the applied force location algorithm 314, to calculate the distances. Similarly, differential deflection information or vibration intensity information can be used to calculate the distances between each input sensor and the force application point using the location algorithm 314. [0027]
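The disclosure does not detail the applied force location algorithm 314. One plausible formulation for force sensors, offered here strictly as an assumed example, is a moment-balance (force-weighted centroid) over the known sensor positions:

```python
# Assumed example, not the patent's algorithm 314: for a rigid panel resting on
# point force sensors, static equilibrium implies the applied-force location is
# the force-weighted centroid of the sensor positions.
from typing import List, Tuple


def locate_by_force(sensor_xy: List[Tuple[float, float]],
                    forces_n: List[float]) -> Tuple[float, float]:
    """Estimate the (x, y) point of force application from per-sensor forces."""
    total = sum(forces_n)
    if total <= 0.0:
        raise ValueError("no force registered by any sensor")
    x = sum(f * sx for f, (sx, _) in zip(forces_n, sensor_xy)) / total
    y = sum(f * sy for f, (_, sy) in zip(forces_n, sensor_xy)) / total
    return x, y


# Example with three sensors placed as in FIG. 1 (top-left, bottom-left,
# right-center); coordinates in millimetres are illustrative only.
# locate_by_force([(0.0, 0.0), (0.0, 60.0), (80.0, 30.0)], [0.4, 0.4, 1.2])
```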
  • If the information collected by the sensors comprises mere indication of arrival of vibrations (e.g., sound waves), evaluation of the information comprises identification of the time at which the vibrations arrived at each of the input sensors and correlating these times to distance from the force application point. [0028]
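As an assumed illustration of this time-of-arrival approach, the sketch below locates a tap by searching a grid of candidate points for the one whose predicted arrival-time differences best match the measured ones. The wave speed, panel dimensions, and grid step are invented parameters, not values from the disclosure.

```python
# Assumed example: locate a tap from relative vibration arrival times by a
# brute-force grid search over candidate points. Wave speed, panel size, and
# grid step are invented parameters.
import math
from typing import List, Tuple


def locate_by_arrival_times(sensor_xy: List[Tuple[float, float]],
                            arrivals_s: List[float],
                            wave_speed: float,
                            width: float, height: float,
                            step: float = 1.0) -> Tuple[float, float]:
    """Return the candidate point whose predicted arrival-time differences
    (relative to the first sensor) best match the measured ones."""
    ref = arrivals_s[0]
    measured = [t - ref for t in arrivals_s]
    best_point, best_err = (0.0, 0.0), float("inf")
    y = 0.0
    while y <= height:
        x = 0.0
        while x <= width:
            dists = [math.hypot(x - sx, y - sy) for sx, sy in sensor_xy]
            predicted = [(d - dists[0]) / wave_speed for d in dists]
            err = sum((m - p) ** 2 for m, p in zip(measured, predicted))
            if err < best_err:
                best_point, best_err = (x, y), err
            x += step
        y += step
    return best_point
```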
  • Once the location of the applied force has been determined, this location is used to interpret the intended user input, as indicated in block 408. For example, if it is determined that the force was applied within the boundaries of an onscreen button or mock button of a control panel, selection of the function associated with that button is inferred. Similarly, if it is determined that the force was applied within a text entry box, the force application is interpreted as comprising a data entry (e.g., entry of an alphanumeric character). Such interpretation is facilitated by mapping, i.e., by using a “map” of the user interface that correlates locations with associated functions. [0029]
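Such a map could, for example, be represented as a table of rectangular regions keyed to functions, with the calculated location looked up against it. The sketch below is a minimal assumed illustration; the region names and coordinates are invented.

```python
# Assumed example of a user-interface "map": rectangular regions associated
# with functions, against which the calculated location is looked up.
# Region names and coordinates are invented for illustration.
from typing import Dict, Optional, Tuple

# (x_min, y_min, x_max, y_max) -> associated function
UI_MAP: Dict[Tuple[float, float, float, float], str] = {
    (5.0, 5.0, 25.0, 20.0): "print",
    (30.0, 5.0, 50.0, 20.0): "copy",
    (5.0, 25.0, 50.0, 45.0): "text_entry",
}


def interpret(location: Tuple[float, float]) -> Optional[str]:
    """Return the function mapped to the region containing the location."""
    x, y = location
    for (x0, y0, x1, y1), function in UI_MAP.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return function
    return None  # force applied outside any mapped region
```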
  • At this point, it is determined whether other force applications are sensed by the input sensors, as indicated in decision block 410. If not, flow for the input analysis manager 312 is terminated. If, on the other hand, other force applications are sensed, flow returns to block 402 and the above-described process continues such that all user inputs are identified. [0030]
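Tying blocks 400 through 410 together, the following sketch outlines the overall loop in Python. The read_sensors, locate, ui_map_lookup, and dispatch helpers are hypothetical stand-ins for device-specific code and are not named in the disclosure.

```python
# Assumed outline of the FIG. 4 flow (blocks 400-410): sense a force, read the
# sensor information, locate the force, interpret it against the map, repeat.
# All four helpers are hypothetical stand-ins for device-specific code.
def input_loop(read_sensors, locate, ui_map_lookup, dispatch):
    while True:
        readings = read_sensors()           # blocks 400/402: sense and collect
        if readings is None:                # block 410: no further force sensed
            break
        location = locate(readings)         # blocks 404/406: evaluate and locate
        function = ui_map_lookup(location)  # block 408: interpret via the map
        if function is not None:
            dispatch(function)
```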
  • In the foregoing discussions, the input sensors are described as being used to detect the application of force within particular user interfaces, such as a device display or control panel. Using the systems and methods described herein, however, user input other than that registered using such an interface can be identified. For instance, squeezing of the device housing at a given location may be detected and therefore used to input a given command (e.g., to scroll a device display). Therefore, the term “user interface” broadly applies to any location on the device housing at which an applied force may be detected and interpreted. [0031]

Claims (28)

What is claimed is:
1. A method for identifying a user input, comprising:
detecting application of a force to a user interface using input sensors that are laterally spaced from a point at which the force is applied; and
calculating the location at which the force is applied using information collected by the input sensors.
2. The method of claim 1, wherein detecting application of a force comprises detecting application of a force using input sensors positioned at a perimeter of the user interface.
3. The method of claim 1, wherein calculating the location at which the force is applied comprises comparing the information collected by each of the input sensors.
4. The method of claim 1, wherein calculating the location at which the force is applied comprises calculating the location from forces measured by the input sensors.
5. The method of claim 1, wherein calculating the location at which the force is applied comprises calculating the location from deflections measured by the input sensors.
6. The method of claim 1, wherein calculating the location at which the force is applied comprises calculating the location from vibrations measured by the input sensors.
7. The method of claim 1, wherein calculating the location at which the force is applied comprises calculating the location from the times at which vibrations are detected by the input sensors.
8. The method of claim 1, further comprising interpreting an intended user input from the calculated location.
9. The method of claim 8, wherein interpreting an intended user input comprises correlating the calculated location using a user interface map.
10. A system for identifying a user input, comprising:
a user interface to which a force may be applied to register a user input;
input sensors that are laterally spaced from areas of the user interface to which the force may be applied; and
an input analysis manager that is configured to calculate the location at which a force is applied to the user interface using information collected by the input sensors.
11. The system of claim 10, wherein the user interface comprises a display.
12. The system of claim 11, wherein the input sensors are positioned at a perimeter of the display.
13. The system of claim 10, wherein the user interface comprises a control panel including at least one mock button.
14. The system of claim 13, wherein the input sensors are positioned at a perimeter of the control panel.
15. The system of claim 10, wherein the input sensors comprise a force sensor that measures force transmitted through the user interface.
16. The system of claim 10, wherein the input sensors comprise a displacement sensor that measures displacement of a discrete portion of the user interface.
17. The system of claim 10, wherein the input sensors comprise a vibration sensor that measures vibrations that propagate through the user interface.
18. The system of claim 10, wherein the input sensors comprise a vibration sensor that detects arrival of vibrations that propagate through the user interface and wherein the input analysis manager is configured to identify the time at which the vibrations arrived at each vibration sensor.
19. The system of claim 10, wherein the input analysis manager is configured to calculate the distances between the input sensors and the location at which the force was applied by comparing the information collected by the sensors.
20. The system of claim 10, wherein the input analysis manager is configured to interpret an intended user input from the calculated location at which the force was applied.
21. A system for identifying a user input, comprising:
means for detecting a force applied to a user interface, the means for detecting being laterally spaced from the location at which the force was applied; and
means for calculating the location of the applied force.
22. The system of claim 21, wherein the means for detecting comprise no greater than four input sensors.
23. The system of claim 21, wherein the input sensors are positioned around a perimeter of the user interface.
24. The system of claim 21, wherein the means for calculating comprise means for calculating the distances between input sensors and the location at which the force was applied.
25. A user interface, comprising:
a display having an outer perimeter; and
input sensors provided only around the outer perimeter of the display, the input sensors being configured to detect application of force to the display.
26. The user interface of claim 25, wherein the interface only comprises three input sensors.
27. A user interface, comprising:
a control panel including a plurality of mock buttons; and
input sensors laterally spaced from the mock buttons that are configured to detect application of force to the mock buttons.
28. The user interface of claim 27, wherein the mock buttons comprise indicia that identify an associated function and an indication as to the boundaries of the mock buttons, the interface being absent of dedicated switches for the mock buttons.
US10/442,838 2003-05-21 2003-05-21 Systems and methods for identifying user input Abandoned US20040233158A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US10/442,838 US20040233158A1 (en) 2003-05-21 2003-05-21 Systems and methods for identifying user input
TW092135387A TW200426663A (en) 2003-05-21 2003-12-15 Systems and methods for identifying user input
JP2004143381A JP2004348725A (en) 2003-05-21 2004-05-13 System and method for identifying user input

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/442,838 US20040233158A1 (en) 2003-05-21 2003-05-21 Systems and methods for identifying user input

Publications (1)

Publication Number Publication Date
US20040233158A1 (en) 2004-11-25

Family

ID=33450300

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/442,838 Abandoned US20040233158A1 (en) 2003-05-21 2003-05-21 Systems and methods for identifying user input

Country Status (3)

Country Link
US (1) US20040233158A1 (en)
JP (1) JP2004348725A (en)
TW (1) TW200426663A (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060097983A1 (en) * 2004-10-25 2006-05-11 Nokia Corporation Tapping input on an electronic device
US20060211499A1 (en) * 2005-03-07 2006-09-21 Truls Bengtsson Communication terminals with a tap determination circuit
US20070095636A1 (en) * 2005-11-03 2007-05-03 Viktors Berstis Cadence controlled actuator
US20080136678A1 (en) * 2006-12-11 2008-06-12 International Business Machines Corporation Data input using knocks
US20110080352A1 (en) * 2009-10-07 2011-04-07 Yeonchul Kim Systems and methods for providing an enhanced keypad
EP2437144A1 (en) * 2010-09-17 2012-04-04 Research In Motion Limited Touch-sensitive display with optical sensor and method
EP2439619A1 (en) * 2010-09-17 2012-04-11 Research In Motion Limited Touch-sensitive display with optical sensor and method
WO2013154850A2 (en) * 2012-04-13 2013-10-17 Google Inc. Apparatus and method for a pressure sensitive device interface
US8803797B2 (en) 2008-01-18 2014-08-12 Microsoft Corporation Input through sensing of user-applied forces
US20140347326A1 (en) * 2013-05-21 2014-11-27 Samsung Electronics Co., Ltd. User input using hovering input
US9223431B2 (en) 2010-09-17 2015-12-29 Blackberry Limited Touch-sensitive display with depression detection and method
US9513737B2 (en) 2010-09-17 2016-12-06 Blackberry Limited Touch-sensitive display with optical sensor and method
CN111263927A (en) * 2017-10-20 2020-06-09 雷蛇(亚太)私人有限公司 User input device and method for recognizing user input in user input device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006323943A (en) * 2005-05-19 2006-11-30 Sony Corp Player, program and playback control method
KR101231106B1 (en) * 2006-03-17 2013-02-07 삼성전자주식회사 apparatus and method of providing user interface for flexible mobile device
KR101572730B1 (en) 2013-12-20 2015-12-01 엘지전자 주식회사 Portable terminal and driving method of the same

Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4429612A (en) * 1979-06-18 1984-02-07 Gt - Devices Method and apparatus for accelerating a solid mass
US4590842A (en) * 1983-03-01 1986-05-27 Gt-Devices Method of and apparatus for accelerating a projectile
US4715261A (en) * 1984-10-05 1987-12-29 Gt-Devices Cartridge containing plasma source for accelerating a projectile
US4821509A (en) * 1985-06-10 1989-04-18 Gt-Devices Pulsed electrothermal thruster
US4821508A (en) * 1985-06-10 1989-04-18 Gt-Devices Pulsed electrothermal thruster
US4897558A (en) * 1987-12-01 1990-01-30 Gt-Devices Superconducting device, apparatus and method for selectively supplying current to a load
US4901621A (en) * 1987-07-09 1990-02-20 Gt-Devices Superconducting projectile for a rail gun and the combination of a rail gun with a superconducting projectile
US4907487A (en) * 1986-11-12 1990-03-13 Gt-Devices Apparatus for and method of accelerating a projectile through a capillary passage and projectile therefor
US4913029A (en) * 1986-11-12 1990-04-03 Gt-Devices Method and apparatus for accelerating a projectile through a capillary passage with injector electrode and cartridge for projectile therefor
US4917335A (en) * 1988-03-31 1990-04-17 Gt-Devices Apparatus and method for facilitating supersonic motion of bodies through the atmosphere
US4974487A (en) * 1984-10-05 1990-12-04 Gt-Devices Plasma propulsion apparatus and method
US5012719A (en) * 1987-06-12 1991-05-07 Gt-Devices Method of and apparatus for generating hydrogen and projectile accelerating apparatus and method incorporating same
US5012720A (en) * 1989-08-29 1991-05-07 Gt-Devices Plasma projectile accelerator with valve means for preventing the backward flow of plasma in passage through which projectile is accelerated
US5033355A (en) * 1983-03-01 1991-07-23 Gt-Device Method of and apparatus for deriving a high pressure, high temperature plasma jet with a dielectric capillary
US5429030A (en) * 1993-11-09 1995-07-04 Gt-Devices Hybrid electrothermal light gas gun and method
US5581484A (en) * 1994-06-27 1996-12-03 Prince; Kevin R. Finger mounted computer input device
US5699779A (en) * 1995-08-25 1997-12-23 Tidman; Derek A. Method of and apparatus for moving a mass
US5703322A (en) * 1995-02-02 1997-12-30 General Dynamics Land Systems Inc. Cartridge having high pressure light gas
US5708460A (en) * 1995-06-02 1998-01-13 Avi Systems, Inc. Touch screen
US6014964A (en) * 1998-10-29 2000-01-18 Advanced Launch Corporation Method and apparatus for moving a mass in a spiral track
US6297810B1 (en) * 1998-09-30 2001-10-02 Rockwell Collins Programmable switch array with tactical feel
US6492979B1 (en) * 1999-09-07 2002-12-10 Elo Touchsystems, Inc. Dual sensor touchscreen utilizing projective-capacitive and force touch sensors
US6686906B2 (en) * 2000-06-26 2004-02-03 Nokia Mobile Phones Ltd. Tactile electromechanical data input mechanism

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4429612A (en) * 1979-06-18 1984-02-07 Gt - Devices Method and apparatus for accelerating a solid mass
US4590842A (en) * 1983-03-01 1986-05-27 Gt-Devices Method of and apparatus for accelerating a projectile
US5033355A (en) * 1983-03-01 1991-07-23 Gt-Device Method of and apparatus for deriving a high pressure, high temperature plasma jet with a dielectric capillary
US4974487A (en) * 1984-10-05 1990-12-04 Gt-Devices Plasma propulsion apparatus and method
US4715261A (en) * 1984-10-05 1987-12-29 Gt-Devices Cartridge containing plasma source for accelerating a projectile
US4821509A (en) * 1985-06-10 1989-04-18 Gt-Devices Pulsed electrothermal thruster
US4821508A (en) * 1985-06-10 1989-04-18 Gt-Devices Pulsed electrothermal thruster
US4907487A (en) * 1986-11-12 1990-03-13 Gt-Devices Apparatus for and method of accelerating a projectile through a capillary passage and projectile therefor
US4913029A (en) * 1986-11-12 1990-04-03 Gt-Devices Method and apparatus for accelerating a projectile through a capillary passage with injector electrode and cartridge for projectile therefor
US5012719A (en) * 1987-06-12 1991-05-07 Gt-Devices Method of and apparatus for generating hydrogen and projectile accelerating apparatus and method incorporating same
US4901621A (en) * 1987-07-09 1990-02-20 Gt-Devices Superconducting projectile for a rail gun and the combination of a rail gun with a superconducting projectile
US4897558A (en) * 1987-12-01 1990-01-30 Gt-Devices Superconducting device, apparatus and method for selectively supplying current to a load
US4917335A (en) * 1988-03-31 1990-04-17 Gt-Devices Apparatus and method for facilitating supersonic motion of bodies through the atmosphere
US5012720A (en) * 1989-08-29 1991-05-07 Gt-Devices Plasma projectile accelerator with valve means for preventing the backward flow of plasma in passage through which projectile is accelerated
US5429030A (en) * 1993-11-09 1995-07-04 Gt-Devices Hybrid electrothermal light gas gun and method
US5581484A (en) * 1994-06-27 1996-12-03 Prince; Kevin R. Finger mounted computer input device
US5703322A (en) * 1995-02-02 1997-12-30 General Dynamics Land Systems Inc. Cartridge having high pressure light gas
US5708460A (en) * 1995-06-02 1998-01-13 Avi Systems, Inc. Touch screen
US5699779A (en) * 1995-08-25 1997-12-23 Tidman; Derek A. Method of and apparatus for moving a mass
US5950608A (en) * 1995-08-25 1999-09-14 Advanced Launch Corporation Method of and apparatus for moving a mass
US6297810B1 (en) * 1998-09-30 2001-10-02 Rockwell Collins Programmable switch array with tactical feel
US6014964A (en) * 1998-10-29 2000-01-18 Advanced Launch Corporation Method and apparatus for moving a mass in a spiral track
US6492979B1 (en) * 1999-09-07 2002-12-10 Elo Touchsystems, Inc. Dual sensor touchscreen utilizing projective-capacitive and force touch sensors
US6686906B2 (en) * 2000-06-26 2004-02-03 Nokia Mobile Phones Ltd. Tactile electromechanical data input mechanism

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060097983A1 (en) * 2004-10-25 2006-05-11 Nokia Corporation Tapping input on an electronic device
US20060211499A1 (en) * 2005-03-07 2006-09-21 Truls Bengtsson Communication terminals with a tap determination circuit
US7966084B2 (en) 2005-03-07 2011-06-21 Sony Ericsson Mobile Communications Ab Communication terminals with a tap determination circuit
KR101238104B1 (en) * 2005-03-07 2013-02-27 소니 모빌 커뮤니케이션즈 에이비 Communication terminal with a tap sound detecting circuit
US20070095636A1 (en) * 2005-11-03 2007-05-03 Viktors Berstis Cadence controlled actuator
US7760192B2 (en) 2005-11-03 2010-07-20 International Business Machines Corporation Cadence controlled actuator
US20080136678A1 (en) * 2006-12-11 2008-06-12 International Business Machines Corporation Data input using knocks
US8803797B2 (en) 2008-01-18 2014-08-12 Microsoft Corporation Input through sensing of user-applied forces
US9201538B2 (en) 2008-01-18 2015-12-01 Microsoft Technology Licensing, Llc Input through sensing of user-applied forces
US20110080352A1 (en) * 2009-10-07 2011-04-07 Yeonchul Kim Systems and methods for providing an enhanced keypad
EP2439619A1 (en) * 2010-09-17 2012-04-11 Research In Motion Limited Touch-sensitive display with optical sensor and method
EP2437144A1 (en) * 2010-09-17 2012-04-04 Research In Motion Limited Touch-sensitive display with optical sensor and method
US9223431B2 (en) 2010-09-17 2015-12-29 Blackberry Limited Touch-sensitive display with depression detection and method
US9513737B2 (en) 2010-09-17 2016-12-06 Blackberry Limited Touch-sensitive display with optical sensor and method
WO2013154850A2 (en) * 2012-04-13 2013-10-17 Google Inc. Apparatus and method for a pressure sensitive device interface
WO2013154850A3 (en) * 2012-04-13 2014-10-09 Google Inc. Apparatus and method for a pressure sensitive device interface
US20140347326A1 (en) * 2013-05-21 2014-11-27 Samsung Electronics Co., Ltd. User input using hovering input
CN111263927A (en) * 2017-10-20 2020-06-09 雷蛇(亚太)私人有限公司 User input device and method for recognizing user input in user input device

Also Published As

Publication number Publication date
JP2004348725A (en) 2004-12-09
TW200426663A (en) 2004-12-01

Similar Documents

Publication Publication Date Title
US20040233158A1 (en) Systems and methods for identifying user input
US8669963B2 (en) Sensor system
EP2214091B1 (en) Mobile terminal having dual touch screen and method for displaying user interface thereof
US9959005B2 (en) 5-wire resistive touch screen pressure measurement circuit and method
US6504530B1 (en) Touch confirming touchscreen utilizing plural touch sensors
US6611258B1 (en) Information processing apparatus and its method
US5396443A (en) Information processing apparatus including arrangements for activation to and deactivation from a power-saving state
EP1330779B1 (en) Dual sensor touchscreen utilizing projective-capacitive and force touch sensors
US8269733B2 (en) Input precision
US7573462B2 (en) Image display apparatus, multi display system, coordinate information output method, and program for implementing the method
US20160274710A1 (en) System and method for measuring individual force in multi-object sensing
US20120001859A1 (en) Device and method for detecting noise
US20100220062A1 (en) Touch sensitive display
US10175786B2 (en) Force touch display device and force touch control method
US7737955B2 (en) Electronic device and method providing a touch-based interface for a display control
US20080150911A1 (en) Hand-held device with touchscreen and digital tactile pixels
EP1059604A2 (en) Coordinate input device allowing input by finger, pen or the like
WO2002035460A1 (en) Touch confirming touchscreen utilizing plural touch sensors
US8928595B2 (en) Touch screen calibration sensor
JP4266363B2 (en) Pressure sensitive fingerprint sensor
US8195060B2 (en) Electronic device, method for forming error information of electronic device, and image forming apparatus
JP5639527B2 (en) Electronics
TWI553515B (en) Touch panel systems and electronic information machines
JPH08221203A (en) Input device with input time decision function
US8547343B2 (en) Display apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:STAVELY, DONALD J.;BRAKE, WILFRED F.;DALTON, DAN L.;AND OTHERS;REEL/FRAME:013890/0762;SIGNING DATES FROM 20030507 TO 20030508

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION