CN112313609B - Method and apparatus for integrating swipe and touch on input device - Google Patents

Info

Publication number
CN112313609B
CN112313609B · CN201980040527.8A
Authority
CN
China
Prior art keywords
input
type
physical
touch
touch motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201980040527.8A
Other languages
Chinese (zh)
Other versions
CN112313609A (en)
Inventor
艾江
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202211081423.1A
Publication of CN112313609A
Application granted
Publication of CN112313609B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/0227 Cooperation and interconnection of the input arrangement with other functional units of a computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/043 Digitisers characterised by the transducing means using propagating acoustic waves
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F3/0354 Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547 Touch pads, in which fingers can move on a surface
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033 Indexing scheme relating to G06F3/033
    • G06F2203/0339 Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust parameter or to implement a row of soft keys

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A device may include a physical input that activates a first function of the device when a first type of touch motion on the physical input is detected by the physical input. The device may also include an ultrasonic sensor including an ultrasonic transmitter and a first ultrasonic receiver. The ultrasonic sensor is disposed beneath the physical input and is configured to detect a second type of touch motion on the physical input and to activate a second function of the device when it detects the second type of touch motion. Thus, by enabling different functions in response to different touch forces and/or touch motions, a single physical input may accommodate different functions.

Description

Method and apparatus for integrating swipe and touch on input device
This application claims priority to U.S. Provisional Application No. 62/687,796, entitled "Method for Integrating Sliding and Touching on a Button," filed on June 20, 2018.
Technical Field
The present disclosure relates generally to a system and method for using a functional input device on a handheld device and, in particular embodiments, to a system and method for integrating slide and touch functionality on inputs on a handheld device.
Background
Handheld devices such as mobile phones, tablet computers, and iPads have greatly added convenience to, and enriched, people's daily lives. Over the years, various technologies, including software and hardware, have been developed to facilitate the use of handheld devices, and new technologies continue to be developed so that handheld devices can be operated more conveniently.
Disclosure of Invention
According to an aspect of the present disclosure, there is provided an apparatus, the apparatus comprising: a physical input for activating a first function of the device when a first type of touch motion is detected on the physical input; and an ultrasonic sensor including an ultrasonic transmitter and a first ultrasonic receiver. The physical input may take the form of, for example, a depressible or laterally displaceable button or knob. The ultrasonic sensor may be disposed underneath the physical input and configured to detect a second type of touch motion on the physical input and activate a second function of the device when the second type of touch motion is detected on the physical input. The second type of touch motion may have a different force or motion than the first type of touch motion.
The above aspects enable the integration of two different types of touch motions on the same physical input, respectively activating different functions on the device. This allows the user to manipulate the same physical input and activate different functions. This greatly simplifies the operation for the user activating the different functions and operating the device, especially when the user is already familiar with the location of the physical input on the device. Thus, the above aspects improve the user experience.
Optionally, in any of the preceding aspects, the second function comprises launching an application on the device.
Optionally, in any of the preceding aspects, the second function comprises activating hardware on the device.
Optionally, in any of the preceding aspects, the second function comprises performing an operation on a touch screen of the device.
Optionally, in any of the preceding aspects, the first type of touch motion and the second type of touch motion have different gestures on the physical input.
Optionally, in any of the preceding aspects, the first type of touch motion comprises a pressing of the physical input, and the second type of touch motion comprises a sliding touch of the physical input.
Optionally, in any of the preceding aspects, the first type of touch motion and the second type of touch motion have different touch forces on the physical input.
Optionally, in any of the preceding aspects, the physical input is a volume input of the device.
Optionally, in any of the preceding aspects, the physical input is a power input of the device.
Optionally, in any of the preceding aspects, the physical input is a key for entering information into the device.
Optionally, in any one of the preceding aspects, the apparatus further comprises: a touch screen for displaying information.
Optionally, in any one of the preceding aspects, the apparatus further comprises: a second ultrasonic receiver disposed beneath the physical input.
According to another aspect of the present disclosure, there is provided a method comprising: an apparatus detects a first type of touch motion imparted on a physical component of the apparatus using an ultrasonic sensor disposed beneath the physical component, the ultrasonic sensor including an ultrasonic transmitter and an ultrasonic receiver and being for detecting the first type of touch motion on the physical component; and upon detecting the first type of touch motion imparted on the physical component, the device activates a first function on the device associated with detection of the first type of touch motion on the physical component.
The above aspects enable the use of existing physical components of a device to activate a function on the device by applying a touch motion on the physical components. This greatly simplifies the user's operation to activate the function, especially when the user is already familiar with the location of the physical component. Thus, the above aspects improve the user experience.
Optionally, in any one of the preceding aspects, the method further comprises: the device detecting, using the physical component, a second type of touch motion applied on the physical component different from the first type of touch motion; and upon detecting the second type of touch motion imparted on the physical component, the device activates a second function on the device, the second function being associated with detection of the second type of touch motion on the physical component and being different from the first function.
Optionally, in any of the preceding aspects, the first type of touch motion and the second type of touch motion have different gestures on the physical component.
Optionally, in any of the preceding aspects, the first type of touch motion and the second type of touch motion have different touch forces on the physical component.
Optionally, in any of the preceding aspects, the first type of touch motion comprises a touch to the physical component, and the second type of touch motion comprises a press on the physical component.
Optionally, in any of the preceding aspects, the physical component is a physical button or key.
Optionally, in any of the preceding aspects, the physical component is a volume button of the device.
Optionally, in any of the preceding aspects, the physical component is a power button of the device.
Optionally, in any of the preceding aspects, the physical component is a key for inputting information.
Optionally, in any of the preceding aspects, the first function comprises launching an application on the device.
Optionally, in any of the preceding aspects, the first function comprises activating hardware on the device.
Optionally, in any of the preceding aspects, the first function comprises performing an operation on a touch screen of the device.
According to another aspect of the present disclosure, there is provided an apparatus comprising a physical component usable to operate the apparatus, and an ultrasonic sensor comprising an ultrasonic transmitter and an ultrasonic receiver, the ultrasonic sensor being disposed beneath the physical component and being operative to detect a first type of touch motion applied to the physical component and to activate a first function of the apparatus when the first type of touch motion on the physical component is detected.
The above aspect enables an existing physical component of the device to activate a function on the device when a touch motion is detected on the physical component. This greatly simplifies the user's operation of activating the function and operating the device, especially when the user is already familiar with the location of the physical component. Thus, the above aspect improves the user experience.
Optionally, in any of the preceding aspects, the physical component is to activate a second function of the device upon detection by the physical component of a second type of touch motion applied to the physical component different from the first type of touch motion.
Optionally, in any of the preceding aspects, the first type of touch motion and the second type of touch motion have different gestures on the physical component.
Optionally, in any of the preceding aspects, the first type of touch motion and the second type of touch motion have different touch forces on the physical component.
Optionally, in any of the preceding aspects, the first type of touch motion comprises a touch to the physical component, and the second type of touch motion comprises a press on the physical component.
Optionally, in any of the preceding aspects, the physical component is a physical button of the device.
Optionally, in any of the preceding aspects, the physical component is a volume button of the device.
Optionally, in any of the preceding aspects, the physical component is a power button of the device.
Optionally, in any of the preceding aspects, the physical component is a key for inputting information.
Optionally, in any of the preceding aspects, the first function comprises launching an application on the device.
Optionally, in any of the preceding aspects, the first function comprises activating hardware on the device.
Optionally, in any of the preceding aspects, the first function comprises performing an operation on a touch screen of the device.
Drawings
For a more complete understanding of the present invention, and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
FIG. 1 shows a diagram of a handset of an embodiment;
fig. 2 shows a diagram of a volume input of a handset of one embodiment assembled in a chassis of the handset;
FIG. 3 shows a diagram of a volume input of another embodiment, with an ultrasonic sensor attached to the bottom surface of the volume input;
FIG. 4 shows a flow diagram of an embodiment method of operating a handheld device;
FIG. 5 shows a block diagram of a processing system;
fig. 6 shows a block diagram of a processing system.
Detailed Description
The construction, manufacture, and use of the presently preferred embodiments are discussed in detail below. It should be appreciated, however, that the present disclosure provides many applicable inventive concepts that can be embodied in a wide variety of specific contexts. The specific embodiments discussed are merely illustrative of specific ways to make and use the invention, and do not limit the scope of the invention.
Various techniques have been developed and utilized on handheld devices to facilitate operation of the handheld device, for example, to facilitate one-handed manipulation of the handheld device. Embodiments of the present disclosure utilize existing physical components of the handheld device and the ultrasonic sensing process to provide one or more functions for operating the handheld device.
In some embodiments, an ultrasonic sensor comprising an ultrasonic transmitter and an ultrasonic receiver is placed under a physical input, such as a volume button or a power button. The physical input is used to activate a first function of the handheld device when a first type of touch motion, e.g., a press (or "click") motion, is applied on the physical input. The ultrasonic sensor is configured to detect a second type of touch motion on the physical input, e.g., a sliding touch or a non-sliding touch, and to activate a second function of the device when it detects the second type of touch motion. A touch force on the physical input detectable by the ultrasonic sensor may be less than 20 grams and adjustable, while a press motion on the physical input may require a touch force of greater than 200 grams. Thus, a touch motion may be performed on a physical input without interfering with an existing press (or rocking) motion configured for that input.
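The force-based separation described above can be sketched as follows. This is an illustrative sketch, not part of the claimed embodiments; the threshold values come from the example figures in the text, and the function names are assumptions.

```python
# Illustrative dispatch of an event on a physical input based on measured
# touch force: a light touch (detectable by the ultrasonic sensor) activates
# the second function, while a firm press activates the first function.
TOUCH_MAX_GRAMS = 20    # example ultrasonic detection limit (adjustable)
PRESS_MIN_GRAMS = 200   # example tact-switch activation force

def dispatch(force_grams: float) -> str:
    """Map a measured force on the physical input to a function name."""
    if force_grams >= PRESS_MIN_GRAMS:
        return "first_function"   # e.g., volume up/down via the tact switch
    if force_grams <= TOUCH_MAX_GRAMS:
        return "second_function"  # e.g., launch an application
    return "ignored"              # ambiguous force: activate nothing

print(dispatch(250))  # -> first_function
```

Because the two force ranges do not overlap, the two functions never conflict on the same input.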
By using the existing physical input and ultrasonic sensors, different touch motions on the same physical input may activate different functions, which greatly simplifies the user's operation of activating different functions, especially when the user is already familiar with the location of the physical input on the handheld device. Thus, these embodiments improve the user experience without placing much additional burden on the user. Details of the embodiments will be provided below.
The following embodiments of the present disclosure are described with respect to a mobile phone. However, the embodiments may also be applied to other handheld devices, such as handheld tablets or iPads, without departing from the spirit of the present disclosure.
Cell phones are becoming increasingly popular because of their powerful capabilities and high portability. As technology has developed, cell phones have evolved from having a large number of physical inputs, keys, and/or switches and a small screen to having a small number of physical inputs and a large screen. Among the few remaining physical inputs are the volume button and the power button, which are still provided on many current cell phones. Many users are already familiar with the locations of the physical volume and power buttons on their handsets and can easily find and operate these buttons. The user gets tactile feedback and hears a clear "click" sound when operating these buttons, which gives a sense of assurance in using the handset.
Fig. 1 shows a diagram of a handset 100 of an embodiment. The handset may be a smartphone. As shown, the handset 100 includes a screen 102 for displaying information, a volume input 104 for adjusting (increasing or decreasing) the volume, and a power input 106 for turning the handset 100 on or off. The screen 102 may be a touch screen or a multi-touch screen. Typically, the volume input and the power input are located on the periphery (or sides) of the handset. Fig. 1 shows the volume input 104 on the left side of the handset 100 and the power input 106 on the right side. The volume input 104 or the power input 106 may be located at locations different from those shown in Fig. 1 and may be implemented as a number of different types of input devices, such as switches, depressible buttons, and the like. For example, the volume input 104 may be located on the right side while the power input 106 is located on the left side. In another example, the volume input 104 may be located on the top side of the handset 100. Fig. 1 also shows a hand 110 of a user holding the handset 100, in a manner representative of how many users hold a handheld device. The user's fingers hold the periphery of the handset 100: the user's thumb 112 holds the handset 100 on one side (the right, as shown) and the other four fingers hold the handset 100 on the other side (the left, as shown). Holding the handset 100 in this manner, the user may operate the power input 106 with his/her thumb 112 and the volume input 104 with his/her other fingers, such as the index finger 114 or the middle finger 116.
When the user holds the handset 100 with the hand 110, he/she can interact with the handset 100 via the screen 102 using the other hand. Interacting with the screen may include performing an operation on the screen. For example, interaction with the screen may include making a selection on the screen (e.g., of an option or notification), making a confirmation on the screen (e.g., of an option or notification), launching or closing an application, browsing a web page, moving an item displayed on the screen, and so on. The user may interact with the screen 102 in a limited manner using his/her thumb 112, such as sliding or touching within an on-screen area that the thumb 112 can reach. However, it is often inconvenient for the user to interact with the screen 102 using only the hand 110 while that hand is holding the periphery of the handset 100. For example, the user may have to stretch the thumb 112 across the touch screen while holding the periphery of the handset 100, or the thumb 112 may need to reach the upper half of the screen 102, which may be difficult or impossible. An intuitive and convenient way for a user to interact with information on the screen with a single hand is desired.
There are currently cell phones on the market that use virtual inputs for press-based selection. However, the pressing force is difficult to adjust. Further, using virtual inputs rather than physical inputs can degrade the user experience, because it is difficult to find the location of a virtual input, to obtain a tactile sensation, or to hear the clear "click" sound that the user expects when operating a physical input such as a button.
Embodiments of the present disclosure provide methods for utilizing existing physical inputs and ultrasonic sensing technology on a handheld device to provide one or more functions for operating the handheld device. In particular, embodiments integrate sliding and/or tapping motions, detected using ultrasonic sensing technology, on a physical input (e.g., a volume or power input) of the handheld device that a user can intuitively and conveniently find and operate. Examples of functions that may be provided include interacting with a screen of the handheld device (e.g., making a selection on the screen, making a confirmation on the screen, launching or closing an application, browsing a web page, moving an item displayed on the screen, or other suitable interactions), launching and closing an application or hardware (e.g., turning a flashlight on and off, launching and closing a camera program or a voice assistant, taking screenshots, enabling or disabling WiFi, etc.), receiving an email, a message, or a notification, and other suitable functions.
As used herein, a "physical input" of a handheld device refers to a tangible component of the handheld device that is visible to and accessible by a user. Examples of physical inputs may include a volume button or key, a power button or key, a lock key, a keyboard key (e.g., for entering information, such as typing letters or numbers), or any other button or key. Embodiments that utilize existing physical inputs have the advantage that such inputs are already familiar components to users, and they are intuitive and convenient for users to find and operate. The terms "physical input" and "input" will be used interchangeably throughout this disclosure.
Ultrasonic sensing is a technique for measuring the distance to an object using sound waves. This is typically done with an ultrasonic sensor, which may also be referred to as an ultrasonic transducer (e.g., a piezoelectric ceramic transducer) or a transceiver. The ultrasonic sensor may be configured, e.g., under the control of a driver IC, as an ultrasonic transmitter and/or an ultrasonic receiver. The ultrasonic transmitter generates and transmits high-frequency sound waves, while the ultrasonic receiver receives the sound waves reflected from the object. The time interval between transmitting the sound wave and receiving the reflected sound wave is measured to determine the distance to the object. The technique can be used to detect, with high precision, contact of any object with a surface to whose back side an ultrasonic transceiver is attached. The technique may be used to detect the touch location, touch area, and touch force of a touch point on the surface. The touch force on the surface detectable by the ultrasonic sensor may be less than 50 grams or even 10 grams, and may be adjustable.
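The time-of-flight relation described above can be written out directly: the distance to the object is half the round-trip travel time multiplied by the speed of sound in the propagation medium. A minimal sketch, assuming propagation in air (the speed value is an assumption; the solid materials of a button would have a different speed):

```python
# Time-of-flight distance estimate for ultrasonic sensing: the wave travels
# to the object and back, so the one-way distance is half the round trip.
SPEED_OF_SOUND_M_S = 343.0  # in air at ~20 degrees C (assumed medium)

def distance_m(round_trip_s: float) -> float:
    """Distance to the reflecting object from the measured echo delay."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0
```

For example, an echo delay of 1 ms corresponds to an object about 17 cm away in air.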
In the present disclosure, the terms "ultrasound transmitter" and "transmitter" are used interchangeably, the terms "ultrasound receiver" and "receiver" are used interchangeably, and the terms "ultrasound transceiver" and "transceiver" are used interchangeably. As used in this disclosure, an ultrasonic sensor (or ultrasonic transceiver or transducer) may include one or more transmitters and one or more receivers.
In some embodiments, one or more ultrasonic sensors may be placed under a physical input of the handheld device and used to detect touch motions performed on the physical input. For example, the ultrasonic sensor may be attached to the rear side (or back side) of the physical input. The ultrasonic sensor may be connected to a main printed circuit board (PCB) through a flexible PCB. The ultrasonic sensor is placed under the input so that a touch on the input can be detected by the ultrasonic sensor. This also facilitates manufacturing, as the ultrasonic sensor may be attached to the inward-facing surface of the input.
The input, e.g., an existing input of the handheld device, may already be used to detect a first type of operational motion on the input, and the detection of the first type of operational motion may be used to activate a first function of the handheld device. The first type of operational motion may be a press motion conventionally configured for the input, such as a downward push (or squeeze) motion, a rotating motion, or a rocking motion. Typically, a press motion requires an activation force of more than 200 grams.
The ultrasonic sensor may be used to detect a second type of operational motion on the input, and the detection of the second type of operational motion may be used to activate a second function of the handheld device different from the first function activated by the press motion. The second type of operational motion does not move the input, e.g., does not press the physical input. The second type of operational motion may be a touch motion detectable by the ultrasonic sensor. The touch motion may include a sliding touch on the input or a non-sliding touch on the input. In this way, the touch motion and the press motion are integrated on the same input. Various functions may be activated through the use of the input and the ultrasonic sensor. The user can adjust the sensitivity with which the ultrasonic sensor detects a touch motion on the input. The user may also adjust the types of functions that are activated when a touch motion on the input is detected. The adjustment of the sensitivity and the function types may be performed by configuring the settings of the handheld device.
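The user-adjustable parameters mentioned above (detection sensitivity and the functions bound to touch motions) could be modeled as a small settings object. All names and default values here are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical settings model for the touch-enabled physical input:
# the user can tune the detection sensitivity and remap which function
# each type of touch motion activates.
from dataclasses import dataclass

@dataclass
class TouchInputSettings:
    sensitivity_grams: float = 20.0          # max force treated as a light touch
    slide_function: str = "show_app_selector"  # bound to a sliding touch
    tap_function: str = "confirm_selection"    # bound to a non-sliding touch

settings = TouchInputSettings()
settings.sensitivity_grams = 10.0  # user makes touch detection more sensitive
settings.slide_function = "toggle_flashlight"  # user remaps the slide gesture
```

A real device would persist such settings in its configuration store; the dataclass only illustrates the shape of the adjustable state.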
In one example, the user may perform a sliding touch gently on the surface of the input (i.e., gently in comparison with the press motion on the input), and the ultrasonic sensor detects the position of one or more fingers (e.g., the index finger 114 or the thumb 112, depending on which finger is performing the touch and where the input is located on the handheld device). The detection of the sliding touch is translated into activating a function for selecting from various options, for example, launching one of a plurality of most commonly used applications, such as messages, the camera, video calling, or email. In response to the sliding touch, the cell phone may display the selectable options on the screen. The user may then touch the surface of the input (a non-sliding touch detectable by the ultrasonic sensor, rather than a press of the input) to select an option (e.g., selecting an application to launch). Accordingly, a touch motion on the input may be used to perform an operation that the user would otherwise perform with an additional touch on the screen.
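The slide-to-browse, tap-to-select interaction in this example can be sketched as a small state machine: a sliding touch moves the highlighted option, and a subsequent non-sliding touch confirms it. Class and application names are illustrative assumptions:

```python
# Sketch of the example interaction: sliding on the input cycles through
# frequently used applications; a light (non-sliding) touch launches the
# currently highlighted one.
class QuickLauncher:
    def __init__(self, apps):
        self.apps = apps
        self.index = 0  # currently highlighted option

    def on_slide(self, steps: int) -> str:
        """A sliding touch moves the highlight by `steps` positions."""
        self.index = (self.index + steps) % len(self.apps)
        return self.apps[self.index]

    def on_tap(self) -> str:
        """A non-sliding touch confirms the highlighted option."""
        return "launch:" + self.apps[self.index]

launcher = QuickLauncher(["messages", "camera", "video_call", "email"])
launcher.on_slide(2)        # highlight moves from "messages" to "video_call"
print(launcher.on_tap())    # -> launch:video_call
```

The modulo wrap-around lets a continued slide cycle back to the first option, mirroring how an on-screen selector would loop.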
The configuration of the second function, activated by the second type of operational motion detected on the input by the ultrasonic sensor, has little or no effect on the existing function of the physical input, because the ultrasonic sensor can detect light-force motions. The touch force on the surface detectable by the ultrasonic sensor may be less than 20 grams and may be adjustable, while a press motion may require a touch force of greater than 200 grams. Touch motions can thus be performed on the input without disturbing the existing press motion configured for the input. The second function enabled by the ultrasonic sensor makes it possible for a user to operate the handheld device with single-handed manipulation, without having to frequently stretch a finger (e.g., the thumb 112) back and forth across the screen to perform multiple interactions with the screen.
Taking the volume input of a cell phone as an illustrative example, an ultrasonic sensor may be embedded underneath the volume input so that different operational motions performed on the volume input may be detected and used to activate different functions on the cell phone. Fig. 2 shows a diagram of a volume input 200 of a handset of one embodiment assembled in a chassis 220 of the handset. The volume input 200 has a top (or front) surface 210, which is an externally exposed surface that the user touches to adjust the volume, and a bottom (or back) surface 212, which is internal to the handset and not touchable by the user. The volume input 200 is used to adjust the volume of the handset. Specifically, pressing ("clicking") on the top portion 214 of the top surface 210 of the volume input 200 increases the volume, while pressing ("clicking") on the bottom portion 216 of the top surface 210 decreases the volume. The volume input 200 also includes two legs 202 and 204 protruding from the bottom surface 212 away from the top surface 210, connected to tact switches 206 and 208, respectively. A push-down motion on the top portion 214 of the volume input 200 toggles tact switch 206 and activates an increase in volume. A push-down motion on the bottom portion 216 of the volume input 200 toggles tact switch 208 and activates a decrease in volume.
The ultrasonic sensor may be embedded under the volume input 200. For example, the ultrasonic sensor may be attached (e.g., glued) to the bottom surface 212, as shown in FIG. 2. The ultrasonic sensor in this example includes an ultrasonic transmitter 232 and two ultrasonic receivers 234 and 236, which are connected to a PCB in the chassis 220. Based on the transmitted and received ultrasonic signals, it can be determined whether a touch motion is performed on the volume input 200 and, if so, what type of touch motion it is, such as a sliding touch or a non-sliding touch.
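The text does not specify a classification algorithm; as one hedged sketch (all names and the 3 mm threshold are hypothetical), a controller might label a touch as sliding or non-sliding by tracking the estimated touch position, derived from the signals at the two receivers, over the duration of the contact:

```python
def classify_touch(position_samples, slide_threshold_mm=3.0):
    """Classify a touch as 'slide' or 'touch' (non-sliding).

    position_samples: estimated touch positions (mm) along the input's
    long axis over the contact, e.g. inferred from the relative amplitude
    or time-of-flight of the signals at the two ultrasonic receivers.
    Returns None if no contact was sampled.
    """
    if not position_samples:
        return None
    # Total travel along the input distinguishes a slide from a tap.
    travel = max(position_samples) - min(position_samples)
    return "slide" if travel >= slide_threshold_mm else "touch"
```

This is only a sketch of one plausible decision rule; a real controller would also filter noise and debounce the contact.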
A press motion on the volume input 200 toggles the tact switch 206 or 208 to activate the volume-up or volume-down function, as in conventional configurations. A touch motion on the volume input 200 detectable by the ultrasonic sensor may be used to activate a different function, such as launching an application or making a selection on a screen, or any of the functions discussed above. Because the activation force required to toggle the tact switch 206 or 208 (e.g., typically greater than 200 grams) is much greater than the touch (or contact) force required by the ultrasonic sensor (e.g., less than 50 grams or even 10 grams), the device can be configured so that the conventional function of the volume input is not affected by touch motions on the volume input detectable by the ultrasonic sensor. For example, the ultrasonic sensor may be configured so that it does not activate any function when it detects a force greater than a threshold, such as 150 grams, applied to the volume input. Thus, when the user presses the volume input 200 in the conventional manner, toggling a tact switch, the user adjusts the volume. When the user applies a sliding touch or a light contact touch to the volume input 200, which is detectable by the ultrasonic sensor, the user activates a different function, which may be configured by the user. Circuitry may be designed to link the detection results of the ultrasonic sensor to activation of functions on the handset. Activating a function based on a sensor detection result is a technique well known to those of ordinary skill in the art and thus is not described herein.
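The force-based gating just described can be sketched as follows (a minimal illustration only; the function and parameter names are hypothetical, and the 150-gram threshold is taken from the example above):

```python
PRESS_IGNORE_THRESHOLD_G = 150  # example threshold from the text

def handle_ultrasonic_event(touch_force_g, touch_kind, activate):
    """Dispatch an ultrasonic-sensor detection (hypothetical sketch).

    Forces above the threshold are treated as conventional presses and
    left to the tact switches, so the ultrasonic path activates nothing.
    Lighter contacts trigger the user-configured secondary function.
    """
    if touch_force_g > PRESS_IGNORE_THRESHOLD_G:
        return False  # ignored here: the tact switch handles the press
    activate(touch_kind)  # e.g. launch an app or make a selection
    return True
```

The single threshold keeps the two input paths disjoint: the tact switches and the ultrasonic sensor never both react to the same gesture.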
For illustrative purposes, fig. 2 shows one ultrasonic transmitter 232 and two ultrasonic receivers 234 and 236 placed below the volume input 200. The volume input 200 may have one or more ultrasonic transmitters and one or more ultrasonic receivers attached to its bottom surface. For example, the volume input 200 may be embedded with one transmitter and one receiver, or one transmitter and three receivers, or two transmitters and two receivers. The number of ultrasonic transmitters and receivers that may be embedded below the volume input may be determined based on the size of the volume input and the sizes of the ultrasonic transmitters and receivers. The locations of the ultrasonic transmitters and receivers on the bottom surface 212 may also vary, for example, based on the sizes of the volume input and of the ultrasonic transmitters and receivers, and/or the locations of structural features of the volume input, such as the legs 202 and 204. For example, the ultrasonic transmitters and receivers may be evenly spaced on the bottom surface 212. In another example, as shown in fig. 2, the transmitter 232 is placed between the two legs 202 and 204, one receiver 234 is positioned between the leg 202 and one end of the volume input 200, and another receiver 236 is positioned between the leg 204 and the other end of the volume input 200. The number and locations of the ultrasonic transmitters and receivers may be determined during the design phase of the handheld device. In some embodiments, when multiple inputs are available on the handheld device, a longer or larger input may be selected to have an ultrasonic sensor embedded underneath it, to integrate sliding/touching motions and pressing motions on that input. The connection of the ultrasonic sensor to the PCB may also be considered when selecting the input.
Fig. 3 shows a diagram of a volume input 300 with an ultrasonic sensor attached underneath, according to another embodiment. The volume input 300 includes a top surface 312, a bottom surface 314, and two legs 316 and 318 for toggling tact switches. An ultrasonic transmitter 322 and an ultrasonic receiver 324 are attached to the bottom surface 314 between the two legs 316 and 318. The volume input has a length L1. The ultrasonic transmitter 322 and the ultrasonic receiver 324 have the same length L2. The distance between the legs 316 and 318 is L3. Based on L1, L2, and L3, the number and locations of ultrasonic transmitters and receivers can be determined. For example, where L1 is 20 mm, L2 is 2.5 mm, and L3 is 12 mm, one transmitter and one receiver may be placed between the two legs 316 and 318 (as shown in fig. 3), or one transmitter and two receivers may be placed, for example, evenly between the two legs 316 and 318 (not shown).
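As an illustrative aid only (the 1 mm default clearance between neighboring transducers is an assumption, not stated in the text), the dimensional check implied by L2 and L3 can be sketched as a count of how many transducers of length L2 fit in the span L3 between the legs:

```python
import math

def max_transducers_between_legs(l3_mm, l2_mm, gap_mm=1.0):
    """Upper bound on the number of transducers of length l2_mm that fit
    in the span l3_mm between the two legs, keeping gap_mm of clearance
    between neighbors (assumed value; not specified in the text)."""
    if l3_mm < l2_mm:
        return 0
    # n transducers need n*l2 + (n-1)*gap <= l3, i.e. n <= (l3+gap)/(l2+gap).
    return int(math.floor((l3_mm + gap_mm) / (l2_mm + gap_mm)))
```

With the example values (L3 = 12 mm, L2 = 2.5 mm), up to three transducers fit between the legs, which is consistent with the option of placing one transmitter and two receivers evenly between them.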
In manufacturing, a predetermined number of ultrasonic transmitters and receivers may be attached (e.g., glued) at predetermined locations to the bottom surface (e.g., 212 or 314) of a predetermined input, and the input may then be assembled into the chassis. To accurately detect touch motion on the input, it is desirable that each ultrasonic transmitter and receiver be the same, or about the same, distance from the top surface of the input (e.g., 210 or 312), with any difference satisfying a threshold.
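A hedged sketch of this manufacturing check (the function name and the 0.1 mm default tolerance are hypothetical illustrations, not values from the text):

```python
def placement_ok(distances_to_top_mm, tol_mm=0.1):
    """Check that every transducer sits at about the same distance from
    the input's top surface, i.e. the spread is within the tolerance.

    distances_to_top_mm: measured distance from each transducer to the
    top surface of the input (mm).
    """
    return max(distances_to_top_mm) - min(distances_to_top_mm) <= tol_mm
```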
The embodiments described above may be applied to any applicable input on the periphery of a handheld device. Embodiments may also be applied to other physical components of a handheld device that are accessible for operating or using the handheld device. Such a physical component may be removable from the handheld device, accessible by a user of the handheld device, or separable from the housing of the handheld device, so that during manufacture one or more ultrasonic sensors may be attached to the component, and connected to the PCB, before the physical component is assembled into the housing of the handheld device. The ultrasonic sensor may then be used to detect touch motion on the physical component. Examples of such physical components include a SIM card door, a rear camera trim, or other suitable components. For example, an ultrasonic sensor may be attached to the back of the glass cover of the handset (present for aesthetic purposes). A touch on the surface of the glass cover can then be detected by the ultrasonic sensor and activate a function.
Embodiments utilize existing physical components of the handheld device and one or more ultrasonic sensors or ultrasonic arrays to provide more functionality for using or operating the handheld device. These functions may be used to enable one-handed operation of the handheld device or to further facilitate operation of the handheld device. By using existing physical components, the user does not need to learn and remember new locations on the handheld device to use new functionality and the appearance of the handheld device is not affected. Thus, these embodiments improve the user experience without burdening the user.
FIG. 4 shows a flow diagram of an embodiment method 400 for operating a device. As illustrated, in step 402, the device detects a first type of touch motion imparted on a physical component of the device using an ultrasonic sensor disposed beneath the physical component, wherein the ultrasonic sensor includes an ultrasonic transmitter and an ultrasonic receiver and is configured to detect the first type of touch motion on the physical component. In step 404, when the first type of touch motion imparted on the physical component is detected, the device activates a first function on the device, wherein the first function is associated with the detection of the first type of touch motion on the physical component.
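The two steps of method 400 can be sketched as follows (the sensor and device interfaces are hypothetical placeholders for illustration, not an actual API):

```python
def method_400(ultrasonic_sensor, device):
    """Sketch of method 400 (all names hypothetical)."""
    # Step 402: detect a touch motion on the physical component using
    # the ultrasonic sensor disposed beneath it.
    touch = ultrasonic_sensor.detect_touch_motion()
    # Step 404: on detection, activate the function associated with
    # that type of touch motion on the physical component.
    if touch is not None:
        device.activate_function_for(touch)
```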
Fig. 5 is a block diagram of a processing system 500 that may be used to implement the apparatus and methods disclosed herein. A particular device may utilize all of the illustrated components or only a subset of the components, and the degree of integration between devices may vary. Further, a device may include multiple instances of a component, such as multiple processing units, processors, memories, transmitters, receivers, etc. The processing system may include a processing unit equipped with one or more input/output devices, such as a speaker, a microphone, a mouse, a touch screen, keys, a keyboard, a printer, a display, and so forth. The processing unit may include a Central Processing Unit (CPU), memory, mass storage device, video adapter, and an I/O interface connected to the bus.
The bus may be one or more of any type of several bus architectures including a memory bus or memory controller, a peripheral bus, a video bus, and the like. The CPU may comprise any type of electronic data processor. The memory may include any type of system memory such as Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), Synchronous DRAM (SDRAM), Read Only Memory (ROM), combinations thereof, and the like. In an embodiment, the memory may include ROM for use at boot-up and DRAM for program and data storage for use in executing programs.
The mass storage device may include any type of non-transitory storage device for storing data, programs, and other information and for making the data, programs, and other information accessible via the bus. The mass storage device may include one or more of the following: solid state drives, hard disk drives, magnetic disk drives, optical disk drives, and the like.
The video adapter and the I/O interface provide interfaces to couple external input and output devices to the processing unit. As shown, examples of input and output devices include a display coupled to the video adapter, and a mouse/keyboard/printer/keys coupled to the I/O interface. Other devices may be coupled to the processing unit, and additional or fewer interface cards may be utilized. For example, a serial interface such as a Universal Serial Bus (USB) (not shown) may be used to provide an interface for a printer.
The processing unit also contains one or more network interfaces, which may include wired links, such as Ethernet cables, and/or wireless links to access nodes or different networks. The network interface allows the processing unit to communicate with remote units via a network. For example, the network interface may provide wireless communication via one or more transmitters/transmit antennas and one or more receivers/receive antennas. In one embodiment, the processing unit is coupled to a local-area or wide-area network for data processing and communication with remote devices, such as other processing units, the Internet, or remote storage devices.
FIG. 6 illustrates a block diagram of a processing system 600, which may be installed in a host device, that may be used to implement another embodiment of the apparatus and methods disclosed herein. As shown, processing system 600 includes a processor 604, a memory 606, and interfaces 610, 612, 614, which may (or may not) be arranged as shown in the figure. Processor 604 may be any component or collection of components adapted to perform computing and/or other processing-related tasks, and memory 606 may be any component or collection of components adapted to store programs and/or instructions for execution by processor 604. In one embodiment, memory 606 includes non-transitory computer-readable media. Interfaces 610, 612, 614 may be any component or collection of components that allow processing system 600 to communicate with other devices/components and/or users. For example, one or more of the interfaces 610, 612, 614 may be adapted to communicate data, control, or management messages from the processor 604 to applications installed on the host device and/or remote device. As another example, one or more of the interfaces 610, 612, 614 may be adapted to allow a user or user device (e.g., a Personal Computer (PC), etc.) to interact/communicate with the processing system 600. Processing system 600 may include additional components not depicted in the figure, such as long-term storage (e.g., non-volatile memory, etc.).
In some embodiments, the processing system 600 is in a user-side device accessing a wireless or wired telecommunications network, such as a mobile station, User Equipment (UE), Personal Computer (PC), tablet, wearable communication device (e.g., smart watch, etc.), or any other handheld device. The user-side device may be adapted to access a telecommunications network. In some embodiments, one or more of the interfaces 610, 612, 614 connect the processing system 600 to a transceiver adapted to send and receive signaling over a telecommunications network.
It should be understood that one or more steps of the embodiment methods provided herein may be performed by corresponding units or modules. For example, the signal may be transmitted by a transmitting unit or a transmitting module. The signal may be received by a receiving unit or a receiving module. The signals may be processed by a processing unit or processing module. Further steps may be performed by the detection unit/module, the activation unit/module and/or the input unit/module. The respective units/modules may be hardware, software or a combination thereof. For example, one or more of the units/modules may be an integrated circuit, such as a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC).
While the invention has been described with reference to illustrative embodiments, this description is not intended to be construed in a limiting sense. Various modifications and combinations of the illustrative embodiments, as well as other embodiments of the invention, will be apparent to persons skilled in the art upon reference to the description. It is therefore intended that the appended claims cover any such modifications or embodiments.

Claims (34)

1. A device for integrating sliding and touching on an input device, comprising:
a physical input for activating a first function of the device when a first type of touch motion is detected on the physical input; and
an ultrasonic sensor comprising an ultrasonic transmitter and a first ultrasonic receiver, the ultrasonic sensor disposed beneath the physical input and configured to detect a second type of touch motion on the physical input and to activate a second function of the device when the second type of touch motion is detected on the physical input, wherein the second type of touch motion is different from the first type of touch motion;
the ultrasonic sensor is further configured to detect position information of a plurality of fingers, and to activate a third function of the device according to the position information.
2. The device of claim 1, wherein the second function comprises launching an application on the device.
3. The device of claim 1, wherein the second function comprises activating hardware on the device.
4. The device of claim 1, wherein the second function comprises performing an operation on a touch screen of the device.
5. The device of any of claims 1-4, wherein the first type of touch motion and the second type of touch motion correspond to different gestures.
6. The device of any of claims 1-4, wherein the first type of touch motion comprises a press of the physical input and the second type of touch motion comprises a sliding touch of the physical input.
7. The device of any of claims 1-4, wherein the first type of touch motion and the second type of touch motion have different touch forces on the physical input.
8. The device of any of claims 1-4, wherein the physical input is a volume input of the device.
9. The device of any of claims 1-4, wherein the physical input is a power input of the device.
10. The device of any of claims 1-4, wherein the physical input is a key for entering information into the device.
11. The apparatus of any of claims 1 to 4, further comprising:
a touch screen for displaying information.
12. The apparatus of any of claims 1 to 4, further comprising:
a second ultrasonic receiver disposed below the physical input.
13. A method of integrating sliding and touching on an input device, comprising:
a device detects a first type of touch motion imparted on a physical component of the device using an ultrasonic sensor disposed beneath the physical component, the ultrasonic sensor including an ultrasonic transmitter and an ultrasonic receiver and being configured to detect the first type of touch motion on the physical component; and
upon detecting the first type of touch motion imparted on the physical component, the device activating a first function on the device associated with detection of the first type of touch motion on the physical component;
the device detecting, using the physical component, a second type of touch motion applied on the physical component that is different from the first type of touch motion; and
when detecting the second type of touch motion imparted on the physical component, the device activates a second function on the device that is associated with detection of the second type of touch motion on the physical component and that is different from the first function;
the ultrasonic sensor is further configured to detect position information of a plurality of fingers, and to activate a third function of the device according to the position information.
14. The method of claim 13, wherein the first type of touch motion and the second type of touch motion have different gestures on the physical component.
15. The method of claim 13, wherein the first type of touch motion and the second type of touch motion have different touch forces on the physical component.
16. The method of any of claims 13-15, wherein the first type of touch motion comprises a touch to the physical component and the second type of touch motion comprises a press to the physical component.
17. The method of any one of claims 13 to 15, wherein the physical component is a physical input or key.
18. The method of any one of claims 13 to 15, wherein the physical component is a volume input of the device.
19. The method of any one of claims 13 to 15, wherein the physical component is a power input of the device.
20. The method of any one of claims 13 to 15, wherein the physical component is a key for entering information.
21. The method of any of claims 13-15, wherein the first function comprises launching an application on the device.
22. The method of any of claims 13-15, wherein the first function comprises activating hardware on the device.
23. The method of any of claims 13-15, wherein the first function comprises performing an operation on a touch screen of the device.
24. A device for integrating sliding and touching on an input device, comprising:
physical input; and
an ultrasonic sensor comprising an ultrasonic transmitter and an ultrasonic receiver, the ultrasonic sensor disposed beneath the physical input and configured to detect a first type of touch motion imparted on the physical input and to activate a first function of the device when the first type of touch motion on the physical input is detected;
the physical input is configured to activate a second function of the device when a second type of touch motion, different from the first type of touch motion, imparted on the physical input is detected by the physical input;
the ultrasonic sensor is further configured to detect position information of a plurality of fingers, and to activate a third function of the device according to the position information.
25. The device of claim 24, wherein the first type of touch motion and the second type of touch motion correspond to different gestures.
26. The device of claim 24, wherein the first type of touch motion and the second type of touch motion have different touch forces on the physical input.
27. The device of any of claims 24-26, wherein the first type of touch motion comprises a touch to the physical input and the second type of touch motion comprises a press to the physical input.
28. The apparatus of any of claims 24-26, wherein the physical input is a physical key.
29. The device of any of claims 24-26, wherein the physical input is a volume input of the device.
30. The device of any one of claims 24 to 26, wherein the physical input is a power input of the device.
31. The apparatus of any of claims 24-26, wherein the physical input is a key for entering information.
32. The device of any of claims 24-26, wherein the first function comprises launching an application on the device.
33. The device of any of claims 24 to 26, wherein the first function comprises activating hardware on the device.
34. The device of any of claims 24-26, wherein the first function comprises performing an operation on a device touch screen.
CN201980040527.8A 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on input device Active CN112313609B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211081423.1A CN115562502B (en) 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on an input device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201862687796P 2018-06-20 2018-06-20
US62/687,796 2018-06-20
PCT/US2019/038032 WO2019246295A1 (en) 2018-06-20 2019-06-19 Method and apparatus of integrating slide and touch on an input device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202211081423.1A Division CN115562502B (en) 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on an input device

Publications (2)

Publication Number Publication Date
CN112313609A CN112313609A (en) 2021-02-02
CN112313609B true CN112313609B (en) 2022-09-16

Family

ID=68984362

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202211081423.1A Active CN115562502B (en) 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on an input device
CN201980040527.8A Active CN112313609B (en) 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on input device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202211081423.1A Active CN115562502B (en) 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on an input device

Country Status (2)

Country Link
CN (2) CN115562502B (en)
WO (1) WO2019246295A1 (en)

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3725747B2 (en) * 2000-01-07 2005-12-14 三菱電線工業株式会社 Ultrasonic degradation diagnostic equipment for low-voltage cable for railway equipment
AU2012202122B2 (en) * 2005-03-04 2013-07-18 Apple Inc. Multi-functional hand-held device
US20070176903A1 (en) * 2006-01-31 2007-08-02 Dahlin Jeffrey J Capacitive touch sensor button activation
TWM317616U (en) * 2007-02-06 2007-08-21 Inventec Appliances Corp Touch input device
US20090207140A1 (en) * 2008-02-19 2009-08-20 Sony Ericsson Mobile Communications Ab Identifying and responding to multiple time-overlapping touches on a touch panel
US8421483B2 (en) * 2008-06-13 2013-04-16 Sony Ericsson Mobile Communications Ab Touch and force sensing for input devices
CN101907959B (en) * 2010-08-20 2012-10-10 鸿富锦精密工业(深圳)有限公司 Slidably controlled touch type electronic device
CN102629184A (en) * 2012-02-29 2012-08-08 北京创造力教育科技有限公司 Handheld terminal and operation method thereof
WO2014018115A1 (en) * 2012-07-26 2014-01-30 Changello Enterprise Llc Ultrasound-based force sensing of inputs
WO2014081721A1 (en) * 2012-11-21 2014-05-30 The Board Of Trustees Of The Leland Stanford Junior University A multi-touch ultrasonic touch screen
US9389718B1 (en) * 2013-04-04 2016-07-12 Amazon Technologies, Inc. Thumb touch interface
CN104932815A (en) * 2015-05-06 2015-09-23 努比亚技术有限公司 Mobile terminal and operation method thereof
KR102383790B1 (en) * 2015-05-22 2022-04-08 삼성전자주식회사 Environment recognition method and electronic device thereof
CN104915030A (en) * 2015-05-28 2015-09-16 努比亚技术有限公司 Operation method and device based on mobile terminal rolling wheel key
CN106293221B (en) * 2016-08-05 2018-07-06 歌尔股份有限公司 Touch pressure control method and equipment
US10908741B2 (en) * 2016-11-10 2021-02-02 Sentons Inc. Touch input detection along device sidewall
CN107943267A (en) * 2017-11-21 2018-04-20 北京小米移动软件有限公司 A kind of method and apparatus for controlling screen

Also Published As

Publication number Publication date
CN115562502A (en) 2023-01-03
WO2019246295A1 (en) 2019-12-26
CN112313609A (en) 2021-02-02
CN115562502B (en) 2023-10-20


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant