CN107924261B - Method for selecting text - Google Patents

Method for selecting text

Info

Publication number: CN107924261B
Application number: CN201580030746.XA
Authority: CN (China)
Prior art keywords: touch, text, joint, track, touch gesture
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN107924261A (en)
Inventors: 张宁 (Zhang Ning), 王伟 (Wang Wei)
Current assignee: Huawei Technologies Co., Ltd.
Original assignee: Huawei Technologies Co., Ltd.
Application filed by Huawei Technologies Co., Ltd.

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and device for selecting text: when a joint touch gesture acting on a touch-sensitive surface is detected, the device identifies whether the user interface shown on the display is a text application interface; if it is, and the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface in response to the gesture. This simplifies the steps needed to select text and improves the user experience.

Description

Method for selecting text
Technical Field
Embodiments of the present invention relate to a method of selecting text, and more particularly, to a method of selecting text on a display having a touch-sensitive surface by means of a joint (knuckle) touch gesture.
Background
With the rapid popularization and development of portable electronic devices with touch screens, more and more people use them for multimedia and text applications, for example to browse web pages, send and receive e-mail, and exchange instant messages. When a user wants to share a passage with a friend or copy it, the user must first select the target text. The conventional operation for selecting text is cumbersome. For example: the user's finger touches the text area to be selected on the touch screen; after the finger has stayed in contact for a preset time interval, left and right marker posts pop up in the text area; the user then touches and drags the marker posts to adjust the text selection area. An operation of this kind requires many interaction steps between the user and the touch screen, and the user experience leaves room for improvement.
Disclosure of Invention
To improve on the user experience of text selection in the prior art, embodiments of the present invention provide a technical solution for selecting text. The technical solution includes the following steps:
in a first aspect, an embodiment of the present invention provides a method for selecting text, which is applied to a portable electronic device including a display having a touch-sensitive surface, and includes:
detecting a joint touch gesture acting on the touch-sensitive surface;
identifying whether a user interface displayed by the display is a text application interface;
if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, wherein the text selection area is located between a first endpoint and a second endpoint;
the first endpoint is located at a first position in the text application interface;
the second endpoint is located at a second position in the text application interface.
In a first possible implementation manner of the first aspect, the method further includes: if the user interface displayed by the display is not a text application interface and a first application function associated with the trajectory of the joint touch gesture exists, executing the first application function.
In a second possible implementation manner of the first aspect, the method further includes: if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, executing a second application function when a second application function associated with the trajectory of the joint touch gesture exists.
With reference to the first aspect or either of the first and second possible implementation manners of the first aspect, in a third possible implementation manner, the joint touch gesture is composed of joint touch actions. A touch action is a joint touch action when the grid capacitance values it produces on the touch-sensitive surface satisfy a first preset capacitance range, the number of grid cells with non-zero capacitance is smaller than a preset value, and the acceleration signal in the Z-axis direction falls within a first preset acceleration range; a gesture composed of such joint touch actions is a joint touch gesture.
In a second aspect, an embodiment of the present invention provides a portable electronic device, including:
a display having a touch-sensitive surface;
an acceleration sensor, configured to acquire the acceleration in the Z-axis direction;
a memory to store instructions;
a processor that invokes instructions stored in the memory to implement:
detecting a joint touch gesture acting on the touch-sensitive surface;
identifying whether a user interface displayed by the display is a text application interface;
if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, wherein the text selection area is located between a first endpoint and a second endpoint;
the first endpoint is located at a first position in the text application interface;
the second endpoint is located at a second position in the text application interface.
In a first possible implementation manner of the second aspect, the instructions are further configured to: if the user interface displayed by the display is not a text application interface and a first application function associated with the trajectory of the joint touch gesture exists, execute the first application function.
In a second possible implementation manner of the second aspect, the instructions are further configured to: if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, execute a second application function when a second application function associated with the trajectory of the joint touch gesture exists.
With reference to the second aspect or either of the first and second possible implementation manners of the second aspect, in a third possible implementation manner, the joint touch gesture is composed of joint touch actions. A touch action is a joint touch action when the grid capacitance values it produces on the touch-sensitive surface satisfy a first preset capacitance range, the number of grid cells with non-zero capacitance is smaller than a preset value, and the acceleration signal in the Z-axis direction falls within a first preset acceleration range; a gesture composed of such joint touch actions is a joint touch gesture.
In a third aspect, an embodiment of the present invention provides an apparatus, including: a detection unit, an identification unit, a judging unit, and a text selection unit;
the detection unit is used for detecting a joint touch gesture acting on the touch-sensitive surface;
the identification unit is used for identifying whether the user interface displayed by the display is a text application interface;
the judging unit is used for judging whether the trajectory of the joint touch gesture matches a preset trajectory;
the text selection unit is used for displaying, in response to the joint touch gesture, a text selection area on the text application interface when a joint touch gesture acting on the touch-sensitive surface is detected, the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches a preset trajectory, wherein the text selection area is located between a first endpoint and a second endpoint;
the first endpoint is located at a first position in the text application interface;
the second endpoint is located at a second position in the text application interface.
In a first possible implementation manner of the third aspect, the apparatus further includes a first judging unit and a first execution unit:
the first judging unit is used for judging, when the user interface displayed by the display is not a text application interface, whether an application function associated with the trajectory of the joint touch gesture exists;
the first execution unit is used for executing the first application function when a first application function associated with the trajectory of the joint touch gesture exists.
In a second possible implementation manner of the third aspect, the apparatus further includes a second judging unit and a second execution unit:
the second judging unit is used for judging, when the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match a preset trajectory, whether an application function associated with the trajectory of the joint touch gesture exists;
the second execution unit is configured to execute the second application function when a second application function associated with the trajectory of the joint touch gesture exists.
With reference to the third aspect or either of the first and second possible implementation manners of the third aspect, in a third possible implementation manner, the joint touch gesture is composed of joint touch actions. A touch action is a joint touch action when the grid capacitance values it produces on the touch-sensitive surface satisfy a first preset capacitance range, the number of grid cells with non-zero capacitance is smaller than a preset value, and the acceleration signal in the Z-axis direction falls within a first preset acceleration range; a gesture composed of such joint touch actions is a joint touch gesture.
In a fourth aspect, embodiments of the present invention provide a user interface on a portable electronic device, the portable electronic device including a display, a memory, and a processor for executing instructions stored in the memory, wherein the display has a touch-sensitive surface, the user interface comprising:
an interface for displaying a text application;
an interface that, when a joint touch gesture acting on the touch-sensitive surface is detected, and if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displays a text selection area on the text application interface in response to the joint touch gesture, wherein the text selection area is located between a first endpoint and a second endpoint;
the first endpoint is located at a first position in the text application interface;
the second endpoint is located at a second position in the text application interface.
In a fifth aspect, embodiments of the invention provide a non-volatile computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a portable electronic device comprising a display with a touch-sensitive surface, cause the portable electronic device to perform the following:
detecting a joint touch gesture acting on the touch-sensitive surface;
identifying whether a user interface displayed by the display is a text application interface;
if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, wherein the text selection area is located between a first endpoint and a second endpoint;
the first endpoint is located at a first position in the text application interface;
the second endpoint is located at a second position in the text application interface.
The technical solution of the embodiments of the present invention discloses that, when a joint touch gesture acting on a touch-sensitive surface is detected, whether the user interface displayed by the display is a text application interface is identified, and if it is and the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture. The solution simplifies the operation steps of selecting text, thereby improving the user experience.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an internal structure of a portable electronic device 100 according to an embodiment of the present invention;
fig. 2 is a schematic external structural diagram of a portable electronic device 100 according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a text selection area according to an embodiment of the present invention;
FIG. 4 is a flowchart of a method for selecting a text according to an embodiment of the present invention;
FIG. 5 is an exemplary user interface in which the trajectory of the joint touch gesture is a horizontal line (i.e., "-") according to an embodiment of the present invention;
FIG. 6 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 5, according to an embodiment of the present invention;
FIG. 7 is an exemplary user interface in which the trajectory of the joint touch gesture is a vertical line (i.e., "|") according to an embodiment of the present invention;
FIG. 8 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 7 in accordance with embodiments of the present invention;
FIG. 9 is an exemplary user interface with a diagonal trajectory (i.e., "/") for a joint touch gesture according to embodiments of the present invention;
FIG. 10 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 9 according to an embodiment of the present invention;
FIG. 11 is an exemplary user interface with a diagonal trajectory (i.e., "\") for a joint touch gesture in accordance with an embodiment of the present invention;
FIG. 12 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 11 in accordance with embodiments of the present invention;
FIG. 13 is an exemplary user interface for performing word processing functions on a text selection area provided by embodiments of the present invention;
FIG. 14 is a simplified schematic diagram of an internal structure of an electronic device with a touch-sensitive display unit provided in accordance with an embodiment of the invention;
fig. 15 is a functional structure diagram of an apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions of the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
For convenience of illustration, the portable electronic device 100 including a touch-sensitive display unit is used as an example in the embodiments of the present invention. Those skilled in the art will appreciate that the embodiments of the present invention are also applicable to other apparatuses, such as handheld devices, vehicle-mounted devices, wearable devices, computing devices, and various forms of User Equipment (UE), Mobile Stations (MS), Terminals, and Terminal Equipment.
The electronic device 100 may support a variety of applications, such as text applications (e-mail applications, blogging applications, web browsing applications, etc.); the touch-sensitive display unit of the electronic device 100 may visually present a user interface of the application, and various applications may be executed through the touch-sensitive display unit of the electronic device 100.
Fig. 1 is a schematic diagram of an internal structure of a portable electronic device 100 according to an embodiment of the present invention. The electronic device 100 may comprise components such as a touch sensitive display unit 130, an acceleration sensor 151, a proximity light sensor 152, an ambient light sensor 153, a memory 120, a processor 190, a radio frequency unit 110, an audio circuit 160, a speaker 161, a microphone 162, a WiFi (wireless fidelity) module 170, a bluetooth module 180, a power supply 193, an external interface 197, etc.
Those skilled in the art will appreciate that fig. 1 is merely an example of a portable electronic device and is not intended to be limiting; the device may include more or fewer components than those shown, some components may be combined, or the components may be arranged differently.
The touch-sensitive display unit 130 is sometimes called a "touch screen" for convenience, and may also be called a touch-sensitive display system or a display having a touch-sensitive surface. A display having a touch-sensitive surface comprises a touch-sensitive surface and a display screen; it can display a screen interface and receive touch actions.
The touch-sensitive display unit 130 provides an input interface and an output interface between the device and the user. The touch-sensitive display unit 130 may collect touch operations by a user on or near the touch-sensitive display unit, such as user operations on or near the touch-sensitive display unit using a finger 202, joint, stylus, or any suitable object. The touch-sensitive display unit may detect a touch action on the touch-sensitive display unit, a grid capacitance value of the touch-sensitive display unit, and a contact coordinate, send the touch action, the grid capacitance value of the touch-sensitive display unit, and the contact coordinate information to the processor 190, and receive and execute a command sent by the processor 190. The touch-sensitive display unit 130 displays visual output. The visual output may include graphics, text, icons, video, and any combination thereof (collectively "graphics"). In some embodiments, some or all of the visual output may correspond to a user interface object.
The touch-sensitive display unit 130 may use LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies may be used in other embodiments. Touch-sensitive display unit 130 may detect contact and any movement or break thereof using any of a variety of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch-sensitive display unit 130. In an exemplary embodiment, projected mutual capacitance sensing technology is used.
The user may make contact with touch-sensitive display unit 130 using any suitable object or appendage, such as a stylus, a finger, a joint, and so forth. In some embodiments, the user interface is designed to work primarily with joint-based contacts and gestures. In some embodiments, the device translates the coarse joint-based input into a precise pointer/cursor position or command to perform the action desired by the user.
In some embodiments, device 100 may include a touch pad (not shown) for activating or deactivating particular functions in addition to the touch-sensitive display unit. In some embodiments, the trackpad is a touch-sensitive area of the device that, unlike a touch-sensitive display unit, does not display visual output. The trackpad may be a touch-sensitive surface separate from the touch-sensitive display unit 130 or an extension of the touch-sensitive surface formed by the touch-sensitive display unit.
The acceleration sensor 151 can detect the magnitude of acceleration in various directions (typically three axes). Meanwhile, the acceleration sensor 151 may also be used to detect the magnitude and direction of gravity when the terminal is stationary, and may be used in applications for recognizing gestures of a mobile phone (e.g., horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (e.g., pedometer, tapping), and the like. In the embodiment of the present invention, the acceleration sensor 151 is configured to obtain a gravitational acceleration of the touch action of the user contacting the touch-sensitive display unit in the Z-axis direction.
Electronic device 100 may also include one or more proximity light sensors 152 for turning off and disabling the touch functionality of the touch-sensitive surface when the electronic device 100 is close to the user (e.g., near the ear during a call), to avoid accidental operation of the touch-sensitive display unit. The electronic device 100 may also include one or more ambient light sensors 153 to keep the touch-sensitive display unit off when the electronic device 100 is in the user's pocket or another dark place, preventing unnecessary battery drain or accidental operation while locked. In some embodiments, the proximity light sensor and the ambient light sensor may be integrated into one component or provided as two separate components. The electronic device 100 may further be equipped with other sensors such as a gyroscope, barometer, hygrometer, thermometer, and infrared sensor, which are not described further here. Although fig. 1 shows the proximity light sensor and the ambient light sensor, they are not essential components of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
The memory 120 may be used to store instructions and data. The memory 120 may mainly include a storage instruction area and a storage data area. The storage data area can store the association relationships between joint touch gestures and application functions, and can also store preset trajectory information. The storage instruction area may store an operating system, the instructions required for at least one function, and the like. The instructions may cause the processor 190 to perform a method including: when a joint touch gesture acting on the touch-sensitive surface is detected, identifying whether the user interface displayed by the display is a text application interface. If the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture. The text selection area is located between a first endpoint and a second endpoint; the first endpoint is located at a first position in the text application interface, and the second endpoint at a second position. If the user interface displayed by the display is not a text application interface and a first application function associated with the trajectory of the joint touch gesture exists, the first application function is executed. If the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, a second application function is executed when a second application function associated with the trajectory exists.
The processor 190 is the control center of the electronic device 100. It connects the various parts of the entire device through various interfaces and lines, and performs the functions of the electronic device 100 and processes data by running or executing the instructions stored in the memory 120 and calling the data stored in the memory 120, thereby monitoring the device as a whole. Optionally, the processor 190 may include one or more processing units; preferably, the processor 190 may integrate an application processor, which mainly handles the operating system, user interfaces, and application programs, and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor need not be integrated into the processor 190. In some embodiments, the processor and the memory may be implemented on a single chip; in other embodiments, they may be implemented on separate chips. In an embodiment of the present invention, the processor 190 is further configured to invoke the instructions in the memory to implement the following: when a joint touch gesture acting on the touch-sensitive surface is detected, identifying whether the user interface displayed by the display is a text application interface; if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying a text selection area on the text application interface in response to the joint touch gesture, where the text selection area is located between a first endpoint at a first position in the text application interface and a second endpoint at a second position; if the user interface displayed by the display is not a text application interface and a first application function associated with the trajectory of the joint touch gesture exists, executing the first application function; and if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, executing a second application function when a second application function associated with the trajectory exists.
The radio frequency unit 110 may be used to receive and send information, or to receive and send signals during a call. In particular, it receives downlink information from a base station and passes it to the processor 190 for processing, and it sends uplink data to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the radio frequency unit 110 may also communicate with network devices and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), etc.
The audio circuitry 160, speaker 161, and microphone 162 provide an audio interface between the user and the electronic device 100. The audio circuit 160 may convert received audio data into an electrical signal and transmit it to the speaker 161, which converts it into a sound signal for output. In the other direction, the microphone 162 converts collected sound signals into electrical signals, which the audio circuit 160 receives and converts into audio data; the audio data is then processed by the processor 190 and either transmitted to another terminal via the radio frequency unit 110 or written to the memory 120 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 170, the electronic device 100 can help the user send and receive e-mail, browse web pages, access streaming media, and so on, providing wireless broadband Internet access. Although fig. 1 shows the WiFi module 170, it is understood that it is not an essential component of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
Bluetooth is a short-range wireless communication technology. Bluetooth can effectively simplify communication between mobile terminal devices such as palmtop computers, notebook computers, and mobile phones, as well as between these devices and the Internet; through the bluetooth module 180, data transmission between the electronic device 100 and the Internet becomes faster and more efficient, broadening the road for wireless communication. Bluetooth technology is an open solution that enables wireless transmission of voice and data. Although fig. 1 shows the bluetooth module 180, it is understood that it is not an essential component of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
The electronic device 100 further includes a power source 193 (e.g., a battery) for supplying power to various components, which may preferably be logically connected to the processor 190 via a power management system 194, such that functions of managing charging, discharging, and power consumption are performed via the power management system 194.
The electronic device 100 further includes an external interface 197, which may be a standard Micro USB interface, or a multi-pin connector, and may be used to connect the electronic device 100 to communicate with other devices, or to connect a charger to charge the electronic device 100.
Although not shown, the electronic device 100 may further include a camera, a flash, and the like, which are not described in detail herein.
The method for selecting text is described below by taking the electronic device 100 as an example.
Fig. 2 is a schematic external structural diagram of the portable electronic device 100 according to an embodiment of the present invention. In this embodiment, the electronic apparatus 100 may include a touch-sensitive display unit 130, an acceleration sensor 151, a volume control key 132, a switch key 133, a microphone 162, a speaker 161, an external interface 197, and an earphone jack 163. The touch-sensitive display unit 130 may display one or more graphics 300 in the user interface 200 and receive touch input from the user; using the touch-sensitive display unit 130 as the primary input or control for operating the electronic device 100 reduces the number of physical inputs or controls on the device. In this embodiment, a "menu button" may be implemented on the touch-sensitive display unit; in some other embodiments, the "menu button" may be a physical button or another physical input or control device. The acceleration sensor 151 is used to acquire the gravitational acceleration, along the Z-axis, of a touch action on the touch-sensitive display unit. The electronic device 100 may be powered on or off by pressing the switch key and holding it down for a predetermined time interval, and may be locked by pressing the switch key and releasing it before the predetermined time interval elapses. In other embodiments, voice input for activating some functions may also be received through the microphone 162.
Fig. 3 is a schematic diagram illustrating a text selection area according to an embodiment of the present invention. The text selection area 301 may be represented on the touch-sensitive display unit 130 as the text bounded by, and located between, a first endpoint 302a and a second endpoint 302b. Those skilled in the art will appreciate that the selected text region 301 may contain any portion of the text shown in FIG. 3; the text selected in FIG. 3 is merely one example. Further, the first endpoint 302a may be associated with a first marker post 303a, and the second endpoint 302b with a second marker post 303b. The first marker post 303a and the second marker post 303b indicate the locations of the first endpoint 302a and the second endpoint 302b, respectively. Because a marker post is easier to manipulate than an endpoint, when a user wishes to move one or both endpoints to a new location, this can be accomplished by moving the marker post associated with that endpoint to the new location. In some embodiments, the marker posts 303a and 303b may have other shapes, sizes, and colors; this embodiment is merely an example.
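The relationship among the selection area 301, its endpoints 302a/302b, and the marker posts 303a/303b can be pictured with a small data model. The following Kotlin sketch is illustrative only; all type and member names are assumptions, not terminology from the patent.

```kotlin
// Illustrative data model for the selection described above; all names are hypothetical.

data class TextPosition(val offset: Int)             // character offset in the text

data class MarkerPost(val anchoredAt: TextPosition)  // draggable handle shown at an endpoint

data class TextSelection(
    val first: TextPosition,     // first endpoint 302a
    val second: TextPosition     // second endpoint 302b
) {
    val firstPost = MarkerPost(first)    // marker post 303a
    val secondPost = MarkerPost(second)  // marker post 303b

    // Moving a marker post moves the endpoint it is associated with.
    fun moveFirstTo(p: TextPosition) = copy(first = p)
    fun moveSecondTo(p: TextPosition) = copy(second = p)

    // The selected range is the text located between the two endpoints.
    fun range(): IntRange =
        minOf(first.offset, second.offset)..maxOf(first.offset, second.offset)
}
```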
Fig. 4 is a flowchart of a method for selecting a text according to an embodiment of the present invention.
Method 400 may be performed on a portable electronic device (e.g., electronic device 100 of fig. 1 or 2) having a touch-sensitive display unit and a plurality of applications, including a text application. In some embodiments, some operations in method 400 may be combined, and/or the order of some operations may be changed.
As described below, the method 400 provides a more efficient way to select text quickly. It lets the user select text with fewer operation steps, simplifying text selection and improving the user experience.
The portable electronic device detects a joint touch gesture acting on the touch-sensitive surface (401).
For example: the step 401 can specifically include steps 4011 to 4013.
The portable electronic device detects a touch action acting on the touch-sensitive surface (4011), judges whether the touch action is a joint touch action (4012), and detects a joint touch gesture composed of joint touch actions (4013).
Upon detecting a joint touch gesture acting on the touch-sensitive surface, the device identifies whether the user interface displayed by the display is a text application interface (402).
Illustratively, the text application interface may belong to a web browsing, e-mail, notepad, instant messaging, or blog application, and the like.
If the user interface displayed by the display is not a text application interface, it is determined whether an application function associated with the trajectory of the joint touch gesture exists (403); when such an application function exists, the electronic device executes it (404).
If the user interface displayed by the display is a text application interface, it is judged whether the trajectory of the joint touch gesture matches a preset trajectory (405); when it does, a text selection area is displayed on the text application interface in response to the joint touch gesture (406).
Optionally, if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match a preset trajectory, it is determined whether an application function associated with the trajectory exists; when it does, that application function is executed. Taken together, these branches form the decision procedure sketched below.
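The Kotlin sketch below restates the flow of steps 401 to 406 and the fallback branches for illustration only; every type and function name in it (Gesture, Track, functionForTrack, and so on) is an assumption rather than terminology from the patent, and the predicates stand in for the detection logic detailed in the following paragraphs.

```kotlin
// Hedged sketch of method 400; every name here is an assumption, for illustration only.

class Track(val shape: Char)                        // e.g. '-', '|', '/', '\'

class Gesture(val isJointTouchGesture: Boolean, val track: Track)

fun displayTextSelectionArea(track: Track) {
    println("text selection area displayed for trajectory '${track.shape}'")  // step 406
}

fun onGesture(
    gesture: Gesture,
    uiIsTextApplication: Boolean,
    presetShapes: Set<Char>,                        // preset trajectories for step 405
    functionForTrack: (Track) -> (() -> Unit)?      // association table kept in memory 120
) {
    if (!gesture.isJointTouchGesture) return        // 401: only joint gestures qualify

    if (!uiIsTextApplication) {                     // 402/403: not a text application interface
        functionForTrack(gesture.track)?.invoke()   // 404: run the associated function, if any
        return
    }
    if (gesture.track.shape in presetShapes) {      // 405: trajectory matches a preset trajectory
        displayTextSelectionArea(gesture.track)     // 406: select text between the two endpoints
    } else {
        functionForTrack(gesture.track)?.invoke()   // optional branch: second application function
    }
}
```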
When the touch-sensitive display unit receives a touch action acting on the touch-sensitive surface, it passes the touch information to the processor. The touch information may include one or more of the touch point coordinates, the grid capacitance values of the touch-sensitive display unit, and the touch actions. The touch actions may include press, move, and lift actions.
In some embodiments, it may be determined whether the touch action is a joint touch action based on the grid capacitance information and the Z-axis acceleration signal generated by the touch action (4012). The touch-sensitive surface grid capacitance information includes the grid capacitance values and the number of grid cells with non-zero capacitance.
When the grid capacitance values satisfy a first preset capacitance range, the number of grid cells with non-zero capacitance is smaller than a preset value, and the Z-axis acceleration signal is within a first preset acceleration range, the touch action can be judged to be a joint touch action. When the grid capacitance values satisfy a second preset capacitance range, the number of grid cells with non-zero capacitance is greater than or equal to the preset value, and the Z-axis acceleration signal is within a second preset acceleration range, the touch action can be judged to be a finger touch action.
For example, when the maximum grid capacitance value of the touch-sensitive display unit satisfies a first preset capacitance range (for example, less than or equal to 0.42 pF), the number of grid cells with non-zero capacitance is less than 7, and the Z-axis acceleration signal is within a first preset acceleration range during a preset time (for example, greater than 3g within 5 ms, where g is the gravitational acceleration), the touch action may be determined to be a joint touch action. When the maximum grid capacitance value satisfies a second preset capacitance range (for example, greater than 0.42 pF and less than or equal to 0.46 pF), the number of grid cells with non-zero capacitance is greater than or equal to 7, and the Z-axis acceleration signal is within a second preset acceleration range during the preset time (for example, less than 2g within 5 ms), the touch action may be determined to be a finger touch action. It should be understood that a joint touch action in the embodiments of the present invention is not necessarily triggered by a finger joint: another object striking the touch-sensitive display unit 130 quickly is also treated as a joint touch action as long as the above determination conditions are satisfied.
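The example thresholds above translate directly into a small classification routine. The following Kotlin sketch simply encodes them for illustration; the function and parameter names are assumptions, and a real implementation would tune the constants per device and sample the accelerometer over the stated 5 ms window.

```kotlin
// Illustrative classifier built from the example thresholds above
// (0.42 pF / 0.46 pF, 7 grid cells, 3g / 2g within 5 ms). All names are hypothetical.

enum class TouchKind { JOINT, FINGER, UNKNOWN }

fun classifyTouch(
    maxGridCapacitancePf: Double,   // largest grid capacitance value reported
    nonZeroGridCount: Int,          // number of grid cells with non-zero capacitance
    peakZAccelG: Double             // peak Z-axis acceleration within the 5 ms window, in g
): TouchKind = when {
    // Joint (knuckle) touch: small, stiff contact patch plus a sharp impact.
    maxGridCapacitancePf <= 0.42 && nonZeroGridCount < 7 && peakZAccelG > 3.0 ->
        TouchKind.JOINT
    // Finger-pad touch: larger, softer contact patch and a gentler landing.
    maxGridCapacitancePf > 0.42 && maxGridCapacitancePf <= 0.46 &&
        nonZeroGridCount >= 7 && peakZAccelG < 2.0 ->
        TouchKind.FINGER
    else -> TouchKind.UNKNOWN
}
```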
A touch gesture is composed of touch actions. For example, a tap gesture consists of two touch actions, press and lift; a swipe gesture consists of three touch actions: press, move, and lift. When it is determined that a touch action is a joint touch action, a joint touch gesture composed of joint touch actions can be detected (4013). For example: a joint tap gesture consists of the two joint touch actions press and lift; a joint slide gesture consists of the three joint touch actions press, move, and lift.
In some embodiments, whether an application function associated with the trajectory of the joint touch gesture exists is determined by looking up the association relationships, stored in the memory 120, between trajectories of joint touch gestures and application functions (403). A joint touch gesture may be composed of different joint touch actions; for example, a joint tap gesture consists of the press and lift joint touch actions, and a joint slide gesture consists of the press, move, and lift joint touch actions. The movement track between the press and the lift is the trajectory of the joint touch gesture. The user may preset the association relationships between the trajectories of various joint touch gestures and application functions and store them in the memory 120.
For example, the user may associate the trajectory "C" of a joint touch gesture with the camera application function in advance and save this association in the memory 120. When a joint touch gesture with trajectory "C" acting on the touch-sensitive surface is detected, looking up the stored associations shows that the trajectory "C" is associated with the camera application function. A minimal sketch of such a lookup table follows.
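The association relationship can be kept as a simple map from trajectory to application function, as in this hedged Kotlin sketch; the names are hypothetical.

```kotlin
// Hypothetical association table, as described above: trajectory shape -> application function.
val trackAssociations: MutableMap<Char, () -> Unit> = mutableMapOf(
    'C' to { println("launch camera application") }   // user-configured: "C" -> camera
)

// Step 403: look up the application function associated with the detected trajectory, if any.
fun functionFor(trackShape: Char): (() -> Unit)? = trackAssociations[trackShape]
```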
In some embodiments, a press joint touch action is detected in area A of the touch-sensitive display unit, the contact moves across the touch-sensitive display unit to area B, and a lift joint touch action is detected in area B. The touch event in which the joint is pressed down in area A and lifted after moving to area B is a joint touch gesture. For example, the position of a touch action (e.g., area A or area B) can be determined from the contact coordinate information. A joint touch gesture may be composed of joint touch actions; for example, a joint tap gesture consists of the press and lift joint touch actions, and a joint slide gesture consists of the press, move, and lift joint touch actions. Area A is the starting contact area of the joint touch gesture on the touch-sensitive display unit, and area B is the ending contact area. The movement track from area A to area B is the trajectory of the touch gesture. The electronic device compares the detected trajectory of the joint touch gesture with a preset trajectory and judges whether the two match (405). The preset trajectory information may be set at the factory or preset by the user, and may be saved in a memory (e.g., the memory 120 in fig. 1).
Illustratively, the preset trajectory is a straight line, which may be a horizontal line, a vertical line, or a diagonal line (e.g., "-", "|", "/", or "\"). Note that the preset trajectory may take other forms and may be adjusted to specific design requirements; the embodiments of the present invention use a straight-line preset trajectory as an example, which does not limit the solution of the present invention. One possible matching routine is sketched below.
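The patent does not specify a matching algorithm, but one plausible reading of "matches a preset trajectory" is: check that the sampled contact points stay close to the straight chord from the starting area to the ending area, then bucket the chord's angle into one of the four line presets. The Kotlin below is a sketch under that assumption; the deviation threshold and angle buckets are illustrative guesses.

```kotlin
import kotlin.math.PI
import kotlin.math.abs
import kotlin.math.atan2
import kotlin.math.hypot

data class Pt(val x: Double, val y: Double)

// Returns the straight-line preset the trajectory matches ('-', '|', '/', '\'),
// or null if the trajectory is not straight enough.
fun matchStraightPreset(points: List<Pt>, maxDeviationPx: Double = 20.0): Char? {
    if (points.size < 2) return null
    val a = points.first()
    val b = points.last()
    val len = hypot(b.x - a.x, b.y - a.y)
    if (len < 1.0) return null

    // Reject the trajectory if any sample strays too far from the chord A -> B.
    val straight = points.all { p ->
        abs((b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x)) / len <= maxDeviationPx
    }
    if (!straight) return null

    // Bucket the chord's angle into one of the four line orientations.
    var deg = atan2(b.y - a.y, b.x - a.x) * 180.0 / PI
    if (deg < 0) deg += 180.0               // direction of travel does not matter
    return when {
        deg < 22.5 || deg >= 157.5 -> '-'   // horizontal
        deg in 67.5..112.5         -> '|'   // vertical
        deg > 112.5                -> '/'   // rising diagonal (screen y grows downward)
        else                       -> '\\'  // falling diagonal
    }
}
```

Because the angle is normalized to half a turn, the match is insensitive to which end of the line the gesture starts from, consistent with the figures below where only the trajectory shape matters.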
FIG. 5 is an exemplary user interface in which the trajectory of the joint touch gesture is a horizontal line (e.g., "-") according to an embodiment of the present invention. For example: the joint touch position is first detected in area A (the starting contact area) and is then detected moving to area B (the ending contact area); the trajectory of the joint touch gesture is the track of this movement from area A to area B, shown as the dotted horizontal line, with the arrow indicating the direction of movement. The electronic device compares the detected horizontal-line trajectory with the preset straight-line trajectory and judges that they match.
In some embodiments, if it is determined that the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture (406).
The text selection area is located between a first endpoint and a second endpoint; the first endpoint is located at a first position in the text application interface, and the second endpoint at a second position. For example: after the trajectory of the joint touch gesture is judged to match a preset trajectory, a first endpoint is inserted at the position of the starting contact area A of the gesture on the touch-sensitive display unit, and a second endpoint at the position of the ending contact area B. The insertion position of the first endpoint is the first position, which may be the beginning or end of the text word closest to the centroid of area A in the text application interface displayed by the display; the insertion position of the second endpoint is the second position, which may be the beginning or end of the text word closest to the centroid of area B. The text located between the first endpoint and the second endpoint in the text application interface is the text selection area. A sketch of this endpoint-snapping rule follows.
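The endpoint-insertion rule, snapping each endpoint to the word beginning or end nearest the centroid of contact area A or B, can be sketched as follows. This Kotlin is illustrative only: the patent does not define a text model, the mapping from a screen centroid to a character offset is assumed to exist elsewhere, and a "word" here is simply a run of non-whitespace characters.

```kotlin
import kotlin.math.abs

// Hedged sketch: snap each contact-area centroid (already mapped to a character offset)
// to the nearest word boundary, then take the text between the two snapped endpoints.

fun wordBoundaries(text: String): List<Int> {
    val bounds = sortedSetOf(0, text.length)
    for (i in 1 until text.length) {
        // A word begins or ends wherever whitespace and non-whitespace meet.
        if (text[i - 1].isWhitespace() != text[i].isWhitespace()) bounds.add(i)
    }
    return bounds.toList()
}

fun snapToNearestBoundary(text: String, offset: Int): Int =
    wordBoundaries(text).minByOrNull { abs(it - offset) } ?: 0

fun selectBetween(text: String, centroidA: Int, centroidB: Int): String {
    val first = snapToNearestBoundary(text, centroidA)    // first endpoint, from area A
    val second = snapToNearestBoundary(text, centroidB)   // second endpoint, from area B
    return text.substring(minOf(first, second), maxOf(first, second))
}
```

For instance, `selectBetween("select some text here", 7, 16)` snaps offsets 7 and 16 to the nearest word edges and returns "some text".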
FIG. 6 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 5 according to an embodiment of the present invention. The insertion location of the first endpoint 302a is the beginning or end of the text word closest to the centroid of starting contact area A in the example of FIG. 5; the insertion location of the second endpoint 302b is the beginning or end of the text word closest to the centroid of ending contact area B in the example of FIG. 5; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
FIG. 7 is an exemplary user interface in which the trajectory of the joint touch gesture is a vertical line (e.g., "|") according to an embodiment of the present invention. For example: the joint touch position is first detected in area A (the starting contact area) and is then detected moving to area B (the ending contact area); the trajectory of the joint touch gesture is the track of this movement from area A to area B, shown as the dotted vertical line, with the arrow indicating the direction of movement. The electronic device compares the detected vertical-line trajectory with the preset straight-line trajectory and judges that they match.
FIG. 8 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 7 according to an embodiment of the present invention. The insertion location of the first endpoint 302a is the beginning or end of the text word closest to the centroid of starting contact area A in the example of FIG. 7; the insertion location of the second endpoint 302b is the beginning or end of the text word closest to the centroid of ending contact area B in the example of FIG. 7; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
FIG. 9 is an exemplary user interface in which the trajectory of the joint touch gesture is a diagonal line (e.g., "/") according to an embodiment of the present invention. For example: the joint touch position is first detected in area A (the starting contact area) and is then detected moving to area B (the ending contact area); the trajectory of the joint touch gesture is the track of this movement from area A to area B, shown as the dotted diagonal line, with the arrow indicating the direction of movement. The electronic device compares the detected diagonal trajectory with the preset straight-line trajectory and judges that they match.
FIG. 10 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 9 according to an embodiment of the present invention. The insertion location of the first endpoint 302a is the beginning or end of the text word closest to the centroid of starting contact area A in the example of FIG. 9; the insertion location of the second endpoint 302b is the beginning or end of the text word closest to the centroid of ending contact area B in the example of FIG. 9; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
FIG. 11 is an exemplary user interface in which the trajectory of the joint touch gesture is a diagonal line (e.g., "\") according to an embodiment of the present invention. For example: the joint touch position is first detected in area A (the starting contact area) and is then detected moving to area B (the ending contact area); the trajectory of the joint touch gesture is the track of this movement from area A to area B, shown as the dotted diagonal line, with the arrow indicating the direction of movement. The electronic device compares the detected diagonal trajectory with the preset straight-line trajectory and judges that they match.
FIG. 12 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 11 according to an embodiment of the present invention. The insertion location of the first endpoint 302a is the beginning or end of the text word closest to the centroid of starting contact area A in the example of FIG. 11; the insertion location of the second endpoint 302b is the beginning or end of the text word closest to the centroid of ending contact area B in the example of FIG. 11; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
Optionally, each of the above embodiments may further perform word processing functions on the text selection area.
Fig. 13 is an exemplary user interface for performing word processing functions on the text selection area provided by an embodiment of the present invention. The word processing functions may include copy, cut, paste, translate, and so on. Other word processing functions may be invoked by selecting "more"; these may include underlining the selected text, making it bold, or changing its font, font size, and font color. The arrangement and presentation of the word processing functions illustrated in fig. 13 can be adjusted as design requirements dictate.
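The pop-up menu can be modeled as a dispatch over the selected text. The sketch below is an assumption about one way to wire it, not the patent's implementation; the action names mirror the menu entries above and the behaviors are placeholders.

```kotlin
// Illustrative dispatch for the pop-up word-processing menu; names and behaviors are hypothetical.
fun applyWordProcessing(action: String, selected: String): String = when (action) {
    "copy"      -> selected.also { println("copied to clipboard: $it") }
    "cut"       -> "".also { println("cut to clipboard: $selected") }   // selection removed
    "translate" -> "<translated>$selected</translated>"                 // placeholder for a translator
    else        -> selected                                             // "more": underline, bold, font...
}
```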
The technical solution of the embodiments of the present invention discloses that, when a joint touch gesture acting on a touch-sensitive surface is detected, whether the user interface displayed by the display is a text application interface is identified, and if it is and the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture. The solution simplifies the operation steps of selecting text, thereby improving the user experience.
Fig. 14 is a simplified schematic diagram of an internal structure of an electronic device with a touch-sensitive display unit according to an embodiment of the present invention.
The functional blocks of the electronic device can be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the present invention. Those skilled in the art will appreciate that the functional blocks described in fig. 14 can be combined or separated into sub-functional blocks to implement the principles of the present invention as described above. Thus, the description herein may support any possible combination or separation or further definition of the functional modules described herein.
Based on the same inventive concept, and because the principle by which the electronic device and the apparatus solve the problem is similar to that of the method for selecting text in the embodiments of the present invention, the implementation of the electronic device and the apparatus may refer to the implementation of the method; repeated parts are not described again.
As shown in fig. 14, the electronic device 1400 includes: touch-sensitive display unit 130, acceleration sensor 151, memory 120, processor 190.
The touch-sensitive display unit 130 may be a display having a touch-sensitive surface, and the touch-sensitive display unit 130 includes a touch-sensitive surface and a display screen. The touch-sensitive display unit 130 is used for displaying a screen interface, and is also used for receiving touch actions acting on a touch-sensitive surface and transmitting touch information to the processor 190. The touch information may include one or more signals of touch point coordinates, grid capacitance values of the touch-sensitive display unit, and touch actions; the touch actions may include pressing, moving, and lifting actions.
The acceleration sensor 151 is configured to detect the acceleration signal in the Z-axis direction and to transmit the detected signal to the processor 190.
The memory 120 stores instructions.
The processor 190 is coupled to the touch-sensitive display unit 130, the acceleration sensor 151 and the memory 120.
The processor 190 invokes the instructions stored in the memory 120 to identify, when a joint touch gesture acting on the touch-sensitive surface is detected, whether the user interface displayed by the display is a text application interface; if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches the preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture. The text selection area is located between a first endpoint and a second endpoint; the first endpoint is located at a first position in the text application interface, and the second endpoint is located at a second position in the text application interface.
Optionally, if the user interface displayed by the display is not a text application interface and a first application function associated with the trajectory of the joint touch gesture exists, the first application function is executed.
Optionally, if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, the second application function is executed when a second application function associated with the trajectory of the joint touch gesture exists.
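The three branches above can be summarized in a short sketch; ui, gesture, associations, and trajectories_match are hypothetical names illustrating one possible reading of the description, not the patent's implementation:

```python
# Illustrative decision flow for a detected joint touch gesture.
# gesture.trajectory is treated here as a recognised shape label.

def trajectories_match(a, b) -> bool:
    # Placeholder comparison; a real device would match trajectory shapes.
    return a == b

def handle_joint_touch_gesture(ui, gesture, preset_trajectory, associations):
    if ui.is_text_application:
        if trajectories_match(gesture.trajectory, preset_trajectory):
            # Display the text selection area between the two endpoints.
            ui.display_text_selection(gesture)
        elif gesture.trajectory in associations:
            associations[gesture.trajectory]()  # second application function
    elif gesture.trajectory in associations:
        associations[gesture.trajectory]()      # first application function
```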
When the touch-sensitive display unit receives a touch action acting on the touch-sensitive surface, the touch information is transmitted to the processor; the touch information may include one or more of touch point coordinates, grid capacitance values of the touch sensitive display unit, and touch actions. The touch actions may include pressing, moving, and lifting actions.
In some embodiments, whether the touch action is a joint touch action may be determined based on the grid capacitance information and the Z-axis direction acceleration signal generated by the touch action; the touch-sensitive surface grid capacitance information includes the grid capacitance values and the number of grid cells with non-zero capacitance values.
When the grid capacitance values satisfy a first preset capacitance value range, the number of grid cells with non-zero capacitance values is smaller than a preset value, and the Z-axis direction acceleration signal is within a first preset acceleration range, the touch action can be judged to be a joint touch action. When the grid capacitance values satisfy a second preset capacitance value range, the number of grid cells with non-zero capacitance values is greater than or equal to the preset value, and the Z-axis direction acceleration signal is within a second preset acceleration range, the touch action can be judged to be a finger touch action.
For example, when the maximum grid capacitance value of the touch-sensitive display unit satisfies the first preset capacitance value range (for example, less than or equal to 0.42 pF), the number of grid cells with non-zero capacitance values is less than 7, and the Z-axis direction acceleration signal is within the first preset acceleration range within a preset time (for example, greater than 3g within 5 ms, where g is the gravitational acceleration), the touch action may be determined to be a joint touch action. When the maximum grid capacitance value satisfies the second preset capacitance value range (for example, greater than 0.42 pF and less than or equal to 0.46 pF), the number of grid cells with non-zero capacitance values is greater than or equal to 7, and the Z-axis direction acceleration signal is within the second preset acceleration range within the preset time (for example, less than 2g within 5 ms), the touch action may be determined to be a finger touch action. It should be understood that a joint touch action in the embodiments of the present invention is not necessarily triggered by a finger joint; another object striking the touch-sensitive display unit 130 quickly may also be regarded as producing a joint touch action, as long as the determination conditions of the joint touch action are satisfied.
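The example thresholds above can be collected into a small classification rule. This is only a sketch under the stated example values (0.42 pF, 0.46 pF, 7 grid cells, 3g, 2g), which would differ between devices; classify_touch is a hypothetical name:

```python
# Sketch of the joint-vs-finger classification rule from the example above.
G = 9.8  # gravitational acceleration

def classify_touch(max_capacitance_pf: float, nonzero_cells: int,
                   peak_z_accel: float) -> str:
    if max_capacitance_pf <= 0.42 and nonzero_cells < 7 and peak_z_accel > 3 * G:
        return "joint"   # small, stiff contact that strikes the screen fast
    if 0.42 < max_capacitance_pf <= 0.46 and nonzero_cells >= 7 and peak_z_accel < 2 * G:
        return "finger"  # larger, softer contact with a gentler landing
    return "unknown"     # neither rule fires; leave unclassified

print(classify_touch(0.40, 5, 3.5 * G))  # -> "joint"
```

The design intent is that a knuckle presents a smaller, harder contact patch than a fingertip, so all three signals move in the same direction and the two rules rarely overlap.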
A touch gesture may consist of touch actions. For example, a tap gesture consists of two touch actions, a press and a lift; a slide gesture consists of three touch actions, a press, a move, and a lift. After the touch actions are judged to be joint touch actions, a joint touch gesture composed of joint touch actions can be detected. For example, a joint tap gesture consists of two joint touch actions, a press and a lift; a joint slide gesture consists of three joint touch actions, a press, a move, and a lift.
In some embodiments, a pressing joint touch action is detected in area A of the touch-sensitive display unit, the press moves across the touch-sensitive display unit to area B, and a lifting joint touch action is detected in area B. The joint touch event in which the joint is pressed down in area A and lifted after moving to area B is a joint touch gesture. The position of a touch action (for example, in area A or area B) can be determined from the contact coordinate information. Area A is the starting contact area between the joint touch gesture and the touch-sensitive display unit, and area B is the ending contact area. The movement track from area A to area B is the trajectory of the joint touch gesture. The electronic device compares the detected trajectory of the joint touch gesture with a preset trajectory and judges whether they match. The preset trajectory information may be preset in the electronic device at the factory or preset by the user, and may be stored in a memory (for example, the memory 120 in Fig. 1).
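A rough sketch of assembling the trajectory from the press/move/lift sequence and comparing it with the preset trajectory, reusing the TouchAction type from the earlier sketch; the patent does not prescribe a particular matching algorithm, so matches_preset below is only a naive placeholder:

```python
# Hypothetical trajectory assembly and matching; not the patent's algorithm.

def assemble_trajectory(touch_infos):
    """Collect the touch points of one press ... move ... lift sequence,
    i.e. the movement track from area A to area B."""
    trajectory = []
    for info in touch_infos:
        trajectory.append(info.point)
        if info.action is TouchAction.LIFT:
            break
    return trajectory

def matches_preset(trajectory, preset, tolerance=20.0):
    """Naive point-wise comparison against the stored preset trajectory;
    a practical matcher would first normalise position and scale."""
    if len(trajectory) != len(preset):
        return False
    return all(abs(px - qx) <= tolerance and abs(py - qy) <= tolerance
               for (px, py), (qx, qy) in zip(trajectory, preset))
```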
In some embodiments, whether an application function associated with the trajectory of the joint touch gesture exists is determined by looking up the associations, stored in the memory 120, between trajectories of joint touch gestures and application functions. The movement track between the press and the lift is the trajectory of the joint touch gesture. The user may preset associations between the trajectories of various joint touch gestures and application functions, and store these associations in the memory 120.
For example, the user may associate the trajectory "C" of a joint touch gesture with the camera application function in advance and save this association in the memory 120. When a joint touch gesture with the trajectory "C" acting on the touch-sensitive surface is detected, it can be determined, by looking up the associations between trajectories and application functions stored in the memory 120, that the trajectory "C" is associated with the camera application function.
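A sketch of such an association table, with hypothetical names (ASSOCIATIONS, launch_camera) mirroring the "C"-to-camera example:

```python
# Hypothetical trajectory-to-application-function association table.

def launch_camera():
    print("launching the camera application")

ASSOCIATIONS = {"C": launch_camera}

def on_recognised_trajectory(label: str) -> bool:
    func = ASSOCIATIONS.get(label)
    if func is None:
        return False  # no application function associated with this trajectory
    func()
    return True

on_recognised_trajectory("C")  # -> launches the camera application
```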
Displaying the text selection area on the text application interface specifically means that, after the trajectory of the joint touch gesture is judged to match the preset trajectory, the text selection area is displayed on the text application interface. The text selection area is located between a first endpoint and a second endpoint; the first endpoint is located at a first position in the text application interface, and the second endpoint at a second position. For example, after the trajectory of the joint touch gesture is judged to match the preset trajectory, the first endpoint is inserted at the starting contact area A between the joint touch gesture and the touch-sensitive display unit, and the second endpoint is inserted at the ending contact area B. The first position may be the beginning or end of the text character or word, in the text application interface displayed by the display, closest to the centroid of area A; the second position may be the beginning or end of the text character or word closest to the centroid of area B. The text region located between the first endpoint and the second endpoint in the text application interface is the text selection area.
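A sketch of this endpoint insertion rule; it assumes the layout engine can map a contact centroid to a character offset (that hit-testing step is omitted here), and centroid and nearest_word_boundary are hypothetical names:

```python
# Hypothetical endpoint snapping: contact area centroid -> word boundary.
import re

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def nearest_word_boundary(text: str, offset: int) -> int:
    """Snap a character offset to the beginning or end of the closest word."""
    boundaries = {0, len(text)}
    for match in re.finditer(r"\S+", text):
        boundaries.update((match.start(), match.end()))
    return min(boundaries, key=lambda b: abs(b - offset))

text = "select this text with a knuckle"
print(nearest_word_boundary(text, 10))  # -> 11, the end of "this"
```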
Fig. 15 is a functional structure diagram of an apparatus according to an embodiment of the present invention.
In some embodiments, the apparatus comprises a detection unit 1501, an identification unit 1502, a determination unit 1503, and a text selection unit 1504.
The detection unit 1501 is configured to detect a joint touch gesture acting on the touch-sensitive surface.
The identification unit 1502 is configured to identify whether the user interface displayed by the display is a text application interface.
The determination unit 1503 is configured to determine whether the trajectory of the joint touch gesture matches a preset trajectory.
The text selection unit 1504 is configured to display a text selection area on the text application interface in response to the joint touch gesture when the joint touch gesture acting on the touch-sensitive surface is detected, the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches the preset trajectory. The text selection area is located between the first endpoint and the second endpoint; the first endpoint is located at a first position in the text application interface, and the second endpoint is located at a second position in the text application interface.
Optionally, the apparatus further includes a first determination unit 1506 and a first execution unit 1507.
The first determination unit 1506 is configured to determine, if the user interface displayed by the display is not a text application interface, whether an application function associated with the trajectory of the joint touch gesture exists.
The first execution unit 1507 is configured to execute the application function if the user interface displayed by the display is not a text application interface and an application function associated with the trajectory of the joint touch gesture exists.
Optionally, the apparatus further includes a second determination unit 1508 and a second execution unit 1509.
The second determination unit 1508 is configured to determine, if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, whether an application function associated with the trajectory of the joint touch gesture exists.
The second execution unit 1509 is configured to execute the application function if the user interface displayed by the display is a text application interface, the trajectory of the joint touch gesture does not match the preset trajectory, and an application function associated with the trajectory exists.
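For illustration, the units of Fig. 15 could be wired together as follows; the interfaces are the same assumed ones as in the earlier sketches and are not the patent's implementation:

```python
# Illustrative wiring of the functional units of Fig. 15 (1501-1509).

class SelectTextApparatus:
    def __init__(self, preset_trajectory, associations):
        self.preset = preset_trajectory
        self.associations = associations        # trajectory -> function

    def on_joint_gesture(self, ui, gesture):    # fed by detection unit 1501
        if not ui.is_text_application:          # identification unit 1502
            func = self.associations.get(gesture.trajectory)
            if func is not None:                # first units 1506 / 1507
                func()
        elif gesture.trajectory == self.preset:  # determination unit 1503
            ui.display_text_selection(gesture)   # text selection unit 1504
        else:
            func = self.associations.get(gesture.trajectory)
            if func is not None:                # second units 1508 / 1509
                func()
```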
When the touch-sensitive display unit of the apparatus receives a touch action acting on the touch-sensitive surface, the processing is the same as described above for the electronic device 1400: the touch information is passed to the processor; the joint touch action is distinguished from a finger touch action based on the grid capacitance information and the Z-axis direction acceleration signal; the joint touch gesture and its trajectory are detected from the press, move, and lift actions between the starting contact area A and the ending contact area B; the trajectory is compared with the preset trajectory, and application functions associated with the trajectory are looked up in the memory 120; and, when the trajectory matches the preset trajectory, the first and second endpoints are inserted at the text character or word boundaries closest to the centroids of areas A and B, with the text selection area lying between them. The repeated details are not described again here.
In the embodiments provided in this application, those skilled in the art can understand that the above embodiments are only illustrative, and that all or part of their steps may be implemented by hardware, or by program instructions together with related hardware. When implemented by program instructions, the program may be stored in a non-volatile (non-transitory) computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product that is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present invention. The storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and that such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (8)

1. A method of selecting text for use on a portable electronic device having a display with a touch-sensitive surface, the method comprising:
detecting a joint touch gesture acting on the touch-sensitive surface, the joint touch gesture consisting of joint touch actions;
determining that a touch action acting on the touch-sensitive surface is the joint touch action when the touch-sensitive surface grid capacitance value generated by the touch action satisfies a first preset capacitance value range, the number of grid cells with non-zero capacitance values is smaller than a preset value, and the Z-axis direction acceleration signal is within a first preset acceleration range, wherein the touch-sensitive surface grid capacitance value generated by the joint touch action is smaller than that generated by a finger touch action, the number of grid cells with non-zero capacitance values generated by the joint touch action is smaller than that generated by the finger touch action, and the Z-axis direction acceleration generated by the joint touch action is greater than that generated by the finger touch action;
identifying whether a user interface displayed by the display is a text application interface;
if the user interface displayed by the display is a text application interface, determining whether the trajectory of the joint touch gesture matches a preset trajectory;
when it is determined that the trajectory of the joint touch gesture matches the preset trajectory, displaying a text selection area on the text application interface in response to the joint touch gesture, wherein the text selection area is located between a first endpoint and a second endpoint;
when it is determined that the trajectory of the joint touch gesture does not match the preset trajectory, determining whether a second application function associated with the trajectory of the joint touch gesture exists;
executing the second application function when it is determined that the second application function associated with the trajectory of the joint touch gesture exists;
wherein the first endpoint is located at a first position in the text application interface, the first position being the beginning or end of a text character or word in the starting contact region of the joint touch gesture in the text application interface;
and the second endpoint is located at a second position in the text application interface, the second position being the beginning or end of a text character or word in the ending contact region of the joint touch gesture in the text application interface.
2. The method of claim 1, further comprising, if the user interface displayed by the display is not a text application interface and there is a first application function associated with the trajectory of the joint touch gesture, executing the first application function.
3. A portable electronic device, characterized in that the portable electronic device comprises:
a display having a touch-sensitive surface;
an acceleration sensor configured to acquire the acceleration in the Z-axis direction;
a memory to store instructions;
a processor that invokes instructions stored in the memory to implement:
detecting a joint touch gesture acting on the touch-sensitive surface, the joint touch gesture consisting of joint touch actions;
determining that a touch action acting on the touch-sensitive surface is the joint touch action when the touch-sensitive surface grid capacitance value generated by the touch action satisfies a first preset capacitance value range, the number of grid cells with non-zero capacitance values is smaller than a preset value, and the Z-axis direction acceleration signal is within a first preset acceleration range, wherein the touch-sensitive surface grid capacitance value generated by the joint touch action is smaller than that generated by a finger touch action, the number of grid cells with non-zero capacitance values generated by the joint touch action is smaller than that generated by the finger touch action, and the Z-axis direction acceleration generated by the joint touch action is greater than that generated by the finger touch action;
identifying whether a user interface displayed by the display is a text application interface;
if the user interface displayed by the display is a text application interface, determining whether the trajectory of the joint touch gesture matches a preset trajectory;
when it is determined that the trajectory of the joint touch gesture matches the preset trajectory, displaying a text selection area on the text application interface in response to the joint touch gesture, wherein the text selection area is located between a first endpoint and a second endpoint;
when it is determined that the trajectory of the joint touch gesture does not match the preset trajectory, determining whether a second application function associated with the trajectory of the joint touch gesture exists;
executing the second application function when it is determined that the second application function associated with the trajectory of the joint touch gesture exists; wherein the first endpoint is located at a first position in the text application interface, the first position being the beginning or end of a text character or word in the starting contact region of the joint touch gesture in the text application interface;
and the second endpoint is located at a second position in the text application interface, the second position being the beginning or end of a text character or word in the ending contact region of the joint touch gesture in the text application interface.
4. The portable electronic device of claim 3, wherein the instructions are further to: executing a first application function associated with the trajectory of the joint touch gesture if the user interface displayed by the display is not a text application interface and the first application function exists.
5. An apparatus for selecting text, the apparatus comprising a detection unit, an identification unit, a determination unit, a text selection unit, a second determination unit, and a second execution unit; wherein
the detection unit is configured to detect a joint touch gesture acting on the touch-sensitive surface, the joint touch gesture consisting of joint touch actions;
a touch action acting on the touch-sensitive surface is determined to be the joint touch action when the touch-sensitive surface grid capacitance value generated by the touch action satisfies a first preset capacitance value range, the number of grid cells with non-zero capacitance values is smaller than a preset value, and the Z-axis direction acceleration signal is within a first preset acceleration range; the touch-sensitive surface grid capacitance value generated by the joint touch action is smaller than that generated by a finger touch action, the number of grid cells with non-zero capacitance values generated by the joint touch action is smaller than that generated by the finger touch action, and the Z-axis direction acceleration generated by the joint touch action is greater than that generated by the finger touch action;
the identification unit is configured to identify whether the user interface displayed by the display is a text application interface;
the determination unit is configured to determine whether the trajectory of the joint touch gesture matches a preset trajectory;
the text selection unit is configured to display a text selection area on the text application interface in response to the joint touch gesture when the joint touch gesture acting on the touch-sensitive surface is detected, the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches the preset trajectory, wherein the text selection area is located between a first endpoint and a second endpoint;
the second determination unit is configured to determine, when the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, whether an application function associated with the trajectory of the joint touch gesture exists;
the second execution unit is configured to execute a second application function associated with the trajectory of the joint touch gesture when the second application function exists;
the first endpoint is located at a first position in the text application interface, the first position being the beginning or end of a text character or word in the starting contact region of the joint touch gesture in the text application interface;
and the second endpoint is located at a second position in the text application interface, the second position being the beginning or end of a text character or word in the ending contact region of the joint touch gesture in the text application interface.
6. The apparatus of claim 5, wherein the apparatus further comprises a first determination unit and a first execution unit;
the first determination unit is configured to determine, when the user interface displayed by the display is not a text application interface, whether an application function associated with the trajectory of the joint touch gesture exists;
the first execution unit is configured to execute a first application function associated with the trajectory of the joint touch gesture when the first application function exists.
7. A user interface on a portable electronic device, the portable electronic device comprising a display, a memory, and a processor for executing instructions stored in the memory, wherein the display has a touch-sensitive surface, the user interface comprising:
an interface for displaying a text application;
when a joint touch gesture acting on the touch-sensitive surface is detected, if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, an interface displaying a text selection area on the text application interface in response to the joint touch gesture, wherein the text selection area is located between a first endpoint and a second endpoint;
if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, a second application function associated with the trajectory of the joint touch gesture is executed when the second application function exists;
wherein the joint touch gesture consists of joint touch actions;
a touch action acting on the touch-sensitive surface is determined to be the joint touch action when the touch-sensitive surface grid capacitance value generated by the touch action satisfies a first preset capacitance value range, the number of grid cells with non-zero capacitance values is smaller than a preset value, and the Z-axis direction acceleration signal is within a first preset acceleration range; the touch-sensitive surface grid capacitance value generated by the joint touch action is smaller than that generated by a finger touch action, the number of grid cells with non-zero capacitance values generated by the joint touch action is smaller than that generated by the finger touch action, and the Z-axis direction acceleration generated by the joint touch action is greater than that generated by the finger touch action;
the first endpoint is located at a first position in the text application interface, the first position being the beginning or end of a text character or word in the starting contact region of the joint touch gesture in the text application interface;
and the second endpoint is located at a second position in the text application interface, the second position being the beginning or end of a text character or word in the ending contact region of the joint touch gesture in the text application interface.
8. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a portable electronic device comprising a display with a touch-sensitive surface, cause the portable electronic device to perform the following:
detecting a joint touch gesture acting on the touch-sensitive surface, the joint touch gesture consisting of joint touch actions;
determining that a touch action acting on the touch-sensitive surface is the joint touch action when the touch-sensitive surface grid capacitance value generated by the touch action satisfies a first preset capacitance value range, the number of grid cells with non-zero capacitance values is smaller than a preset value, and the Z-axis direction acceleration signal is within a first preset acceleration range, wherein the touch-sensitive surface grid capacitance value generated by the joint touch action is smaller than that generated by a finger touch action, the number of grid cells with non-zero capacitance values generated by the joint touch action is smaller than that generated by the finger touch action, and the Z-axis direction acceleration generated by the joint touch action is greater than that generated by the finger touch action;
identifying whether a user interface displayed by the display is a text application interface;
if the user interface displayed by the display is a text application interface, determining whether the trajectory of the joint touch gesture matches a preset trajectory;
when it is determined that the trajectory of the joint touch gesture matches the preset trajectory, displaying a text selection area on the text application interface in response to the joint touch gesture, wherein the text selection area is located between a first endpoint and a second endpoint;
when it is determined that the trajectory of the joint touch gesture does not match the preset trajectory, determining whether a second application function associated with the trajectory of the joint touch gesture exists;
executing the second application function when it is determined that the second application function associated with the trajectory of the joint touch gesture exists; wherein the first endpoint is located at a first position in the text application interface, the first position being the beginning or end of a text character or word in the starting contact region of the joint touch gesture in the text application interface;
and the second endpoint is located at a second position in the text application interface, the second position being the beginning or end of a text character or word in the ending contact region of the joint touch gesture in the text application interface.
CN201580030746.XA 2015-08-31 2015-08-31 Method for selecting text Active CN107924261B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2015/088617 WO2017035739A1 (en) 2015-08-31 2015-08-31 Method for selecting text

Publications (2)

Publication Number Publication Date
CN107924261A CN107924261A (en) 2018-04-17
CN107924261B true CN107924261B (en) 2020-10-23

Family

ID=58186454

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201580030746.XA Active CN107924261B (en) 2015-08-31 2015-08-31 Method for selecting text

Country Status (2)

Country Link
CN (1) CN107924261B (en)
WO (1) WO2017035739A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101859177A (en) * 2010-06-09 2010-10-13 天津中科津龙集成电路技术有限公司 Method and device for calling and operating application program on intelligent electronic device
CN103365570A (en) * 2012-03-26 2013-10-23 华为技术有限公司 Content selecting method and content selecting device
CN103608760A (en) * 2011-06-03 2014-02-26 谷歌公司 Gestures for selecting text
CN104049728A (en) * 2013-03-15 2014-09-17 Lg电子株式会社 Electronic device and control method thereof
CN104360808A (en) * 2014-12-04 2015-02-18 李方 Method and device for editing documents by using symbolic gesture instructions
CN104769533A (en) * 2013-05-06 2015-07-08 齐科斯欧公司 Using finger touch types to interact with electronic devices

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8607167B2 (en) * 2007-01-07 2013-12-10 Apple Inc. Portable multifunction device, method, and graphical user interface for providing maps and directions
US8650507B2 (en) * 2008-03-04 2014-02-11 Apple Inc. Selecting of text using gestures
US8786556B2 (en) * 2009-03-12 2014-07-22 Nokia Corporation Method and apparatus for selecting text information


Also Published As

Publication number Publication date
CN107924261A (en) 2018-04-17
WO2017035739A1 (en) 2017-03-09
WO2017035739A9 (en) 2017-08-17


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant