WO2017035739A9 - A method for selecting text - Google Patents
A method for selecting text
- Publication number
- WO2017035739A9 (PCT/CN2015/088617)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- text
- touch
- joint
- trajectory
- joint touch
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
Definitions
- Embodiments of the present invention relate to a method of selecting text, and more particularly to a method of selecting text on a display having a touch-sensitive surface using a joint touch gesture.
- The target text is selected.
- In existing approaches, the operation of selecting text is relatively complicated. For example, the user's finger touches the text area to be selected on the touch screen; after the finger remains in contact for a predetermined time interval, two marker poles pop up to the left and right of the text; the user then touches and drags a marker pole to adjust the text selection area. With operations like this, the user needs many interaction steps with the touch screen, and the user experience is worth improving.
- the embodiment of the present invention provides a technical solution for selecting text.
- the technical solution includes:
- an embodiment of the present invention provides a method for selecting text, which is applied to a portable electronic device, the electronic device including a display having a touch-sensitive surface, the method comprising:
- a joint touch gesture acting on the touch-sensitive surface is detected
- if the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches the preset trajectory, displaying a text selection area on the text application interface in response to the joint touch gesture, the text selection area being located between the first endpoint and the second endpoint;
- the first endpoint is located at a first location in the text application interface
- the second endpoint is located at a second location in the text application interface.
- the method further includes: if the user interface displayed by the display is not a text application interface, and there is a first application function associated with the trajectory of the joint touch gesture, executing the first application function.
- the method further includes: if the user interface displayed by the display is a text application interface, but the trajectory of the joint touch gesture does not match the preset trajectory, executing a second application function when there is a second application function associated with the trajectory of the joint touch gesture.
- the joint touch gesture is composed of a joint touch action;
- if the grid capacitance values generated on the touch-sensitive surface by a touch action fall within the first preset capacitance value range, the number of grid cells with non-zero capacitance values is less than the preset value, and the Z-axis direction acceleration signal is within the first preset acceleration range, then:
- the touch action is the joint touch action
- the gesture composed of the joint touch action is the joint touch gesture.
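The joint-touch criterion described above (capacitance values within a first preset range, fewer non-zero grid cells than a preset limit, and Z-axis acceleration within a first preset range) can be sketched as follows. The threshold names and values are illustrative assumptions, not values disclosed in the patent:

```python
# Illustrative sketch of the joint-touch test; thresholds are assumed, not from the patent.
CAP_RANGE = (0.3, 0.8)        # first preset capacitance value range (assumed units)
MAX_NONZERO_CELLS = 12        # preset limit on grid cells with non-zero capacitance
ACCEL_RANGE = (3.0, 10.0)     # first preset Z-axis acceleration range (assumed, m/s^2)

def is_joint_touch(grid, z_accel):
    """Classify a touch action from the capacitance grid and Z-axis acceleration.

    grid    -- 2D list of per-cell capacitance values from the touch-sensitive surface
    z_accel -- acceleration signal in the Z-axis direction
    """
    nonzero = [c for row in grid for c in row if c != 0]
    if not nonzero:
        return False
    # All three conditions of the claim must hold simultaneously.
    in_cap_range = all(CAP_RANGE[0] <= c <= CAP_RANGE[1] for c in nonzero)
    few_cells = len(nonzero) < MAX_NONZERO_CELLS
    in_accel_range = ACCEL_RANGE[0] <= z_accel <= ACCEL_RANGE[1]
    return in_cap_range and few_cells and in_accel_range
```

The intuition is that a knuckle contact produces a small, sharp contact patch (few non-zero cells, capacitance in a characteristic range) together with a distinct tap acceleration along the Z axis, unlike a fingertip press.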
- an embodiment of the present invention provides a portable electronic device, including:
- An acceleration sensor for acquiring acceleration in the Z-axis direction
- a memory for storing instructions
- a processor that invokes instructions stored in the memory to:
- a joint touch gesture acting on the touch-sensitive surface is detected
- if the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches the preset trajectory, display a text selection area on the text application interface in response to the joint touch gesture, the text selection area being located between the first endpoint and the second endpoint;
- the first endpoint is located at a first location in the text application interface
- the second endpoint is located at a second location in the text application interface.
- the instructions are further configured to: if the user interface displayed by the display is not a text application interface, and there is a first application function associated with the trajectory of the joint touch gesture, execute the first application function.
- the instructions are further configured to: if the user interface displayed by the display is a text application interface, but the trajectory of the joint touch gesture does not match the preset trajectory, execute a second application function when there is a second application function associated with the trajectory of the joint touch gesture.
- the joint touch gesture is composed of a joint touch motion;
- if the grid capacitance values generated on the touch-sensitive surface by a touch action fall within the first preset capacitance value range, the number of grid cells with non-zero capacitance values is less than the preset value, and the Z-axis direction acceleration signal is within the first preset acceleration range, then:
- the touch action is the joint touch action
- the gesture composed of the joint touch action is the joint touch gesture.
- an embodiment of the present invention provides a device, where the device includes: a detecting unit, an identifying unit, a determining unit, and a selecting text unit;
- the detecting unit is configured to detect a joint touch gesture acting on the touch-sensitive surface
- the identification unit is configured to identify whether the user interface displayed by the display is a text application interface
- the determining unit is configured to determine whether a track of the joint touch gesture matches a preset track
- the text selection unit is configured to: when a joint touch gesture acting on the touch-sensitive surface is detected, if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches the preset trajectory, display a text selection area on the text application interface in response to the joint touch gesture, the text selection area being located between the first endpoint and the second endpoint;
- the first endpoint is located at a first location in the text application interface
- the second endpoint is located at a second location in the text application interface.
- the device further includes: a first determining unit, and a first executing unit:
- the first determining unit is configured to determine, when the user interface displayed by the display is not a text application interface, whether an application function associated with the track of the joint touch gesture exists;
- the first execution unit is configured to perform the first application function when there is a first application function associated with the trajectory of the joint touch gesture.
- the device further includes: a second determining unit, and a second executing unit:
- the second determining unit is configured to: when the user interface displayed by the display is a text application interface, but the trajectory of the joint touch gesture does not match the preset trajectory, determine whether there is an application function associated with the trajectory of the joint touch gesture;
- the second execution unit is configured to execute the second application function when there is a second application function associated with the trajectory of the joint touch gesture.
- the joint touch gesture is composed of a joint touch action;
- if the grid capacitance values generated on the touch-sensitive surface by a touch action fall within the first preset capacitance value range, the number of grid cells with non-zero capacitance values is less than the preset value, and the Z-axis direction acceleration signal is within the first preset acceleration range, then:
- the touch action is the joint touch action
- the gesture composed of the joint touch action is the joint touch gesture.
- an embodiment of the present invention provides a user interface on a portable electronic device, the portable electronic device including a display, a memory, and a processor for executing instructions stored in the memory, wherein the display has a touch-sensitive surface, and the user interface includes:
- the first endpoint is located at a first location in the text application interface
- the second endpoint is located at a second location in the text application interface.
- an embodiment of the present invention provides a non-transitory computer readable storage medium storing one or more programs, the one or more programs including instructions
- which, when executed by a portable electronic device having a display with a touch-sensitive surface, cause the portable electronic device to perform the following:
- a joint touch gesture acting on the touch-sensitive surface is detected
- if the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches the preset trajectory, displaying a text selection area on the text application interface in response to the joint touch gesture, the text selection area being located between the first endpoint and the second endpoint;
- the first endpoint is located at a first location in the text application interface
- the second endpoint is located at a second location in the text application interface.
- the technical solution of the embodiment of the present invention discloses: when a joint touch gesture acting on the touch-sensitive surface is detected, identifying whether the user interface displayed by the display is a text application interface; if the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches the preset trajectory, displaying the text selection area on the text application interface in response to the joint touch gesture.
- the solution of the embodiment of the invention simplifies the operation steps of selecting text, thereby improving the user experience.
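The disclosed control flow — detect a joint touch gesture, check whether the interface is a text application interface and whether the trajectory matches the preset trajectory, and otherwise fall back to an application function associated with the trajectory — might be summarized as follows. All names and return values are hypothetical illustrations, not the patent's implementation:

```python
def on_joint_touch_gesture(ui_is_text_app, trajectory, preset_trajectory,
                           associated_function=None):
    """Dispatch a detected joint touch gesture (hypothetical control flow).

    Returns a string naming the action taken, for illustration only.
    """
    if ui_is_text_app and trajectory == preset_trajectory:
        # Display the text selection area between the first and second endpoints.
        return "show_text_selection_area"
    if associated_function is not None:
        # Either the UI is not a text interface, or the trajectory does not
        # match the preset; fall back to the associated application function.
        return associated_function()
    return "ignore"
```

The point of the dispatch is that the same gesture family serves double duty: it selects text inside text interfaces and triggers ordinary gesture-bound functions everywhere else.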
- FIG. 1 is a schematic diagram of an internal structure of a portable electronic device 100 according to an embodiment of the present invention
- FIG. 2 is a schematic diagram of an external structure of a portable electronic device 100 according to an embodiment of the present invention.
- FIG. 3 is a schematic diagram of displaying a text selection area according to an embodiment of the present invention.
- FIG. 4 is a flowchart of a method for selecting text according to an embodiment of the present invention.
- FIG. 5 is an exemplary user interface of a trajectory of a joint touch gesture according to an embodiment of the present invention as a horizontal line (i.e., "—");
- FIG. 6 is an exemplary user interface of a text selection area resulting from the trajectory of the joint touch gesture illustrated in FIG. 5 in accordance with an embodiment of the present invention
- FIG. 7 is an exemplary user interface of a trajectory of a joint touch gesture according to an embodiment of the present invention as a vertical line (i.e., "|");
- FIG. 8 is an exemplary user interface of a text selection area resulting from the trajectory of the joint touch gesture illustrated in FIG. 7 in accordance with an embodiment of the present invention
- FIG. 9 is an exemplary user interface of a trajectory of a joint touch gesture according to an embodiment of the present invention as a diagonal line (i.e., "/");
- FIG. 10 is an exemplary user interface of a text selection area resulting from the trajectory of the joint touch gesture illustrated in FIG. 9 in accordance with an embodiment of the present invention
- FIG. 11 is an exemplary user interface of a trajectory of a joint touch gesture according to an embodiment of the present invention as a diagonal line (i.e., "\");
- FIG. 12 is an exemplary user interface of a text selection area resulting from the trajectory of the joint touch gesture illustrated in FIG. 11 in accordance with an embodiment of the present invention
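The four trajectory shapes illustrated in FIGS. 5-12 could be distinguished from the gesture's start and end coordinates roughly as follows. The tolerance parameter and function name are assumptions for illustration only:

```python
def classify_trajectory(x0, y0, x1, y1, slope_tol=0.25):
    """Classify a joint-gesture trajectory into the four line shapes of FIGS. 5-12.

    Coordinates follow the usual screen convention (y grows downward).
    slope_tol is an assumed tolerance, not a value from the patent.
    """
    dx, dy = x1 - x0, y1 - y0
    if abs(dy) <= slope_tol * abs(dx):
        return "horizontal"          # "—": FIG. 5
    if abs(dx) <= slope_tol * abs(dy):
        return "vertical"            # "|": FIG. 7
    # "\" when moving right-and-down (or left-and-up), "/" otherwise.
    return "diagonal_down" if (dx > 0) == (dy > 0) else "diagonal_up"
```

Each recognized shape would then map to its own preset selection behavior, e.g. a horizontal stroke selecting along a line of text and a diagonal stroke selecting a block between its two endpoints.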
- FIG. 13 is an exemplary user interface for performing a word processing function on a text selection area according to an embodiment of the present invention
- FIG. 14 is a simplified schematic diagram of an internal structure of an electronic device with a touch-sensitive display unit according to an embodiment of the present invention.
- FIG. 15 is a schematic diagram showing the functional structure of a device according to an embodiment of the present invention.
- the embodiment of the present invention is exemplified by the portable electronic device 100 including a touch-sensitive display unit. Those skilled in the art can understand that the embodiments of the present invention are equally applicable to other devices, such as handheld devices, in-vehicle devices, wearable devices, computing devices, and various forms of User Equipment (UE), Mobile Stations (MS), terminals, terminal equipment, and the like.
- the electronic device 100 can support a variety of applications, such as text applications (email, blog, web browsing, and the like). The touch-sensitive display unit of the electronic device 100 can intuitively present the user interface of an application, and the various applications are operated through the touch-sensitive display unit of the electronic device 100.
- FIG. 1 is a schematic diagram of an internal structure of a portable electronic device 100 according to an embodiment of the present invention.
- the electronic device 100 may include a touch sensitive display unit 130, an acceleration sensor 151, a proximity light sensor 152, an ambient light sensor 153, a memory 120, a processor 190, a radio frequency unit 110, an audio circuit 160, a speaker 161, a microphone 162, and WiFi ( Wireless fidelity module 170, Bluetooth module 180, power supply 193, external interface 197, and the like.
- FIG. 1 is merely an example of a portable electronic device and does not constitute a limitation on portable electronic devices, which may include more or fewer components than illustrated, combine certain components, or use different components.
- the touch-sensitive display unit 130 is sometimes referred to as a "touch screen" for convenience, and may also be called a touch-sensitive display system or a display having a touch-sensitive surface.
- the display having a touch-sensitive surface includes a touch-sensitive surface and a display screen; it can display the screen interface and can also receive touch actions.
- the touch sensitive display unit 130 provides an input interface and an output interface between the device and the user.
- the touch-sensitive display unit 130 can collect the user's touch operations on or near it, such as operations performed by the user on or near the touch-sensitive display unit using any suitable object such as a finger 202, a joint, or a stylus.
- the touch-sensitive display unit can detect a touch action acting on it, the grid capacitance values of the touch-sensitive display unit, and the contact coordinates, send the touch action, grid capacitance value, and contact coordinate information to the processor 190, and receive and execute commands sent by the processor 190.
- the touch sensitive display unit 130 displays a visual output.
- Visual output can include graphics, text, icons, video, and any combination thereof (collectively referred to as "graphics"). In some embodiments, some or all of the visual output can correspond to user interface objects.
- Touch sensitive display unit 130 may use LCD (Liquid Crystal Display) technology, LPD (Light Emitting Polymer Display) technology, or LED (Light Emitting Diode) technology, although other display technologies may be used in other embodiments.
- Touch-sensitive display unit 130 may detect contact and any movement or interruption thereof using any of a variety of touch-sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch-sensitive display unit 130. In an exemplary embodiment, a projected mutual-capacitance sensing technique is used.
- the user can contact the touch sensitive display unit 130 using any suitable object or add-on such as a stylus, finger, joint, or the like.
- the user interface is designed to work primarily based on joint contact and gestures.
- the device translates the joint-based coarse input into an accurate pointer/cursor position or command to perform the action desired by the user.
- in addition to the touch-sensitive display unit, device 100 can include a touchpad (not shown) for activating or deactivating particular functions.
- the touchpad is a touch sensitive area of the device that is different from the touch sensitive display unit in that it does not display a visual output.
- the touchpad can be a touch-sensitive surface that is separate from the touch-sensitive display unit 130, or an extension of the touch-sensitive surface formed by the touch-sensitive display unit.
- the acceleration sensor 151 can detect the magnitude of acceleration in each direction (typically three axes). When the terminal is stationary, it can also detect the magnitude and direction of gravity, and it can be used for functions that identify the posture of the mobile phone (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and vibration-recognition-related functions (such as step counting and tapping). In the embodiment of the present invention, the acceleration sensor 151 is configured to acquire the gravitational acceleration in the Z-axis direction of the user's touch action in contact with the touch-sensitive display unit.
- the electronic device 100 may also include one or more proximity light sensors 152 for turning off the display and disabling the touch function of the touch-sensitive surface when the electronic device 100 is close to the user (for example, close to the ear when the user is making a call), to avoid the user's accidental operation of the touch-sensitive display unit.
- the electronic device 100 may further include one or more ambient light sensors 153 for keeping the touch-sensitive display unit off when the electronic device 100 is located in the user's pocket or another dark area, to prevent unnecessary battery consumption or accidental operation while the electronic device 100 is in the locked state.
- the proximity light sensor and the ambient light sensor can be integrated into one component or provided as two separate components.
- Although FIG. 1 shows the proximity light sensor and the ambient light sensor, it can be understood that they are not essential components of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
- the memory 120 can be used to store instructions and data.
- the memory 120 may mainly include a storage instruction area and a storage data area.
- the storage data area can store the association relationship between the joint touch gesture and the application function, and can also store the preset track information.
- the storage instruction area can store an operating system, instructions required for at least one function, and the like.
- the instructions may cause the processor 190 to perform the method including, when a joint touch gesture acting on the touch-sensitive surface is detected, identifying whether the user interface displayed by the display is a text application interface. If the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches the preset trajectory, the text selection area is displayed on the text application interface in response to the joint touch gesture.
- the text selection area is located between the first endpoint and the second endpoint; the first endpoint is located at a first location in the text application interface, and the second endpoint is located at a second location in the text application interface.
- the first application function is performed if the user interface displayed by the display is not a text application interface and there is a first application function associated with the trajectory of the joint touch gesture. If the user interface displayed by the display is a text application interface, but the trajectory of the joint touch gesture does not match the preset trajectory, the second application function is executed when there is a second application function associated with the trajectory of the joint touch gesture.
- the processor 190 is the control center of the electronic device 100; it connects the various parts of the entire device using various interfaces and lines, and performs the various functions of the electronic device 100 and processes data by running or executing instructions stored in the memory 120 and invoking data stored in the memory 120, thereby monitoring the device as a whole.
- the processor 190 may include one or more processing units; preferably, the processor 190 may integrate an application processor and a modem processor, where the application processor mainly processes an operating system, a user interface, an application, and the like.
- the modem processor primarily handles wireless communications. It will be appreciated that the above described modem processor may also not be integrated into the processor 190.
- the processor and memory can be implemented on a single chip; in some embodiments, they can also be implemented on separate chips.
- the processor 190 is further configured to invoke the instructions in the memory to implement: when a joint touch gesture acting on the touch-sensitive surface is detected, identifying whether the user interface displayed by the display is a text application interface; if the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches the preset trajectory, displaying the text selection area on the text application interface in response to the joint touch gesture.
- the text selection area is located between the first endpoint and the second endpoint; the first endpoint is located at a first location in the text application interface, and the second endpoint is located at a second location in the text application interface.
- the first application function is performed if the user interface displayed by the display is not a text application interface and there is a first application function associated with the trajectory of the joint touch gesture. If the user interface displayed by the display is a text application interface, but the trajectory of the joint touch gesture does not match the preset trajectory, the second application function is executed when there is a second application function associated with the trajectory of the joint touch gesture.
- the radio frequency unit 110 can be used for receiving and transmitting signals during the transmission and reception of information or during a call. Specifically, downlink information received from the base station is passed to the processor 190 for processing, and uplink data is sent to the base station.
- RF circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like.
- the radio unit 110 can also communicate with network devices and other devices through wireless communication.
- the wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), etc.
- the audio circuit 160, the speaker 161, and the microphone 162 can provide an audio interface between the user and the electronic device 100.
- the audio circuit 160 can convert received audio data into an electrical signal and transmit it to the speaker 161, which converts it into a sound signal for output; conversely, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data. The audio data is then processed by the processor 190 and sent, for example via the radio frequency unit 110, to another terminal, or output to the memory 120 for further processing.
- the audio circuit may also include a headphone jack 163, which provides a connection interface between the audio circuit and an earphone.
- WiFi is a short-range wireless transmission technology
- the electronic device 100 can help users send and receive emails, browse web pages, and access streaming media through the WiFi module 170, which provides users with wireless broadband Internet access.
- Although FIG. 1 shows the WiFi module 170, it can be understood that it is not an essential component of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
- Bluetooth is a short-range wireless communication technology. With Bluetooth technology, communication between mobile communication terminal devices such as palmtop computers, notebook computers, and mobile phones can be effectively simplified, as can communication between these devices and the Internet.
- Through the Bluetooth module 180, data transmission between the electronic device 100 and the Internet becomes faster and more efficient, broadening the road for wireless communication.
- Bluetooth technology is an open solution for wireless transmission of voice and data.
- Although FIG. 1 shows the Bluetooth module 180, it can be understood that it is not an essential component of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
- the electronic device 100 also includes a power source 193 (such as a battery) that powers the various components.
- the power source can be logically coupled to the processor 190 via a power management system 194, so that functions such as charge management, discharge management, and power consumption management are implemented through the power management system 194.
- the electronic device 100 further includes an external interface 197, which may be a standard Micro USB interface or a multi-pin connector; it may be used to connect the electronic device 100 to other devices for communication, or to connect a charger to charge the electronic device 100.
- the electronic device 100 may further include a camera, a flash, and the like, and details are not described herein.
- the method of selecting text will be described below by taking the electronic device 100 as an example.
- the electronic device 100 may include a touch-sensitive display unit 130, an acceleration sensor 151, a volume control button 132, a switch button 133, a microphone 162, a speaker 161, an external interface 197, and a headphone jack 163.
- the touch-sensitive display unit 130 may display one or more graphics 300 in the user interface 200 and receive the user's touch input. By using the touch-sensitive display unit 130 as the main input or control device for operating the electronic device 100, the number of physical inputs or controls on the electronic device 100 can be reduced.
- the touch-sensitive display unit may be referred to as a "menu button".
- the "menu button” can be a physical button or other physical input or control device.
- the acceleration sensor 151 is configured to acquire the gravitational acceleration of the user's touch action on the touch-sensitive display unit along the Z axis. Pressing and holding the switch button 133 for a predetermined time interval turns the power of the electronic device 100 on or off; pressing the switch button and releasing it before the predetermined time interval locks the electronic device 100.
- voice input for activating some functions may also be received via microphone 162.
- FIG. 3 is a schematic diagram of displaying a text selection area according to an embodiment of the present invention.
- the text selection area 301 can be represented on the touch-sensitive display unit 130 as text that is defined by the first endpoint 302a and the second endpoint 302b, between the first endpoint and the second endpoint.
- the selected text area 301 can include any portion of the text shown in FIG. 3, and the text selected in FIG. 3 is merely an example.
- the first endpoint 302a can be associated with a first marker pole 303a, and the second endpoint 302b can be associated with a second marker pole 303b.
- the first marker pole 303a and the second marker pole 303b can be used to indicate the positions of the first endpoint 302a and the second endpoint 302b, respectively. Since a marker pole is easier to select than an endpoint, when the user wishes to move one or both of the first endpoint 302a and the second endpoint 302b to a new location, the user can move the marker pole associated with that endpoint to the new location, thereby moving the endpoint to the new location.
- the marker poles 303a and 303b can have other shapes, sizes, and colors; this embodiment is only an example.
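The relationship between endpoints and marker poles can be modeled minimally as below. Character-offset positions and the class interface are illustrative simplifications, not the patent's implementation:

```python
class TextSelection:
    """Minimal model of a selection bounded by two endpoints with marker poles.

    Positions are character offsets into the text, used here only to make the
    endpoint/marker-pole behavior concrete.
    """
    def __init__(self, first, second):
        self.first, self.second = first, second

    def drag_marker_pole(self, which, new_pos):
        # Dragging the marker pole associated with an endpoint moves that endpoint.
        if which == "first":
            self.first = new_pos
        else:
            self.second = new_pos

    def selected_range(self):
        # The selected text always lies between the two endpoints, in order.
        lo, hi = sorted((self.first, self.second))
        return lo, hi
```

Dragging a pole thus only updates the endpoint it is bound to; the selection itself is always the span between whichever endpoint comes first and whichever comes second.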
- FIG. 4 is a flowchart of a method for selecting text according to an embodiment of the present invention.
- Method 400 can be performed on a portable electronic device (eg, electronic device 100 in FIG. 1 or FIG. 2) having a touch-sensitive display unit and a plurality of applications, including text applications. In some embodiments, some of the operations in method 400 may be combined, and/or the order of some operations may vary.
- method 400 provides a more efficient way to quickly select text. It helps the user select text with fewer steps, simplifying text selection and improving the user experience.
- the portable electronic device detects a joint touch gesture (401) that acts on the touch-sensitive surface.
- step 401 may specifically include steps 4011-4013.
- the portable electronic device detects a touch action acting on the touch-sensitive surface (4011), determines whether the touch action is a joint touch action (4012), and detects a joint touch gesture consisting of joint touch actions (4013).
- the text application interface can display web browsing, email, notepad, instant message, blog application, and the like.
- if the user interface displayed by the display is not a text application interface, the electronic device determines whether there is an application function associated with the trajectory of the joint touch gesture, and executes the application function when one exists (404).
- if the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches the preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture (406).
- if the trajectory of the joint touch gesture does not match the preset trajectory, it is determined whether there is an application function associated with the trajectory of the joint touch gesture, and the application function is performed when one exists.
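The branching among steps 401-406 can be sketched as a small dispatcher. This is only an illustration: the callable parameters (`preset_match`, `app_function_for`, `show_text_selection`) are hypothetical stand-ins, not the patent's implementation.

```python
# Hedged sketch of the step 401-406 decision flow. A detected joint gesture
# either opens the text selection (step 406) or triggers an associated
# application function (step 404); otherwise it is ignored.
def handle_joint_gesture(trajectory, is_text_interface, preset_match,
                         app_function_for, show_text_selection):
    if is_text_interface and preset_match(trajectory):
        show_text_selection(trajectory)      # step 406: display selection area
        return "selection"
    action = app_function_for(trajectory)    # look up saved associations
    if action is not None:
        action()                             # step 404: run associated function
        return "app_function"
    return "ignored"
```

For example, a matching "-" swipe on a text interface yields `"selection"`, while a "C" gesture on a non-text interface falls through to its associated function.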
- the touch information is communicated to the processor.
- the touch information may include one or more of contact coordinates, a grid capacitance value of the touch sensitive display unit, and a touch action.
- the touch action may include actions such as pressing, moving, and lifting.
- whether the touch action is a joint touch action may be determined based on the grid capacitance information and the Z-axis direction acceleration signal generated by the touch action (4012).
- the touch-sensitive surface grid capacitance information includes the grid capacitance values and the number of grids with non-zero capacitance values.
- when the number of grids with non-zero capacitance values is less than a preset value, and the Z-axis direction acceleration signal is in a first preset acceleration range (for example, the acceleration signal is greater than 3g within 5 ms, where g is the gravitational acceleration), it is determined that the touch action is a joint touch action.
- when the number of grids with non-zero capacitance values is greater than or equal to the preset value, and the Z-axis direction acceleration signal is in a second preset acceleration range (for example, the acceleration signal is less than 2g within 5 ms), it is determined that the touch action is a finger touch action.
- the joint touch action in the embodiment of the present invention is not necessarily triggered by a finger joint; any object that taps the touch-sensitive display unit 130 quickly enough to satisfy the judgment condition above produces what the embodiment of the present invention calls a joint touch action.
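The joint-versus-finger decision in step 4012 can be sketched as follows. The concrete thresholds (`MAX_JOINT_GRIDS` and the 3g/2g acceleration bounds within 5 ms) are illustrative assumptions; the patent only speaks of a "preset value" and "preset acceleration ranges".

```python
# Sketch of step 4012: classify one touch action as a joint (knuckle) touch
# or a finger touch from the capacitance grid and the Z-axis acceleration.
G = 9.81                  # gravitational acceleration, m/s^2

MAX_JOINT_GRIDS = 6       # hypothetical "preset value" for non-zero grids
JOINT_ACCEL_MIN = 3 * G   # first preset range: spike above 3g within 5 ms
FINGER_ACCEL_MAX = 2 * G  # second preset range: below 2g within 5 ms

def classify_touch(grid_capacitance, peak_z_accel):
    """Return 'joint', 'finger', or 'unknown' for one touch action.

    grid_capacitance: 2D list of capacitance values from the touch panel.
    peak_z_accel: peak Z-axis acceleration (m/s^2) within 5 ms of contact.
    """
    nonzero_grids = sum(1 for row in grid_capacitance for c in row if c != 0)
    if nonzero_grids < MAX_JOINT_GRIDS and peak_z_accel > JOINT_ACCEL_MIN:
        return "joint"    # small contact area + sharp impact -> knuckle tap
    if nonzero_grids >= MAX_JOINT_GRIDS and peak_z_accel < FINGER_ACCEL_MAX:
        return "finger"   # larger, softer contact -> fingertip
    return "unknown"
```

The intuition matches the patent's two conditions: a knuckle strikes a small area sharply, a fingertip covers more grids with a gentler impact.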
- a touch gesture can be composed of touch actions.
- a tap gesture consists of pressing and lifting two touch actions;
- a swipe gesture consists of pressing, moving, and lifting three touch actions.
- a joint touch gesture composed of the joint touch action may be detected (4013).
- a joint click gesture consists of pressing and lifting two joint touch actions;
- the joint swipe gesture consists of pressing, moving, and lifting three joint touch actions.
- the movement trajectory between pressing to lifting is the trajectory of the joint touch gesture.
- the user may preset associations between the trajectories of a plurality of joint touch gestures and application functions, and save the associations between the trajectories of the joint touch gestures and the application functions in the memory 120.
- for example, the user may preset an association between the trajectory "C" of a joint touch gesture and the camera application function, and save that association in the memory 120.
- when a joint touch gesture whose trajectory is "C" is detected acting on the touch-sensitive surface, the electronic device can look up the associations between gesture trajectories and application functions stored in the memory 120 and determine that the trajectory "C" is associated with the camera application function.
- for example, a joint press action is detected in area A of the touch-sensitive display unit, the contact then moves to area B on the touch-sensitive display unit, and a joint lift action is detected in area B.
- the joint touch event that is pressed in area A and lifted after moving to area B is one joint touch gesture.
- the position of a touch action (for example, area A or area B) can be determined from the contact coordinate information.
- the joint touch gesture can be composed of joint touch actions.
- a joint click gesture consists of two joint touch actions, pressing and lifting; a joint swipe gesture consists of three joint touch actions: pressing, moving, and lifting.
- area A is the start contact area of the joint touch gesture on the touch-sensitive display unit; area B is the end contact area of the joint touch gesture on the touch-sensitive display unit.
- the movement trajectory from area A to area B is the trajectory of the joint touch gesture.
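Composing a gesture from press/move/lift actions and recording its trajectory from area A to area B can be sketched as a tiny tracker. The action names and coordinate tuples are assumptions for illustration:

```python
# Sketch of composing a joint touch gesture from press/move/lift actions.
# The trajectory is the list of contact coordinates from press to lift.
class JointGestureTracker:
    def __init__(self):
        self.trajectory = []
        self.active = False

    def on_action(self, action, x, y):
        """Feed one joint touch action; returns the full trajectory on lift."""
        if action == "press":                   # gesture starts in area A
            self.active = True
            self.trajectory = [(x, y)]
        elif action == "move" and self.active:
            self.trajectory.append((x, y))
        elif action == "lift" and self.active:  # gesture ends in area B
            self.trajectory.append((x, y))
            self.active = False
            return self.trajectory
        return None
```

A joint click produces a two-point trajectory (press + lift); a joint swipe accumulates intermediate move points between them.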
- the electronic device compares the detected trajectory of the joint touch gesture with the preset trajectory, and determines whether the trajectory of the joint touch gesture matches the preset trajectory (405).
- the preset trajectory may be preset by the electronic device or preset by the user.
- the preset trajectory information may be saved in a memory (for example, the memory 120 in FIG. 1).
- the preset trajectory may be a straight line, which may be a horizontal line, a vertical line, or a diagonal line (for example, "-", "|", "/", or "\").
- the preset trajectory may also take other forms, and may be adaptively adjusted according to specific design requirements.
- the example in which the preset trajectory is a straight line in the embodiment of the present invention does not constitute a limitation on the solution of the present invention.
- FIG. 5 is an exemplary user interface in which the trajectory of the joint touch gesture of the embodiment of the present invention is a horizontal line (eg, "-").
- the joint touch position detected at the beginning is in area A (area A is the start contact area), and the joint touch position is then detected to move to area B (area B is the end contact area); the trajectory of the joint touch gesture is the movement trajectory from area A to area B.
- the direction of the arrow is the moving direction of the joint touch gesture from area A to area B.
- the electronic device compares the detected horizontal line trajectory with the preset straight-line trajectory, and determines that the horizontal line trajectory matches the preset trajectory.
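One way to implement the comparison in step 405 for straight-line presets is to check how far the sampled points stray from the chord between the start and end contacts, then bucket the chord's angle. The 20% tolerance and 15-degree bands below are illustrative choices, and the "/" vs "\" labels assume mathematical y-up coordinates (a screen's y-down axis swaps them):

```python
# Sketch of step 405: decide whether a detected trajectory matches one of
# the preset straight-line trajectories "-", "|", "/", "\".
import math

def match_straight_line(trajectory, tolerance=0.2):
    """Return "-", "|", "/", or "\\" if the trajectory approximates that
    straight line, or None if it matches no preset trajectory."""
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    if length == 0:
        return None
    # reject gestures whose points stray too far from the chord A -> B
    for x, y in trajectory:
        if abs(dy * (x - x0) - dx * (y - y0)) / length > tolerance * length:
            return None
    angle = math.degrees(math.atan2(dy, dx)) % 180   # orientation in [0, 180)
    if angle < 15 or angle > 165:
        return "-"
    if 75 < angle < 105:
        return "|"
    return "/" if angle < 90 else "\\"
```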
- a text selection area is displayed on the text application interface in response to the joint touch gesture (406).
- the text selection area is located between the first endpoint and the second endpoint; the first endpoint is located at a first location in the text application interface, and the second endpoint is located at a second location in the text application interface.
- for example, when it is determined that the trajectory of the joint touch gesture matches the preset trajectory, the first endpoint is inserted at the position of the start contact area A of the joint touch gesture on the touch-sensitive display unit, and the second endpoint is inserted at the position of the end contact area B of the joint touch gesture on the touch-sensitive display unit.
- the insertion position of the first endpoint is the first position, which may be the beginning or end of the text character or word in the text application interface displayed by the display that is closest to the centroid of area A.
- the insertion position of the second endpoint is the second position, which may be the beginning or end of the text character or word in the text application interface displayed by the display that is closest to the centroid of area B; the text area located between the first endpoint and the second endpoint in the text application interface is the text selection area.
- in FIG. 5, the insertion position of the first endpoint 302a is the beginning or end of the text character or word closest to the centroid of the start contact area A of the illustrated joint touch gesture on the touch-sensitive display unit; the insertion position of the second endpoint 302b is the beginning or end of the text character or word closest to the centroid of the end contact area B; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
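Snapping the two endpoints to the nearest word can be sketched as below. The word-center layout is a simplification (real positions come from the text renderer), so treat the whole sketch as an assumption:

```python
# Sketch of step 406's endpoint placement: compute the centroids of the
# start (A) and end (B) contact areas, snap each to the nearest word, and
# select the text between the two endpoints.
import math

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def nearest_word_index(word_centers, point):
    px, py = point
    return min(range(len(word_centers)),
               key=lambda i: math.hypot(word_centers[i][0] - px,
                                        word_centers[i][1] - py))

def select_text(words, word_centers, area_a_points, area_b_points):
    """Return the words between the endpoints nearest areas A and B."""
    i = nearest_word_index(word_centers, centroid(area_a_points))
    j = nearest_word_index(word_centers, centroid(area_b_points))
    i, j = min(i, j), max(i, j)          # allow gestures in either direction
    return words[i:j + 1]
```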
- the joint touch position detected at the beginning is in area A (area A is the start contact area), and the joint touch position is then detected to move to area B (area B is the end contact area); the trajectory of the joint touch gesture is the movement trajectory from area A to area B, shown as the vertical line trajectory drawn with a dotted line, and the arrow direction is the moving direction of the joint touch gesture from area A to area B.
- the electronic device compares the detected vertical line trajectory with the preset straight-line trajectory, and determines that the vertical line trajectory matches the preset trajectory.
- the insertion position of the first endpoint 302a is the beginning or end of the text character or word closest to the centroid of the start contact area A of the joint touch gesture illustrated in FIG. 7 on the touch-sensitive display unit; the insertion position of the second endpoint 302b is the beginning or end of the text character or word closest to the centroid of the end contact area B; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
- FIG. 9 shows that the trajectory of the joint touch gesture is a diagonal line (eg, "/"), in accordance with an embodiment of the present invention.
- the joint touch position detected at the beginning is in area A (area A is the start contact area), and the joint touch position is then detected to move to area B (area B is the end contact area); the trajectory of the joint touch gesture is the movement trajectory from area A to area B, shown as the oblique line trajectory drawn with a dotted line, and the arrow direction is the moving direction of the joint touch gesture from area A to area B.
- the electronic device compares the detected oblique line trajectory with the preset straight-line trajectory, and determines that the oblique line trajectory matches the preset trajectory.
- the insertion position of the first endpoint 302a is the beginning or end of the text character or word closest to the centroid of the start contact area A of the joint touch gesture illustrated in FIG. 9 on the touch-sensitive display unit; the insertion position of the second endpoint 302b is the beginning or end of the text character or word closest to the centroid of the end contact area B; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
- FIG. 11 shows that the trajectory of the joint touch gesture is a diagonal line (eg, "\"), in accordance with an embodiment of the present invention.
- the joint touch position detected at the beginning is in area A (area A is the start contact area), and the joint touch position is then detected to move to area B (area B is the end contact area); the trajectory of the joint touch gesture is the movement trajectory from area A to area B, shown as the oblique line trajectory drawn with a dotted line, and the arrow direction is the moving direction of the joint touch gesture from area A to area B; the electronic device compares the detected oblique line trajectory with the preset straight-line trajectory, and determines that the oblique line trajectory matches the preset trajectory.
- the insertion position of the first endpoint 302a is the beginning or end of the text character or word closest to the centroid of the start contact area A of the joint touch gesture illustrated in FIG. 11 on the touch-sensitive display unit; the insertion position of the second endpoint 302b is the beginning or end of the text character or word closest to the centroid of the end contact area B; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
- the above embodiments may further perform a word processing function on the text selection area.
- FIG. 13 is an exemplary user interface for performing a word processing function on a text selection area according to an embodiment of the present invention.
- the word processing functions may include copying, cutting, pasting, translating, and the like.
- other word processing functions can also be invoked by selecting "More"; these can include underlining the selected text, making the selected text bold, and changing the font, font size, or font color of the selected text.
- the order in which the word processing functions are illustrated in FIG. 13 and the presentation form can be appropriately adjusted according to design requirements.
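Dispatching the chosen menu item to the current selection can be sketched as a command table. The two handlers below (copy and cut into a clipboard dict) are illustrative placeholders for the fuller menu of FIG. 13:

```python
# Sketch of dispatching a word-processing menu choice to the selection.
clipboard = {"text": ""}

def copy(selection):
    clipboard["text"] = selection
    return selection            # document text is unchanged

def cut(selection):
    clipboard["text"] = selection
    return ""                   # selection removed from the document

word_processing = {"copy": copy, "cut": cut}

def apply_word_processing(command, selection):
    handler = word_processing.get(command)
    if handler is None:
        raise ValueError(f"unsupported command: {command}")
    return handler(selection)
```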
- the technical solution of the embodiment of the present invention discloses that when a joint touch gesture acting on the touch-sensitive surface is detected, the electronic device identifies whether the user interface displayed by the display is a text application interface; if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches the preset trajectory, the text selection area is displayed on the text application interface in response to the joint touch gesture.
- the solution of the embodiment of the invention simplifies the operation steps of selecting text, thereby improving the user experience.
- FIG. 14 is a simplified schematic diagram of an internal structure of an electronic device with a touch-sensitive display unit according to an embodiment of the present invention.
- Functional blocks of an electronic device may be implemented by hardware, software, or a combination of hardware and software to perform the principles of the invention.
- those skilled in the art can understand that the functional modules described in FIG. 14 can be combined into composite modules or separated into sub-function modules to implement the principles of the invention described above. Accordingly, the description herein supports any possible combination, separation, or further definition of the functional modules described herein.
- the principle by which the electronic device and the apparatus solve the problem is similar to that of the method for selecting text in the embodiment of the present invention; therefore, for the implementation of the electronic device and the apparatus, reference may be made to the implementation of the method, and the repeated description is omitted.
- the electronic device 1400 includes a touch-sensitive display unit 130, an acceleration sensor 151, a memory 120, and a processor 190.
- the touch sensitive display unit 130 can be a display having a touch sensitive surface, the touch sensitive display unit 130 including a touch sensitive surface and a display screen.
- the touch-sensitive display unit 130 is configured to display a screen interface, and is further configured to receive a touch action on the touch-sensitive surface and transmit the touch information to the processor 190.
- the touch information may include one or more of a contact coordinate, a grid capacitance value of the touch-sensitive display unit, and a touch action; the touch action may include an action of pressing, moving, and lifting.
- the acceleration sensor 151 is configured to detect an acceleration signal in the Z-axis direction and transmit the detected acceleration signal in the Z-axis direction to the processor 190.
- the memory 120 stores instructions.
- the processor 190 is coupled to the touch-sensitive display unit 130, the acceleration sensor 151, and the memory 120.
- the processor 190 invokes instructions stored in the memory 120 to enable recognition of whether the user interface displayed by the display is a text application interface when a joint touch gesture is detected that acts on the touch-sensitive surface. If the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches the preset trajectory, the text selection area is displayed on the text application interface in response to the joint touch gesture.
- the text selection area is located between the first endpoint and the second endpoint; the first endpoint is located at a first location in the text application interface, and the second endpoint is located at a second location in the text application interface.
- if the user interface displayed by the display is not a text application interface, and there is a first application function associated with the trajectory of the joint touch gesture, the first application function is executed.
- if the trajectory of the joint touch gesture does not match the preset trajectory, and there is a second application function associated with the trajectory of the joint touch gesture, the second application function is performed.
- after the touch-sensitive display unit receives a touch action on the touch-sensitive surface, the touch information is transmitted to the processor; the touch information may include one or more of the contact coordinates, a grid capacitance value of the touch-sensitive display unit, and the touch action.
- the touch action may include actions such as pressing, moving, and lifting.
- whether the touch action is a joint touch action may be determined based on the grid capacitance information and the Z-axis direction acceleration signal generated by the touch action; the touch-sensitive surface grid capacitance information includes the grid capacitance values and the number of grids with non-zero capacitance values.
- when the number of grids with non-zero capacitance values is less than a preset value, and the Z-axis direction acceleration signal is in a first preset acceleration range (for example, the acceleration signal is greater than 3g within 5 ms), it is determined that the touch action is a joint touch action.
- when the number of grids with non-zero capacitance values is greater than or equal to the preset value, and the Z-axis direction acceleration signal is in a second preset acceleration range (for example, the acceleration signal is less than 2g within 5 ms), it is determined that the touch action is a finger touch action.
- the joint touch action in the embodiment of the present invention is not necessarily triggered by a finger joint; any object that taps the touch-sensitive display unit 130 quickly enough to satisfy the judgment condition above produces what the embodiment of the present invention calls a joint touch action.
- a touch gesture can be composed of touch actions.
- a tap gesture consists of pressing and lifting two touch actions
- a swipe gesture consists of pressing, moving, and lifting three touch actions.
- a joint touch gesture composed of a joint touch action is detected.
- a joint click gesture consists of pressing and lifting two joint touch actions
- the joint swipe gesture consists of pressing, moving, and lifting three joint touch actions.
- for example, a joint press action is detected in area A of the touch-sensitive display unit, the contact then moves to area B on the touch-sensitive display unit, and a joint lift action is detected in area B.
- the joint touch event that is pressed in area A and lifted after moving to area B is one joint touch gesture.
- the position of a touch action (for example, area A or area B) can be determined from the contact coordinate information.
- the joint touch gesture can be composed of joint touch actions.
- a joint click gesture consists of two joint touch actions, pressing and lifting; a joint swipe gesture consists of three joint touch actions: pressing, moving, and lifting.
- area A is the start contact area of the joint touch gesture on the touch-sensitive display unit; area B is the end contact area of the joint touch gesture on the touch-sensitive display unit.
- the movement trajectory from area A to area B is the trajectory of the joint touch gesture.
- the electronic device compares the detected trajectory of the joint touch gesture with the preset trajectory, and determines whether the trajectory of the joint touch gesture matches the preset trajectory.
- the preset track information may be preset by the electronic device or preset by the user.
- the preset trajectory information may be stored in a memory (eg, memory 120 in FIG. 1).
- the joint touch gesture can be composed of different joint touch actions.
- a joint click gesture consists of pressing and lifting two joint touch actions
- the joint swipe gesture consists of pressing, moving, and lifting three joint touch actions.
- the movement trajectory between pressing to lifting is the trajectory of the joint touch gesture.
- the user may preset associations between the trajectories of a plurality of joint touch gestures and application functions, and save the associations between the trajectories of the joint touch gestures and the application functions in the memory 120.
- for example, the user may preset an association between the trajectory "C" of a joint touch gesture and the camera application function, and save that association in the memory 120.
- when a joint touch gesture whose trajectory is "C" is detected acting on the touch-sensitive surface, the electronic device can look up the associations between gesture trajectories and application functions stored in the memory 120 and determine that the trajectory "C" is associated with the camera application function.
- the text selection area is located between the first endpoint and the second endpoint; the first endpoint is located at a first location in the text application interface, and the second endpoint is located at a second location in the text application interface.
- for example, when it is determined that the trajectory of the joint touch gesture matches the preset trajectory, the first endpoint is inserted at the position of the start contact area A of the joint touch gesture on the touch-sensitive display unit, and the second endpoint is inserted at the position of the end contact area B of the joint touch gesture on the touch-sensitive display unit.
- the insertion position of the first endpoint is the first position, which may be the beginning or end of the text character or word in the text application interface displayed by the display that is closest to the centroid of area A.
- the insertion position of the second endpoint is the second position, which may be the beginning or end of the text character or word in the text application interface displayed by the display that is closest to the centroid of area B.
- the text area located between the first endpoint and the second endpoint in the text application interface is the text selection area.
- FIG. 15 is a schematic diagram showing the functional structure of a device according to an embodiment of the present invention.
- the apparatus includes a detection unit 1501, an identification unit 1502, a determination unit 1503, and a text selection unit 1504.
- the detecting unit 1501 is configured to detect a joint touch gesture acting on the touch-sensitive surface.
- the identifying unit 1502 is configured to identify whether the user interface displayed by the display is a text application interface.
- the determining unit 1503 is configured to determine whether a trajectory of the joint touch gesture matches a preset trajectory.
- the text selection unit 1504 is configured to display, if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches the preset trajectory, the text selection area on the text application interface in response to the joint touch gesture.
- the text selection area is located between the first endpoint and the second endpoint; the first endpoint is located at a first location in the text application interface, and the second endpoint is located at a second location in the text application interface.
- the device further includes a first determining unit 1506, and a first executing unit 1507.
- the first determining unit 1506 is configured to determine, if the user interface displayed by the display is not a text application interface, whether an application function associated with the track of the joint touch gesture exists.
- the first execution unit 1507 is configured to execute the application function if the user interface displayed by the display is not a text application interface and an application function associated with the trajectory of the joint touch gesture exists.
- the device further includes a second determining unit 1508 and a second executing unit 1509.
- the second determining unit 1508 is configured to determine, if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, whether there is an application function associated with the trajectory of the joint touch gesture.
- the second execution unit 1509 is configured to execute the application function if the user interface displayed by the display is a text application interface, the trajectory of the joint touch gesture does not match the preset trajectory, and there is an application function associated with the trajectory of the joint touch gesture.
- the touch information is communicated to the processor.
- the touch information may include one or more of contact coordinates, a grid capacitance value of the touch sensitive display unit, and a touch action.
- the touch action may include actions such as pressing, moving, and lifting.
- whether the touch action is a joint touch action may be determined based on the mesh capacitance information and a Z-axis direction acceleration signal generated by the touch action.
- the touch-sensitive surface grid capacitance information includes a grid capacitance value and a number of grids of non-zero capacitance values.
- when the number of grids with non-zero capacitance values is less than a preset value, and the Z-axis direction acceleration signal is in a first preset acceleration range (for example, the acceleration signal is greater than 3g within 5 ms), it is determined that the touch action is a joint touch action.
- when the number of grids with non-zero capacitance values is greater than or equal to the preset value, and the Z-axis direction acceleration signal is in a second preset acceleration range (for example, the acceleration signal is less than 2g within 5 ms), it is determined that the touch action is a finger touch action.
- the joint touch action in the embodiment of the present invention is not necessarily triggered by a finger joint; any object that taps the touch-sensitive display unit 130 quickly enough to satisfy the judgment condition above produces what the embodiment of the present invention calls a joint touch action.
- a touch gesture can be composed of touch actions.
- a tap gesture consists of pressing and lifting two touch actions;
- a swipe gesture consists of pressing, moving, and lifting three touch actions.
- a joint touch gesture composed of a joint touch action may be detected.
- the joint click gesture consists of pressing and lifting two joint touch actions;
- the joint swipe gesture consists of pressing, moving, and lifting three joint touch actions.
- for example, a joint press action is detected in area A of the touch-sensitive display unit, the contact then moves to area B on the touch-sensitive display unit, and a joint lift action is detected in area B.
- the joint touch event that is pressed in area A and lifted after moving to area B is one joint touch gesture.
- the position of the touch action (for example, area A or area B) can be judged by the contact coordinate information.
- the joint touch gesture can be composed of joint touch actions.
- a joint click gesture consists of pressing and lifting two joint touch actions; the joint swipe gesture consists of pressing, moving, and lifting three joint touch actions.
- area A is the start contact area of the joint touch gesture on the touch-sensitive display unit; area B is the end contact area of the joint touch gesture on the touch-sensitive display unit.
- the movement trajectory from area A to area B is the trajectory of the joint touch gesture.
- the electronic device compares the detected trajectory of the joint touch gesture with the preset trajectory, and determines whether the trajectory of the joint touch gesture matches the preset trajectory.
- the preset track information may be preset by the electronic device or preset by the user.
- the preset trajectory information may be stored in a memory (eg, memory 120 in FIG. 1).
- the joint touch gesture can be composed of different joint touch actions.
- a joint click gesture consists of pressing and lifting two joint touch actions
- the joint swipe gesture consists of pressing, moving, and lifting three joint touch actions.
- the movement trajectory between pressing to lifting is the trajectory of the joint touch gesture.
- the user may preset associations between the trajectories of a plurality of joint touch gestures and application functions, and save the associations between the trajectories of the joint touch gestures and the application functions in the memory 120.
- for example, the user may preset an association between the trajectory "C" of a joint touch gesture and the camera application function, and save that association in the memory 120.
- when a joint touch gesture whose trajectory is "C" is detected acting on the touch-sensitive surface, the electronic device can look up the associations between gesture trajectories and application functions stored in the memory 120 and determine that the trajectory "C" is associated with the camera application function.
- the text selection area is located between the first endpoint and the second endpoint; the first endpoint is located at a first location in the text application interface, and the second endpoint is located at a second location in the text application interface. For example, when it is determined that the trajectory of the joint touch gesture matches the preset trajectory, the first endpoint is inserted at the position of the start contact area A of the joint touch gesture on the touch-sensitive display unit, and the second endpoint is inserted at the position of the end contact area B of the joint touch gesture on the touch-sensitive display unit.
- the insertion position of the first endpoint is the first position, which may be the beginning or end of the text character or word in the text application interface displayed by the display that is closest to the centroid of area A.
- the insertion position of the second endpoint is the second position, which may be the beginning or end of the text character or word in the text application interface displayed by the display that is closest to the centroid of area B.
- a text area located between the first endpoint and the second endpoint in the text application interface is the text selection area.
- the principle of solving the problem of the electronic device and the device is similar to the method for selecting the text in the embodiment of the present invention. Therefore, the implementation of the electronic device and the device may refer to the implementation of the method, and the repeated description is not repeated.
- the technical solution of the embodiment of the present invention discloses that when detecting a joint touch gesture acting on the touch-sensitive surface, identifying whether the user interface displayed by the display is a text application interface, if the user interface displayed by the display is a text application interface, and the When the trajectory of the joint touch gesture matches the preset trajectory, the text selection area is displayed on the text application interface in response to the joint touch gesture.
- the solution of the embodiment of the invention simplifies the operation steps of selecting text, thereby improving the user experience.
Abstract
A method and device for selecting text. The method includes: when a joint touch gesture acting on a touch-sensitive surface is detected, identifying whether the user interface displayed by the display is a text application interface; if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying a text selection area on the text application interface in response to the joint touch gesture. This simplifies the steps required to select text, thereby improving the user experience.
Description
Embodiments of the present invention relate to a method for selecting text, and in particular to a method for selecting text on a display having a touch-sensitive surface using a joint touch gesture.
With the rapid spread of portable electronic devices with touchscreens, more and more people handle multimedia and text applications on portable electronic devices: for example, browsing web pages, sending and receiving e-mail, and exchanging instant messages. When a user wants to share text with a friend or copy it, the user performs a selection operation on the target text. Selecting text is generally rather cumbersome. For example: the user's finger touches the text area to be selected on the touchscreen; after the finger has remained in contact for a predetermined time interval, two markers, one on the left and one on the right, pop up in the text area; the user then touches and drags a marker to adjust the text selection area. Operations like this require many interaction steps between the user and the touchscreen, and the user experience leaves room for improvement.
Summary of the Invention
To improve the user experience of text selection in the prior art, embodiments of the present invention provide a technical solution for selecting text. The technical solution includes:
According to a first aspect, an embodiment of the present invention provides a method for selecting text, applied to a portable electronic device, the electronic device including a display having a touch-sensitive surface, the method including:
detecting a joint touch gesture acting on the touch-sensitive surface;
identifying whether the user interface displayed by the display is a text application interface;
if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, the text selection area being located between a first endpoint and a second endpoint;
the first endpoint being located at a first position in the text application interface;
the second endpoint being located at a second position in the text application interface.
In a first possible implementation of the first aspect, the method further includes: if the user interface displayed by the display is not a text application interface and a first application function is associated with the trajectory of the joint touch gesture, executing the first application function.
In a second possible implementation of the first aspect, the method further includes: if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, and a second application function is associated with the trajectory of the joint touch gesture, executing the second application function.
With reference to the first aspect or either of the first and second possible implementations of the first aspect, in a third possible implementation, the joint touch gesture is composed of joint touch actions; when the grid capacitance values of the touch-sensitive surface produced by a touch action acting on the touch-sensitive surface fall within a first preset capacitance range, the number of grid cells with non-zero capacitance is smaller than a preset value, and the Z-axis acceleration signal is within a first preset acceleration range, the touch action is a joint touch action, and a gesture composed of such joint touch actions is a joint touch gesture.
According to a second aspect, an embodiment of the present invention provides a portable electronic device, including:
a display, the display having a touch-sensitive surface;
an acceleration sensor, configured to acquire the acceleration in the Z-axis direction;
a memory, configured to store instructions;
a processor, the processor invoking the instructions stored in the memory to:
detect a joint touch gesture acting on the touch-sensitive surface;
identify whether the user interface displayed by the display is a text application interface;
if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, display, in response to the joint touch gesture, a text selection area on the text application interface, the text selection area being located between a first endpoint and a second endpoint;
the first endpoint being located at a first position in the text application interface;
the second endpoint being located at a second position in the text application interface.
In a first possible implementation of the second aspect, the instructions are further configured to: if the user interface displayed by the display is not a text application interface and a first application function is associated with the trajectory of the joint touch gesture, execute the first application function.
In a second possible implementation of the second aspect, the instructions are further configured to: if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, and a second application function is associated with the trajectory of the joint touch gesture, execute the second application function.
With reference to the second aspect or either of the first and second possible implementations of the second aspect, in a third possible implementation, the joint touch gesture is composed of joint touch actions; when the grid capacitance values of the touch-sensitive surface produced by a touch action acting on the touch-sensitive surface fall within a first preset capacitance range, the number of grid cells with non-zero capacitance is smaller than a preset value, and the Z-axis acceleration signal is within a first preset acceleration range, the touch action is a joint touch action, and a gesture composed of such joint touch actions is a joint touch gesture.
According to a third aspect, an embodiment of the present invention provides an apparatus, the apparatus including: a detection unit, an identification unit, a judgment unit, and a text selection unit;
the detection unit, configured to detect a joint touch gesture acting on a touch-sensitive surface;
the identification unit, configured to identify whether the user interface displayed by a display is a text application interface;
the judgment unit, configured to judge whether the trajectory of the joint touch gesture matches a preset trajectory;
the text selection unit, configured to, when a joint touch gesture acting on the touch-sensitive surface is detected, if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches the preset trajectory, display, in response to the joint touch gesture, a text selection area on the text application interface, the text selection area being located between a first endpoint and a second endpoint;
the first endpoint being located at a first position in the text application interface;
the second endpoint being located at a second position in the text application interface.
In a first possible implementation of the third aspect, the apparatus further includes a first judgment unit and a first execution unit:
the first judgment unit is configured to judge, when the user interface displayed by the display is not a text application interface, whether an application function is associated with the trajectory of the joint touch gesture;
the first execution unit is configured to execute a first application function when the first application function is associated with the trajectory of the joint touch gesture.
In a second possible implementation of the third aspect, the apparatus further includes a second judgment unit and a second execution unit:
the second judgment unit is configured to judge, when the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, whether an application function is associated with the trajectory of the joint touch gesture;
the second execution unit is configured to execute a second application function when the second application function is associated with the joint touch gesture.
With reference to the third aspect or either of the first and second possible implementations of the third aspect, in a third possible implementation, the joint touch gesture is composed of joint touch actions; when the grid capacitance values of the touch-sensitive surface produced by a touch action acting on the touch-sensitive surface fall within a first preset capacitance range, the number of grid cells with non-zero capacitance is smaller than a preset value, and the Z-axis acceleration signal is within a first preset acceleration range, the touch action is a joint touch action, and a gesture composed of such joint touch actions is a joint touch gesture.
According to a fourth aspect, an embodiment of the present invention provides a user interface on a portable electronic device, the portable electronic device including a display, a memory, and a processor for executing instructions stored in the memory, where the display has a touch-sensitive surface, the user interface including:
an interface for displaying a text application;
an interface in which, when a joint touch gesture acting on the touch-sensitive surface is detected, if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture, where the text selection area is located between a first endpoint and a second endpoint;
the first endpoint being located at a first position in the text application interface;
the second endpoint being located at a second position in the text application interface.
According to a fifth aspect, an embodiment of the present invention provides a non-transitory computer-readable storage medium storing one or more programs, the one or more programs including instructions that, when executed by a portable electronic device including a display having a touch-sensitive surface, cause the portable electronic device to perform the following events:
detecting a joint touch gesture acting on the touch-sensitive surface;
identifying whether the user interface displayed by the display is a text application interface;
if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, the text selection area being located between a first endpoint and a second endpoint;
the first endpoint being located at a first position in the text application interface;
the second endpoint being located at a second position in the text application interface.
The technical solution of the embodiments of the present invention discloses: when a joint touch gesture acting on the touch-sensitive surface is detected, identifying whether the user interface displayed by the display is a text application interface, and, if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying a text selection area on the text application interface in response to the joint touch gesture. The solution of the embodiments of the present invention simplifies the steps required to select text, thereby improving the user experience.
To describe the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings used in the description of the embodiments are briefly introduced below. Evidently, the drawings described below show merely some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of the internal structure of a portable electronic device 100 according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the external structure of the portable electronic device 100 according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of displaying a text selection area according to an embodiment of the present invention;
FIG. 4 is a flowchart of a method for selecting text according to an embodiment of the present invention;
FIG. 5 is an exemplary user interface in which the trajectory of a joint touch gesture is a horizontal line (i.e., "—") according to an embodiment of the present invention;
FIG. 6 is an exemplary user interface of the text selection area obtained from the trajectory of the joint touch gesture shown in FIG. 5;
FIG. 7 is an exemplary user interface in which the trajectory of a joint touch gesture is a vertical line (i.e., "|") according to an embodiment of the present invention;
FIG. 8 is an exemplary user interface of the text selection area obtained from the trajectory of the joint touch gesture shown in FIG. 7;
FIG. 9 is an exemplary user interface in which the trajectory of a joint touch gesture is a slanted line (i.e., "/") according to an embodiment of the present invention;
FIG. 10 is an exemplary user interface of the text selection area obtained from the trajectory of the joint touch gesture shown in FIG. 9;
FIG. 11 is an exemplary user interface in which the trajectory of a joint touch gesture is a slanted line (i.e., "\") according to an embodiment of the present invention;
FIG. 12 is an exemplary user interface of the text selection area obtained from the trajectory of the joint touch gesture shown in FIG. 11;
FIG. 13 is an exemplary user interface for performing word-processing functions on the text selection area according to an embodiment of the present invention;
FIG. 14 is a simplified schematic diagram of the internal structure of an electronic device having a touch-sensitive display unit according to an embodiment of the present invention;
FIG. 15 is a schematic diagram of the functional structure of an apparatus according to an embodiment of the present invention.
The technical solutions of the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Evidently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
For ease of description, the embodiments of the present invention are illustrated using a portable electronic device 100 that includes a touch-sensitive display unit. A person skilled in the art will understand that the embodiments of the present invention are equally applicable to other apparatuses, such as handheld devices, in-vehicle devices, wearable devices, computing devices, and various forms of user equipment (UE), mobile stations (MS), terminals, and terminal equipment.
The electronic device 100 may support a variety of applications, such as text applications (an e-mail application, a blog application, a web-browsing application, and so on). The touch-sensitive display unit of the electronic device 100 can intuitively present the user interfaces of these applications, and the applications can be executed through the touch-sensitive display unit of the electronic device 100.
FIG. 1 is a schematic diagram of the internal structure of the portable electronic device 100 according to an embodiment of the present invention. The electronic device 100 may include components such as a touch-sensitive display unit 130, an acceleration sensor 151, a proximity light sensor 152, an ambient light sensor 153, a memory 120, a processor 190, a radio-frequency unit 110, an audio circuit 160, a speaker 161, a microphone 162, a WiFi (wireless fidelity) module 170, a Bluetooth module 180, a power supply 193, and an external interface 197.
A person skilled in the art will understand that FIG. 1 is merely an example of a portable electronic device and does not constitute a limitation on it; the device may include more or fewer components than shown, combine certain components, or use different components.
The touch-sensitive display unit 130 is sometimes called a "touchscreen" for convenience, and may also be called or referred to as a touch-sensitive display system, or as a display having a touch-sensitive surface. A display having a touch-sensitive surface includes a touch-sensitive surface and a display screen; it can display a screen interface and can also receive touch actions.
The touch-sensitive display unit 130 provides an input interface and an output interface between the device and the user. The touch-sensitive display unit 130 can collect the user's touch operations on or near it, for example operations performed by the user on or near the touch-sensitive display unit with a finger 202, a joint, a stylus, or any other suitable object. The touch-sensitive display unit can detect touch actions on the touch-sensitive display unit, the grid capacitance values of the touch-sensitive display unit, and touch-point coordinates; send the touch action, grid capacitance values, and touch-point coordinate information to the processor 190; and receive and execute commands sent by the processor 190. The touch-sensitive display unit 130 displays visual output. Visual output may include graphics, text, icons, video, and any combination thereof (collectively referred to as "graphics"). In some embodiments, some or all of the visual output may correspond to user-interface objects.
The touch-sensitive display unit 130 may use LCD (liquid crystal display) technology, LPD (light-emitting polymer display) technology, or LED (light-emitting diode) technology, although other display technologies may be used in other embodiments. The touch-sensitive display unit 130 may detect contact and any movement or interruption thereof using any of a variety of touch-sensing technologies now known or later developed, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch-sensitive display unit 130. These touch-sensing technologies include, but are not limited to, capacitive, resistive, infrared, and surface acoustic wave technologies. In one exemplary embodiment, projected mutual-capacitance sensing technology is used.
The user may make contact with the touch-sensitive display unit 130 using any suitable object or appendage, such as a stylus, a finger, or a joint. In some embodiments, the user interface is designed to work primarily with joint-based contacts and gestures. In some embodiments, the device translates rough joint-based input into a precise pointer/cursor position or command to perform the action desired by the user.
In some embodiments, in addition to the touch-sensitive display unit, the device 100 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch-sensitive display unit, does not display visual output. The touchpad may be a touch-sensitive surface separate from the touch-sensitive display unit 130, or an extension of the touch-sensitive surface formed by the touch-sensitive display unit.
The acceleration sensor 151 can detect the magnitude of acceleration in each direction (generally along three axes). It can also detect the magnitude and direction of gravity when the terminal is stationary, and can be used in applications that recognize the phone's attitude (such as switching between landscape and portrait, related games, and magnetometer attitude calibration) and in vibration-recognition functions (such as pedometers and tapping). In the embodiments of the present invention, the acceleration sensor 151 is used to acquire the gravitational acceleration along the Z-axis produced when the user's touch action contacts the touch-sensitive display unit.
The electronic device 100 may also include one or more proximity light sensors 152 for turning off and disabling the touch function of the touch-sensitive surface when the electronic device 100 is close to the user (for example, near the ear during a phone call), to avoid accidental operation of the touch-sensitive display unit. The electronic device 100 may also include one or more ambient light sensors 153 for keeping the touch-sensitive display unit off when the electronic device 100 is in the user's pocket or another dark area, to prevent unnecessary battery consumption or accidental operation while the electronic device 100 is locked. In some embodiments, the proximity light sensor and the ambient light sensor may be integrated in a single component or provided as two separate components. The electronic device 100 may further be equipped with other sensors, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described here. Although FIG. 1 shows the proximity light sensor and the ambient light sensor, it will be understood that they are not essential components of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
The memory 120 may be used to store instructions and data. The memory 120 may mainly include an instruction-storage area and a data-storage area. The data-storage area can store the association relationships between joint touch gestures and application functions, as well as preset trajectory information. The instruction-storage area can store the operating system and the instructions required for at least one function. The instructions cause the processor 190 to perform the following method: when a joint touch gesture acting on the touch-sensitive surface is detected, identifying whether the user interface displayed by the display is a text application interface; if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, where the text selection area is located between a first endpoint and a second endpoint, the first endpoint is located at a first position in the text application interface, and the second endpoint at a second position; if the user interface displayed by the display is not a text application interface and a first application function is associated with the trajectory of the joint touch gesture, executing the first application function; and if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, and a second application function is associated with the trajectory of the joint touch gesture, executing the second application function.
The processor 190 is the control center of the electronic device 100. It connects the various parts of the entire phone through various interfaces and lines, and performs the various functions of the electronic device 100 and processes data by running or executing the instructions stored in the memory 120 and invoking the data stored in the memory 120, thereby monitoring the phone as a whole. Optionally, the processor 190 may include one or more processing units. Preferably, the processor 190 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, and application programs, and the modem processor mainly handles wireless communication. It will be understood that the modem processor need not be integrated into the processor 190. In some embodiments, the processor and the memory may be implemented on a single chip; in other embodiments, they may be implemented on separate chips. In the embodiments of the present invention, the processor 190 is further configured to invoke the instructions in the memory to implement: when a joint touch gesture acting on the touch-sensitive surface is detected, identifying whether the user interface displayed by the display is a text application interface; if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, where the text selection area is located between a first endpoint and a second endpoint, the first endpoint is located at a first position in the text application interface, and the second endpoint at a second position; if the user interface displayed by the display is not a text application interface and a first application function is associated with the trajectory of the joint touch gesture, executing the first application function; and if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, and a second application function is associated with the trajectory of the joint touch gesture, executing the second application function.
The radio-frequency unit 110 may be used to receive and transmit information, or to receive and transmit signals during a call; in particular, it receives downlink information from a base station and passes it to the processor 190 for processing, and sends uplink data to the base station. Typically, the RF circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier (LNA), and a duplexer. In addition, the radio-frequency unit 110 may also communicate with network devices and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to the Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, and the Short Messaging Service (SMS).
The audio circuit 160, speaker 161, and microphone 162 can provide an audio interface between the user and the electronic device 100. The audio circuit 160 can transmit an electrical signal converted from received audio data to the speaker 161, which converts it into a sound signal for output. Conversely, the microphone 162 converts collected sound signals into electrical signals, which are received by the audio circuit 160 and converted into audio data; after the audio data is output to the processor 190 for processing, it is sent via the radio-frequency unit 110 to, for example, another terminal, or output to the memory 120 for further processing. The audio circuit may also include a headset jack 163 to provide a connection interface between the audio circuit and a headset.
WiFi is a short-range wireless transmission technology. Through the WiFi module 170, the electronic device 100 can help the user send and receive e-mail, browse web pages, access streaming media, and so on; it provides the user with wireless broadband Internet access. Although FIG. 1 shows the WiFi module 170, it will be understood that it is not an essential component of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
Bluetooth is a short-range wireless communication technology. Bluetooth technology can effectively simplify communication between mobile communication terminal devices such as palmtop computers, notebook computers, and mobile phones, and can also successfully simplify communication between these devices and the Internet. Through the Bluetooth module 180, data transmission between the electronic device 100 and the Internet becomes faster and more efficient, broadening the road for wireless communication. Bluetooth technology is an open solution that enables wireless transmission of voice and data. Although FIG. 1 shows the Bluetooth module 180, it will be understood that it is not an essential component of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
The electronic device 100 also includes a power supply 193 (such as a battery) that supplies power to the various components. Preferably, the power supply may be logically connected to the processor 190 through a power management system 194, so that functions such as charging, discharging, and power-consumption management are handled through the power management system 194.
The electronic device 100 also includes an external interface 197, which may be a standard Micro USB interface or a multi-pin connector, and may be used to connect the electronic device 100 to other apparatuses for communication, or to connect a charger to charge the electronic device 100.
Although not shown, the electronic device 100 may also include a camera, a flash, and so on, which are not described here.
The method for selecting text is described below using the electronic device 100 as an example.
FIG. 2 is a schematic diagram of the external structure of the portable electronic device 100 according to an embodiment of the present invention. In this embodiment, the electronic device 100 may include a touch-sensitive display unit 130, an acceleration sensor 151, a volume control key 132, a power key 133, a microphone 162, a speaker 161, an external interface 197, and a headset jack 163. The touch-sensitive display unit 130 can display one or more graphics 300 in a user interface 200 and receive the user's touch input. Using the touch-sensitive display unit 130 as the main input or control device for operating the electronic device 100 can reduce the number of physical input or control devices on the electronic device 100. In this embodiment, the touch-sensitive display unit may serve as a "menu button"; in some other embodiments, the "menu button" may be a physical button or other physical input or control device. The acceleration sensor 151 is used to acquire the gravitational acceleration along the Z-axis of the user's touch action on the touch-sensitive display unit. Pressing the power key and holding it in the depressed state for a predetermined time interval turns the power of the electronic device 100 on or off; pressing the power key and releasing it before the predetermined time interval locks the electronic device 100. In other embodiments, voice input for activating some functions may also be received through the microphone 162.
FIG. 3 is a schematic diagram of displaying a text selection area according to an embodiment of the present invention. The text selection area 301 may be represented on the touch-sensitive display unit 130 as the text delimited by, and located between, a first endpoint 302a and a second endpoint 302b. A person skilled in the art will understand that the selected text area 301 can contain any portion of the text shown in FIG. 3; the text selected in FIG. 3 is merely an example. In addition, the first endpoint 302a may be associated with a first marker 303a, and the second endpoint 302b with a second marker 303b. The first marker 303a and the second marker 303b may indicate the positions of the first endpoint 302a and the second endpoint 302b, respectively. Because a marker is easier to grab than an endpoint, when the user wishes to move one or both of the first endpoint 302a and the second endpoint 302b to a new position, the user can do so by moving the marker associated with the endpoint to the new position. In some embodiments, the markers 303a and 303b may have other shapes, sizes, and colors; this embodiment is merely one example.
FIG. 4 is a flowchart of a method for selecting text according to an embodiment of the present invention.
Method 400 may be performed on a portable electronic device (for example, the electronic device 100 in FIG. 1 or FIG. 2) having a touch-sensitive display unit and multiple applications, including a text application. In some embodiments, some operations in method 400 may be combined, and/or the order of some operations may be changed.
As described below, method 400 provides a more efficient way to select text quickly. It helps the user select text in fewer operation steps, simplifying the text-selection procedure and improving the user experience.
The portable electronic device detects a joint touch gesture acting on the touch-sensitive surface (401).
For example, step 401 may include steps 4011 to 4013.
The portable electronic device detects a touch action acting on the touch-sensitive surface (4011); judges whether the touch action is a joint touch action (4012); and detects a joint touch gesture composed of the joint touch actions (4013).
After the joint touch gesture acting on the touch-sensitive surface is detected, the device identifies whether the user interface displayed by the display is a text application interface (402).
For example, the text application interface may display web browsing, e-mail, a notepad, instant messaging, a blog application, and so on.
If the user interface displayed by the display is not a text application interface, the device judges whether an application function is associated with the trajectory of the joint touch gesture (403). When an application function is associated with the trajectory of the joint touch gesture, the electronic device executes that application function (404).
If the user interface displayed by the display is a text application interface, the device judges whether the trajectory of the joint touch gesture matches a preset trajectory (405). When the trajectory of the joint touch gesture matches the preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture (406).
Optionally, if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, the device judges whether an application function is associated with the trajectory of the joint touch gesture, and executes that application function when one exists.
After the touch-sensitive display unit receives a touch action acting on the touch-sensitive surface, it passes the touch information to the processor. The touch information may include one or more of: touch-point coordinates, the grid capacitance values of the touch-sensitive display unit, and the touch action. Touch actions may include pressing, moving, and lifting.
In some embodiments, whether the touch action is a joint touch action can be judged based on the grid capacitance information and the Z-axis acceleration signal produced by the touch action (4012). The grid capacitance information of the touch-sensitive surface includes the grid capacitance values and the number of grid cells with non-zero capacitance.
When the grid capacitance values fall within a first preset capacitance range, the number of grid cells with non-zero capacitance is smaller than a preset value, and the Z-axis acceleration signal is within a first preset acceleration range, the touch action can be judged to be a joint touch action. When the grid capacitance values fall within a second preset capacitance range, the number of grid cells with non-zero capacitance is greater than or equal to the preset value, and the Z-axis acceleration signal is within a second preset acceleration range, the touch action can be judged to be a finger touch action.
For example, when the grid capacitance of the touch-sensitive display unit indicates a maximum capacitance within the first preset capacitance range (for example, less than or equal to 0.42 pF), the number of grid cells with non-zero capacitance is smaller than 7, and within a preset time the Z-axis acceleration signal is within the first preset acceleration range (for example, the acceleration signal exceeds 3g within 5 ms), the touch action can be judged to be a joint touch action. When the grid capacitance of the touch-sensitive display unit indicates a maximum capacitance within the second preset capacitance range (for example, greater than 0.42 pF and less than or equal to 0.46 pF), the number of grid cells with non-zero capacitance is greater than or equal to 7, and within the preset time the Z-axis acceleration signal is within the second preset acceleration range (for example, the acceleration signal stays below 2g within 5 ms, where g is the gravitational acceleration), the touch action can be judged to be a finger touch action. It will be understood that a joint touch action in the embodiments of the present invention is not necessarily triggered by a finger joint; it may also be another object striking the touch-sensitive display unit 130 at high speed, and any touch that satisfies the above judgment conditions counts as a joint touch action in the embodiments of the present invention.
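The joint/finger decision described above can be sketched as a small classifier. This is a minimal illustration, not the embodiment's firmware: the thresholds (0.42 pF, 0.46 pF, 7 cells, 3g/2g within the 5 ms window) are the example values from this paragraph, and the function and parameter names are hypothetical.

```python
# Sketch of the touch-type decision described above. Thresholds are the
# example values from the text; a real panel would calibrate them.

def classify_touch(max_capacitance_pf, nonzero_cells, peak_accel_g):
    """Classify a touch as 'joint', 'finger', or 'unknown'.

    max_capacitance_pf: largest grid-cell capacitance reading, in pF
    nonzero_cells:      number of grid cells with non-zero capacitance
    peak_accel_g:       peak Z-axis acceleration within the 5 ms window, in g
    """
    if max_capacitance_pf <= 0.42 and nonzero_cells < 7 and peak_accel_g > 3:
        return "joint"   # small, sharp contact: knuckle-like strike
    if 0.42 < max_capacitance_pf <= 0.46 and nonzero_cells >= 7 and peak_accel_g < 2:
        return "finger"  # larger, softer contact: fingertip
    return "unknown"     # neither preset range matched

print(classify_touch(0.40, 5, 3.5))  # joint
print(classify_touch(0.45, 9, 1.2))  # finger
```

A contact that satisfies neither preset range is deliberately left unclassified, mirroring the text's point that only touches meeting the joint conditions count as joint touch actions.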
A touch gesture may be composed of touch actions. For example, a tap gesture is composed of two touch actions, pressing and lifting; a swipe gesture is composed of three touch actions, pressing, moving, and lifting. After the touch action is judged to be a joint touch action, a joint touch gesture composed of the joint touch actions can be detected (4013). For example, a joint tap gesture is composed of two joint touch actions, pressing and lifting; a joint swipe gesture is composed of three joint touch actions, pressing, moving, and lifting.
In some embodiments, whether an application function is associated with the trajectory of the joint touch gesture is judged by looking up the association relationships between joint-touch-gesture trajectories and application functions stored in the memory 120 (403). A joint touch gesture may be composed of different joint touch actions; for example, a joint tap gesture is composed of the two joint touch actions pressing and lifting, and a joint swipe gesture of the three joint touch actions pressing, moving, and lifting. The movement trajectory between pressing and lifting is the trajectory of the joint touch gesture. The user may preset association relationships between the trajectories of multiple joint touch gestures and application functions, and save those association relationships in the memory 120.
For example, the user may preset an association between the joint-touch-gesture trajectory "C" and the camera application function, and save that association in the memory 120. When a joint touch gesture with trajectory "C" acting on the touch-sensitive surface is detected, the device can determine, by looking up the associations stored in the memory 120, that the trajectory "C" is associated with the camera application function.
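The lookup in step 403 amounts to a small associative table from recognized trajectories to application functions. A hedged sketch follows; the table contents and names are illustrative placeholders (only the "C" → camera pairing comes from the text), not the embodiment's actual storage format.

```python
# Sketch: associating recognized gesture trajectories with application
# functions, as in the trajectory "C" -> camera example. In the
# embodiment these associations live in the memory 120.

gesture_functions = {
    "C": "camera",   # from the text: trajectory "C" opens the camera
    "W": "browser",  # further hypothetical user-preset association
}

def function_for_trajectory(trajectory):
    """Return the application function associated with a trajectory, or None."""
    return gesture_functions.get(trajectory)

print(function_for_trajectory("C"))  # camera
print(function_for_trajectory("Z"))  # None: no associated function
```

Returning `None` for an unknown trajectory corresponds to the "no associated application function exists" branch, in which the gesture simply has no effect.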
In some embodiments, a pressing joint touch action is detected in region A of the touch-sensitive display unit, the contact then moves across the touch-sensitive display unit to region B, and a lifting joint touch action occurs in region B. This joint touch event — the joint pressing in region A, moving to region B, and then lifting — is a joint touch gesture. For example, the position of a touch action (region A or region B) can be determined from the touch-point coordinate information. The joint touch gesture may be composed of joint touch actions; for example, a joint tap gesture is composed of the two joint touch actions pressing and lifting, and a joint swipe gesture of the three joint touch actions pressing, moving, and lifting. Region A is the starting contact region between the joint touch gesture and the touch-sensitive display unit; region B is the ending contact region. The movement trajectory from region A to region B is the trajectory of the touch gesture. The electronic device compares the detected trajectory of the joint touch gesture with the preset trajectory to judge whether the two match (405). The preset trajectory information may be preset at the factory or set by the user, and may be stored in a memory (for example, the memory 120 in FIG. 1).
For example, taking a straight line as the preset trajectory, the line may be horizontal, vertical, or slanted (for example "—", "|", "/", or "\"). It should be noted that the preset trajectory may take other forms and may be adapted to specific design requirements; the straight-line example in the embodiments of the present invention does not limit the solution of the present invention.
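One simple way to realize the matching in step 405 against a straight-line preset trajectory is to check how far each sampled touch point strays from the chord joining the start contact region A to the end contact region B. The sketch below illustrates this idea under stated assumptions; the tolerance value and function name are hypothetical, and the patent does not prescribe a particular matching algorithm.

```python
import math

# Sketch of the trajectory-matching step (405): the sampled path from
# region A (first point) to region B (last point) is accepted as a
# "straight line" if no sample strays far from the A-B chord.

def is_straight_line(points, tolerance=10.0):
    """True if every sampled point lies within `tolerance` of the A-B chord."""
    (ax, ay), (bx, by) = points[0], points[-1]
    length = math.hypot(bx - ax, by - ay)
    if length == 0:
        return False  # no movement: a tap, not a swipe
    for (px, py) in points:
        # perpendicular distance from the point to the line through A and B
        dist = abs((bx - ax) * (ay - py) - (ax - px) * (by - ay)) / length
        if dist > tolerance:
            return False
    return True

horizontal = [(10, 100), (60, 102), (120, 99), (200, 100)]  # "—"-like path
curved = [(10, 100), (60, 160), (120, 160), (200, 100)]     # arcs away
print(is_straight_line(horizontal))  # True
print(is_straight_line(curved))      # False
```

Because only deviation from the chord is tested, the same check accepts horizontal, vertical, and slanted lines alike, which matches the "—", "|", "/", "\" examples above.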
FIG. 5 is an exemplary user interface in which the trajectory of the joint touch gesture is a horizontal line (for example "—") according to an embodiment of the present invention. For example, the joint touch position is first detected in region A (the starting contact region) and is subsequently detected to have moved to region B (the ending contact region); the trajectory of the joint touch gesture is the path from region A to region B, shown as the horizontal dashed line in the figure, with the arrow indicating the direction of movement from region A to region B. The electronic device compares the detected horizontal trajectory with the preset straight-line trajectory and judges that the horizontal trajectory matches the preset trajectory.
In some embodiments, if the trajectory of the joint touch gesture is judged to match the preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture (406). The text selection area is located between a first endpoint and a second endpoint; the first endpoint is located at a first position in the text application interface, and the second endpoint at a second position. For example, after the trajectory of the joint touch gesture is judged to match the preset trajectory, the first endpoint is inserted at the position of the starting contact region A between the joint touch gesture and the touch-sensitive display unit, and the second endpoint at the position of the ending contact region B. The insertion position of the first endpoint is the first position, which may be the beginning or end of the text character or word closest to the centroid of region A in the text application interface displayed by the display. The insertion position of the second endpoint is the second position, which may be the beginning or end of the text character or word closest to the centroid of region B in the text application interface displayed by the display. The text area located between the first endpoint and the second endpoint in the text application interface is the text selection area.
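The endpoint-snapping rule described here — each endpoint lands on the word boundary nearest the centroid of contact region A or B — can be sketched on a deliberately simplified text model where a contact centroid has already been mapped to a character position. The function names and the flat one-line text model are illustrative assumptions, not part of the embodiment.

```python
# Sketch of endpoint insertion (406): each endpoint snaps to the start
# or end of the word whose boundary is closest to the centroid of
# contact region A (first endpoint) or region B (second endpoint).

def word_boundaries(text):
    """Yield (start_index, word) pairs for each space-separated word."""
    index = 0
    for word in text.split(" "):
        yield index, word
        index += len(word) + 1  # +1 skips the separating space

def snap_to_word(text, char_pos):
    """Snap a character position to the nearest word start or end."""
    best = 0
    for start, word in word_boundaries(text):
        for boundary in (start, start + len(word)):
            if abs(boundary - char_pos) < abs(best - char_pos):
                best = boundary
    return best

text = "select this text easily"
first = snap_to_word(text, 8)    # centroid of region A falls mid-"this"
second = snap_to_word(text, 15)  # centroid of region B falls mid-"text"
print(text[first:second])        # the resulting text selection area
```

The selected slice runs between the two snapped boundaries, which is exactly the "text area located between the first endpoint and the second endpoint" above.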
FIG. 6 is an exemplary user interface of the text selection area obtained from the trajectory of the joint touch gesture shown in FIG. 5. The insertion position of the first endpoint 302a is the beginning or end of the text character or word closest to the centroid of the starting contact region A of the joint touch gesture illustrated in FIG. 5; the insertion position of the second endpoint 302b is the beginning or end of the text character or word closest to the centroid of the ending contact region B; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
FIG. 7 is an exemplary user interface in which the trajectory of the joint touch gesture is a vertical line (for example "|") according to an embodiment of the present invention. For example, the joint touch position is first detected in region A (the starting contact region) and subsequently moves to region B (the ending contact region); the trajectory of the joint touch gesture is the path from region A to region B, shown as the vertical dashed line in the figure, with the arrow indicating the direction of movement from region A to region B. The electronic device compares the detected vertical trajectory with the preset straight-line trajectory and judges that the vertical trajectory matches the preset trajectory.
FIG. 8 is an exemplary user interface of the text selection area obtained from the trajectory of the joint touch gesture shown in FIG. 7. The insertion position of the first endpoint 302a is the beginning or end of the text character or word closest to the centroid of the starting contact region A of the joint touch gesture illustrated in FIG. 7; the insertion position of the second endpoint 302b is the beginning or end of the text character or word closest to the centroid of the ending contact region B; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
FIG. 9 is an exemplary user interface in which the trajectory of the joint touch gesture is a slanted line (for example "/") according to an embodiment of the present invention. For example, the joint touch position is first detected in region A (the starting contact region) and subsequently moves to region B (the ending contact region); the trajectory of the joint touch gesture is the path from region A to region B, shown as the slanted dashed line in the figure, with the arrow indicating the direction of movement from region A to region B. The electronic device compares the detected slanted trajectory with the preset straight-line trajectory and judges that the slanted trajectory matches the preset trajectory.
FIG. 10 is an exemplary user interface of the text selection area obtained from the trajectory of the joint touch gesture shown in FIG. 9. The insertion position of the first endpoint 302a is the beginning or end of the text character or word closest to the centroid of the starting contact region A of the joint touch gesture illustrated in FIG. 9; the insertion position of the second endpoint 302b is the beginning or end of the text character or word closest to the centroid of the ending contact region B; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
FIG. 11 is an exemplary user interface in which the trajectory of the joint touch gesture is a slanted line (for example "\") according to an embodiment of the present invention. For example, the joint touch position is first detected in region A (the starting contact region) and subsequently moves to region B (the ending contact region); the trajectory of the joint touch gesture is the path from region A to region B, shown as the slanted dashed line in the figure, with the arrow indicating the direction of movement from region A to region B. The electronic device compares the detected slanted trajectory with the preset straight-line trajectory and judges that the slanted trajectory matches the preset trajectory.
FIG. 12 is an exemplary user interface of the text selection area obtained from the trajectory of the joint touch gesture shown in FIG. 11. The insertion position of the first endpoint 302a is the beginning or end of the text character or word closest to the centroid of the starting contact region A of the joint touch gesture illustrated in FIG. 11; the insertion position of the second endpoint 302b is the beginning or end of the text character or word closest to the centroid of the ending contact region B; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
Optionally, in any of the above embodiments, word-processing functions can further be performed on the text selection area.
FIG. 13 is an exemplary user interface for performing word-processing functions on the text selection area according to an embodiment of the present invention. The word-processing functions may include copy, cut, paste, translate, and so on. Selecting "More" can invoke further word-processing functions, which may include underlining the selected text, making the selected text bold, and changing the font, font size, and font color of the selected text. The order and presentation of the word-processing functions illustrated in FIG. 13 may be adjusted as appropriate to design needs.
The technical solution of the embodiments of the present invention discloses: when a joint touch gesture acting on the touch-sensitive surface is detected, identifying whether the user interface displayed by the display is a text application interface, and, if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying a text selection area on the text application interface in response to the joint touch gesture. The solution of the embodiments of the present invention simplifies the steps required to select text, thereby improving the user experience.
FIG. 14 is a simplified schematic diagram of the internal structure of an electronic device having a touch-sensitive display unit according to an embodiment of the present invention.
The functional blocks of the electronic device may be implemented in hardware, software, or a combination of hardware and software to carry out the principles of the present invention. A person skilled in the art will understand that the functional modules described in FIG. 14 may be combined, or separated into sub-modules, to implement the principles of the present invention described above. Accordingly, the description herein supports any possible combination, separation, or further definition of the functional modules described herein.
Based on the same inventive concept, since the electronic device and the apparatus solve the problem on principles similar to those of the method for selecting text in the embodiments of the present invention, their implementation may refer to the implementation of the method, and repeated details are not described again.
As shown in FIG. 14, the electronic device 1400 includes: a touch-sensitive display unit 130, an acceleration sensor 151, a memory 120, and a processor 190.
The touch-sensitive display unit 130 may be a display having a touch-sensitive surface, and includes a touch-sensitive surface and a display screen. The touch-sensitive display unit 130 is configured to display a screen interface, and to receive touch actions acting on the touch-sensitive surface and pass the touch information to the processor 190. The touch information may include one or more of: touch-point coordinates, the grid capacitance values of the touch-sensitive display unit, and the touch action; touch actions may include pressing, moving, and lifting.
The acceleration sensor 151 is configured to detect the Z-axis acceleration signal and pass the detected Z-axis acceleration signal to the processor 190.
The memory 120 stores instructions.
The processor 190 is coupled to the touch-sensitive display unit 130, the acceleration sensor 151, and the memory 120.
The processor 190 invokes the instructions stored in the memory 120 to implement: when a joint touch gesture acting on the touch-sensitive surface is detected, identifying whether the user interface displayed by the display is a text application interface; if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, where the text selection area is located between a first endpoint and a second endpoint, the first endpoint is located at a first position in the text application interface, and the second endpoint at a second position.
Optionally, if the user interface displayed by the display is not a text application interface and a first application function is associated with the trajectory of the joint touch gesture, the first application function is executed.
Optionally, if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, and a second application function is associated with the trajectory of the joint touch gesture, the second application function is executed.
FIG. 15 is a schematic diagram of the functional structure of an apparatus according to an embodiment of the present invention.
In some embodiments, the apparatus includes a detection unit 1501, an identification unit 1502, a judgment unit 1503, and a text selection unit 1504.
The detection unit 1501 is configured to detect a joint touch gesture acting on the touch-sensitive surface.
The identification unit 1502 is configured to identify whether the user interface displayed by the display is a text application interface.
The judgment unit 1503 is configured to judge whether the trajectory of the joint touch gesture matches a preset trajectory.
The text selection unit 1504 is configured to, when a joint touch gesture acting on the touch-sensitive surface is detected and the user interface displayed by the display is identified as a text application interface, if the trajectory of the joint touch gesture matches the preset trajectory, display, in response to the joint touch gesture, a text selection area on the text application interface, where the text selection area is located between a first endpoint and a second endpoint, the first endpoint is located at a first position in the text application interface, and the second endpoint at a second position.
Optionally, the apparatus further includes a first judgment unit 1506 and a first execution unit 1507.
The first judgment unit 1506 is configured to judge, when the user interface displayed by the display is not a text application interface, whether an application function is associated with the trajectory of the joint touch gesture.
The first execution unit 1507 is configured to execute the application function when the user interface displayed by the display is not a text application interface and an application function is associated with the trajectory of the joint touch gesture.
Optionally, the apparatus further includes a second judgment unit 1508 and a second execution unit 1509.
The second judgment unit 1508 is configured to judge, when the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, whether an application function is associated with the trajectory of the joint touch gesture.
The second execution unit 1509 is configured to execute the application function when the user interface displayed by the display is a text application interface, the trajectory of the joint touch gesture does not match the preset trajectory, and an application function is associated with the trajectory of the joint touch gesture.
In the embodiments provided in this application, a person of ordinary skill in the art will understand that all or part of the steps implementing the above embodiments are merely illustrative; they may be completed by hardware, or by a program instructing the relevant hardware, and when completed by a program instructing the relevant hardware, the program may be stored in a non-transitory computer-readable storage medium. Based on this understanding, the technical solution of the present invention in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments are merely intended to describe the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art will understand that modifications may still be made to the technical solutions recorded in the foregoing embodiments, or equivalent replacements may be made to some of their technical features; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (14)
- A method for selecting text, applied to a portable electronic device, the electronic device including a display having a touch-sensitive surface, wherein the method comprises: detecting a joint touch gesture acting on the touch-sensitive surface; identifying whether the user interface displayed by the display is a text application interface; if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, the text selection area being located between a first endpoint and a second endpoint; the first endpoint being located at a first position in the text application interface; and the second endpoint being located at a second position in the text application interface.
- The method according to claim 1, wherein the method further comprises: if the user interface displayed by the display is not a text application interface and a first application function is associated with the trajectory of the joint touch gesture, executing the first application function.
- The method according to claim 1, wherein the method further comprises: if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, and a second application function is associated with the trajectory of the joint touch gesture, executing the second application function.
- The method according to any one of claims 1 to 3, wherein the joint touch gesture is composed of joint touch actions; and when the grid capacitance values of the touch-sensitive surface produced by a touch action acting on the touch-sensitive surface fall within a first preset capacitance range, the number of grid cells with non-zero capacitance is smaller than a preset value, and the Z-axis acceleration signal is within a first preset acceleration range, the touch action is the joint touch action, and a gesture composed of the joint touch actions is the joint touch gesture.
- A portable electronic device, wherein the portable electronic device comprises: a display, the display having a touch-sensitive surface; an acceleration sensor, configured to acquire the acceleration in the Z-axis direction; a memory, configured to store instructions; and a processor, the processor invoking the instructions stored in the memory to: detect a joint touch gesture acting on the touch-sensitive surface; identify whether the user interface displayed by the display is a text application interface; if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, display, in response to the joint touch gesture, a text selection area on the text application interface, the text selection area being located between a first endpoint and a second endpoint; the first endpoint being located at a first position in the text application interface; and the second endpoint being located at a second position in the text application interface.
- The portable electronic device according to claim 5, wherein the instructions are further configured to: if the user interface displayed by the display is not a text application interface and a first application function is associated with the trajectory of the joint touch gesture, execute the first application function.
- The portable electronic device according to claim 5, wherein the instructions are further configured to: if the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, and a second application function is associated with the trajectory of the joint touch gesture, execute the second application function.
- The portable electronic device according to any one of claims 5 to 7, wherein the joint touch gesture is composed of joint touch actions; and when the grid capacitance values of the touch-sensitive surface produced by a touch action acting on the touch-sensitive surface fall within a first preset capacitance range, the number of grid cells with non-zero capacitance is smaller than a preset value, and the Z-axis acceleration signal is within a first preset acceleration range, the touch action is the joint touch action, and a gesture composed of the joint touch actions is the joint touch gesture.
- An apparatus, wherein the apparatus comprises: a detection unit, an identification unit, a judgment unit, and a text selection unit; the detection unit is configured to detect a joint touch gesture acting on a touch-sensitive surface; the identification unit is configured to identify whether the user interface displayed by a display is a text application interface; the judgment unit is configured to judge whether the trajectory of the joint touch gesture matches a preset trajectory; and the text selection unit is configured to, when a joint touch gesture acting on the touch-sensitive surface is detected, if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches the preset trajectory, display, in response to the joint touch gesture, a text selection area on the text application interface, the text selection area being located between a first endpoint and a second endpoint; the first endpoint being located at a first position in the text application interface; and the second endpoint being located at a second position in the text application interface.
- The apparatus according to claim 9, wherein the apparatus further comprises a first judgment unit and a first execution unit: the first judgment unit is configured to judge, when the user interface displayed by the display is not a text application interface, whether an application function is associated with the trajectory of the joint touch gesture; and the first execution unit is configured to execute a first application function when the first application function is associated with the trajectory of the joint touch gesture.
- The apparatus according to claim 9, wherein the apparatus further comprises a second judgment unit and a second execution unit: the second judgment unit is configured to judge, when the user interface displayed by the display is a text application interface but the trajectory of the joint touch gesture does not match the preset trajectory, whether an application function is associated with the trajectory of the joint touch gesture; and the second execution unit is configured to execute a second application function when the second application function is associated with the joint touch gesture.
- The apparatus according to any one of claims 9 to 11, wherein the joint touch gesture is composed of joint touch actions; and when the grid capacitance values of the touch-sensitive surface produced by a touch action acting on the touch-sensitive surface fall within a first preset capacitance range, the number of grid cells with non-zero capacitance is smaller than a preset value, and the Z-axis acceleration signal is within a first preset acceleration range, the touch action is the joint touch action, and a gesture composed of the joint touch actions is the joint touch gesture.
- A user interface on a portable electronic device, wherein the portable electronic device comprises a display, a memory, and a processor for executing instructions stored in the memory, the display having a touch-sensitive surface, and the user interface comprises: an interface for displaying a text application; and an interface in which, when a joint touch gesture acting on the touch-sensitive surface is detected, if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture, wherein the text selection area is located between a first endpoint and a second endpoint; the first endpoint is located at a first position in the text application interface; and the second endpoint is located at a second position in the text application interface.
- A non-transitory computer-readable storage medium storing one or more programs, wherein the one or more programs comprise instructions that, when executed by a portable electronic device including a display having a touch-sensitive surface, cause the portable electronic device to perform the following events: detecting a joint touch gesture acting on the touch-sensitive surface; identifying whether the user interface displayed by the display is a text application interface; if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, the text selection area being located between a first endpoint and a second endpoint; the first endpoint being located at a first position in the text application interface; and the second endpoint being located at a second position in the text application interface.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2015/088617 WO2017035739A1 (zh) | 2015-08-31 | 2015-08-31 | Method for selecting text |
CN201580030746.XA CN107924261B (zh) | 2015-08-31 | 2015-08-31 | Method for selecting text |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2015/088617 WO2017035739A1 (zh) | 2015-08-31 | 2015-08-31 | Method for selecting text |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2017035739A1 WO2017035739A1 (zh) | 2017-03-09 |
WO2017035739A9 true WO2017035739A9 (zh) | 2017-08-17 |
Family
ID=58186454
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2015/088617 WO2017035739A1 (zh) | 2015-08-31 | 2015-08-31 | 一种选择文本的方法 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN107924261B (zh) |
WO (1) | WO2017035739A1 (zh) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112115947A (zh) * | 2020-09-27 | 2020-12-22 | Beijing Xiaomi Mobile Software Co., Ltd. | Text processing method and device, electronic device, and storage medium |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8607167B2 (en) * | 2007-01-07 | 2013-12-10 | Apple Inc. | Portable multifunction device, method, and graphical user interface for providing maps and directions |
US8650507B2 (en) * | 2008-03-04 | 2014-02-11 | Apple Inc. | Selecting of text using gestures |
US8786556B2 (en) * | 2009-03-12 | 2014-07-22 | Nokia Corporation | Method and apparatus for selecting text information |
CN101859177B (zh) * | 2010-06-09 | 2012-10-17 | Tianjin Zhongke Jinlong Integrated Circuit Technology Co., Ltd. | Method and device for invoking and operating application programs on a smart electronic device |
KR101838260B1 (ko) * | 2011-06-03 | 2018-03-13 | Google LLC | Gestures for selecting text |
CN103365570B (zh) * | 2012-03-26 | 2016-12-14 | Huawei Technologies Co., Ltd. | Method and device for selecting content |
KR20140113119A (ko) * | 2013-03-15 | LG Electronics Inc. | Electronic device and control method thereof |
US10599250B2 (en) * | 2013-05-06 | 2020-03-24 | Qeexo, Co. | Using finger touch types to interact with electronic devices |
CN104360808A (zh) * | 2014-12-04 | 2015-02-18 | Li Fang | Method and device for editing a document using symbolic gesture instructions |
- 2015-08-31: CN application CN201580030746.XA filed; granted as CN107924261B (active)
- 2015-08-31: WO application PCT/CN2015/088617 filed as WO2017035739A1 (application filing)
Also Published As
Publication number | Publication date |
---|---|
CN107924261B (zh) | 2020-10-23 |
CN107924261A (zh) | 2018-04-17 |
WO2017035739A1 (zh) | 2017-03-09 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 15902546; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: PCT application non-entry in European phase | Ref document number: 15902546; Country of ref document: EP; Kind code of ref document: A1 |