CN113485590A - Touch operation method and device

Touch operation method and device

Info

Publication number
CN113485590A
Authority
CN
China
Prior art keywords
input
contact
sliding
identifier
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110413329.0A
Other languages
Chinese (zh)
Inventor
董川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110413329.0A priority Critical patent/CN113485590A/en
Publication of CN113485590A publication Critical patent/CN113485590A/en
Priority to PCT/CN2022/086662 priority patent/WO2022218352A1/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The application discloses a touch operation method and device, belonging to the field of communication technology. The method includes the following steps: receiving a first input; displaying a contact identifier on a display interface in response to the first input; receiving a second input in an operation area, where the operation area is the area of the display interface other than the area where the contact identifier is located; in response to the second input, controlling the contact identifier to move based on the second input; receiving a third input within the operation area; and, in response to the third input, executing the function corresponding to the third input at the end position of the contact identifier's movement.

Description

Touch operation method and device
Technical Field
The application belongs to the technical field of communication, and particularly relates to a touch operation method and device.
Background
Touch-controlled electronic devices such as mobile phones and tablet computers are ubiquitous in daily life. However, when a finger or a stylus is used to perform a touch operation on the display screen of an electronic device, and the display screen is small, the finger or stylus often blocks the user's line of sight, so the user cannot accurately guide it to the position on the screen where the touch operation is intended. This results in invalid touch operations and reduced touch operation efficiency.
Disclosure of Invention
An object of the embodiments of the present application is to provide a touch operation method and device that address the problem of low touch operation efficiency.
In a first aspect, an embodiment of the present application provides a touch operation method, where the method includes:
receiving a first input;
displaying a contact identifier on a display interface in response to the first input;
receiving a second input in an operation area, where the operation area is the area of the display interface other than the area where the contact identifier is located;
in response to the second input, controlling the contact identifier to move based on the second input;
receiving a third input within the operation area; and
in response to the third input, executing the function corresponding to the third input at the end position of the contact identifier's movement.
In a second aspect, an embodiment of the present application provides an operation control device, including:
a receiving module for receiving a first input;
a display module for displaying a contact identifier on a display interface in response to the first input;
the receiving module being further configured to receive a second input in an operation area, where the operation area is the area of the display interface other than the area where the contact identifier is located;
a control module for controlling, in response to the second input, the contact identifier to move based on the second input;
the receiving module being further configured to receive a third input within the operation area; and
an execution module for executing, in response to the third input, the function corresponding to the third input at the end position of the contact identifier's movement.
In a third aspect, an embodiment of the present application provides an electronic device including a processor, a memory, and a program or instructions stored in the memory and executable on the processor; when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In an embodiment of the application, the contact identifier is displayed on the display interface in response to the first input. In response to a second input received in the operation area, the contact identifier is moved based on the second input; in response to a third input received in the operation area, the function corresponding to the third input is executed at the end position of the contact identifier's movement. Because the operation area is the area of the display screen other than the area where the contact identifier is located, the operation area is at some distance from the contact identifier. Therefore, when the contact identifier is moved by a second input made with a finger or stylus in the operation area, the finger or stylus does not block the user's line of sight, and the user can accurately track the identifier as it moves. The contact identifier can thus be moved precisely to the position to be operated, and the function corresponding to the third input can be triggered precisely at that position. Compared with the related art, the method and device reduce invalid touch operations, lower the accidental-touch rate, improve touch operation efficiency, and improve the user experience.
Drawings
Fig. 1 is a flowchart of a touch operation method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a target pop-up interface provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of a shortcut operation interface provided in an embodiment of the present application;
fig. 4 is a flowchart of another touch operation method provided in the embodiment of the present application;
FIG. 5 is a first operation schematic diagram provided by an embodiment of the present application;
FIG. 6 is a second operation schematic diagram provided by an embodiment of the present application;
FIG. 7 is a third operation schematic diagram provided by an embodiment of the present application;
FIG. 8 is a fourth operation schematic diagram provided by an embodiment of the present application;
FIG. 9 is a fifth operation schematic diagram provided by an embodiment of the present application;
FIG. 10 is a sixth operation schematic diagram provided by an embodiment of the present application;
fig. 11 is a block diagram of a touch operation device according to an embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 13 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the drawings in the embodiments. The described embodiments are clearly some, but not all, of the embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein fall within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be appreciated that data so labeled may be interchanged under appropriate circumstances, so that embodiments of the application may be practiced in sequences other than those illustrated or described herein. The terms "first", "second", and the like also do not limit quantity; for example, a "first" item may be one or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects it joins.
The touch operation method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Touch-controlled electronic devices such as mobile phones and tablet computers are ubiquitous in daily life.
However, when a finger or a stylus is used to perform a fine-grained touch operation on the display screen of an electronic device, for example retouching details of a photo with photo-editing software, creating a detailed drawing with drawing software, or modifying content with file-editing software, a small display screen means the finger or stylus often blocks the user's line of sight, so the user cannot accurately bring the finger or stylus into contact with the position to be operated. The position to be operated is the position at which the user wants to perform the touch operation. The user can only estimate the approximate position by feel, move the control element there, and perform the touch operation. As a result, the user cannot reliably perform the touch operation at the intended position, which causes invalid touch operations, raises the accidental-touch rate, and reduces touch operation efficiency.
Referring to fig. 1, a flowchart of a touch operation method according to an embodiment of the present disclosure is shown. The touch operation method can be applied to a touch-controlled electronic device, that is, a terminal equipped with a touch display screen. For example, the terminal may be a mobile phone, a tablet computer, an e-book reader, smart glasses, a smart watch, a music player, a notebook computer, a desktop computer, or the like. As shown in fig. 1, the touch operation method may include:
Step 101: receiving a first input.
In the embodiment of the present application, the terminal may provide a precise operation mode. In the precise operation mode, the user can guide the finger or stylus to contact the position to be operated accurately, so that the touch operation is executed precisely at that position. The position to be operated is the position at which the user wants to perform a touch operation. For example, the display interface shows the phrase "大家好" ("Hello, everyone"), and the user wants to tap the character "家"; the position of "家" is then the position to be operated. The precise operation mode may include several touch operation modes, such as a click operation mode, a long-press operation mode, and a slide operation mode.
Optionally, the terminal may receive a first input on a settings page to start the precise operation mode. Alternatively, the settings page displays a precise-operation identifier, and the terminal receives a first input directed at that identifier to start the mode. Where there are several types of precise operation mode, the settings page may display identifiers corresponding one-to-one to the mode types. The settings page can be any page displayed by the terminal, such as a program page of a target application, a target pop-up interface, or a target floating-frame page. The first input may be a click, a long press, a swipe, a hover gesture, a voice input, or another type of input.
For example, the user may trace a "W" shape on the screen while it is off, and the terminal receives this sliding input. Alternatively, as shown in FIG. 2, the user may long-press anywhere in the program interface of the target application. After receiving the long-press input, the terminal displays the target pop-up interface 201, which contains an operation-bar area 202 in which a precise-operation identifier 203 is displayed; the operation-bar area 202 may be located at the right side of the target pop-up interface 201. The user can then click the precise-operation identifier, and the terminal receives a click input directed at it on the target pop-up interface. Alternatively, as shown in fig. 3, the user may slide down from the top of the touch display screen to perform a pull-down operation. After receiving the downward slide, the terminal displays the shortcut operation interface 301, which includes a precise-operation identifier 302, i.e., a shortcut entry for enabling the precise operation mode. The user clicks the identifier, and the terminal receives a click input directed at it on the shortcut operation interface.
Step 102: in response to the first input, displaying a contact identifier on the display interface.
In the embodiment of the application, after receiving the first input, the terminal may display the contact identifier on the display interface in response to the first input. The contact identifier may be, for example, a red dot icon, a finger icon, or a colored arrow icon. The terminal may display the contact identifier at an arbitrary position in the display interface, or at a preset position, such as the center of the touch display screen.
Optionally, the terminal may also display the contact identifier based on an input operation by the user. In that case, the terminal receives a sixth input in the display interface and, in response, displays the contact identifier at a position determined by the sixth input. The sixth input may be a click, a long press, a slide, a hover gesture, or a voice input.
For example, when the sixth input occurs at a single operation position, such as a click or a long press, the terminal displays the contact identifier at the operation position corresponding to the sixth input. When the sixth input passes through multiple operation positions, such as a slide or a hover gesture, the terminal displays the contact identifier at the end position of the sixth input; for a slide input, the end position is the position of the finger when it is lifted.
Specifically, the user may tap the screen within a set area around the position to be operated, i.e., near it. The terminal receives the click input in the display interface and, in response, displays the contact identifier at the corresponding tap position. Alternatively, the user may slide the finger or stylus on the touch display screen into the set area around the position to be operated; the terminal receives the slide input and, in response, displays the contact identifier at the slide's end position.
In this way, because the contact identifier can be displayed at a chosen position based on the sixth input, its display position can be controlled manually, which offers high flexibility. Moreover, the user can place the contact identifier near the position to be operated, which shortens the subsequent distance the identifier must be moved, reduces the number of follow-up movement operations, improves operating efficiency, and improves the user experience.
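The placement rule described above (single-position inputs place the identifier at the tap point; multi-position inputs place it at the gesture's end) can be sketched as a small helper. This is only an illustrative sketch, not the patent's implementation; the function name and the event-type encoding are assumptions.

```python
def identifier_position(event_type, positions):
    """Where to display the contact identifier for a 'sixth input':
    at the single operation position for a click or long press, or at
    the final position of a multi-position gesture such as a slide."""
    if event_type in ("click", "long_press"):
        return positions[0]   # single operation position
    return positions[-1]      # end position (finger-lift point)
```

For a slide recorded as a list of sampled points, the last sample corresponds to the finger-lift position.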
Step 103: receiving a second input in an operation area, where the operation area is the area of the display interface other than the area where the contact identifier is located.
In the embodiment of the application, after the terminal displays the contact identifier, the display interface can be divided into the area where the contact identifier is located and the operation area, the operation area being the rest. Optionally, the operation area may be the entire area other than the area where the contact identifier is located, or a set sub-area of it. The area where the contact identifier is located may be a circular area of a preset radius centered on the contact identifier, or a rectangular area of preset length and width centered on it. The operation area is otherwise unrestricted; the only requirement is that when the user operates in it with a finger or stylus, the finger or stylus does not block the line of sight to the contact identifier.
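A touch can be classified as falling inside the operation area by testing it against the identifier's own region. The sketch below assumes the circular-region variant with a preset radius; the function name and the 48-pixel default are illustrative, not from the patent.

```python
import math

def in_operation_area(touch, identifier_center, radius=48.0):
    """True if the touch point lies in the operation area, i.e. outside
    the circular area of the preset radius centered on the contact
    identifier; screen coordinates are in pixels."""
    dx = touch[0] - identifier_center[0]
    dy = touch[1] - identifier_center[1]
    return math.hypot(dx, dy) > radius
```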
Step 104: in response to the second input, controlling the contact identifier to move based on the second input.
Optionally, the second input may be a click, a long press, a slide, a hover gesture, a voice input, or another type of input. The following takes a slide input or a click input as an example to explain how the terminal controls the contact identifier's movement based on the second input.
In a first optional implementation, the second input is a slide input. The terminal obtains the sliding-direction information and the track-length information of the slide input, and, according to them, controls the contact identifier to move the track length along the sliding direction.
In the embodiment of the application, the terminal can acquire the sliding-direction and track-length information of the slide input in real time, and can therefore control the contact identifier's movement in real time. Optionally, while the user slides a finger or stylus on the screen, the terminal responds by acquiring the direction and track length in real time and moving the contact identifier accordingly. The contact identifier thus follows the finger or stylus in real time, so the user can see its position directly and steer it accurately to the position to be operated. This improves the visual feedback of the operation, the operating efficiency, and the user experience.
Optionally, to acquire the sliding-direction and track-length information in real time, the terminal may capture the start and end positions of the slide input. From the vector pointing from the start position to the end position, the sliding direction is the direction of the vector and the track length is its length.
For example, assume the origin of the terminal's screen coordinate system is the first pixel in the upper-left corner of the display screen, the X axis runs along the pixel rows, and the Y axis runs along the pixel columns. Suppose the terminal captures the start position A of the slide input at screen coordinates (10, 10) and the end position B at (14, 7). The vector from the start position to the end position is AB = B − A = (4, −3). Its length is |AB| = √(4² + (−3)²) = 5, i.e., the track length is 5 pixels. If the direction of the vector is expressed as the angle it makes with the positive X axis, that angle is arcsin(3/5) ≈ 36.87°, so the sliding direction makes an angle of 36.87° with the positive X axis. The terminal then moves the contact identifier 5 pixels in that direction.
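The worked example above reduces to basic vector arithmetic. The following sketch computes the sliding direction (as an angle from the positive X axis) and the track length from the start and end positions; note the Y sign flip, since screen Y increases downward. The function name and return convention are illustrative assumptions.

```python
import math

def slide_vector(start, end):
    """Sliding direction (degrees from the positive X axis) and track
    length (pixels) of a slide input, in a screen coordinate system
    with the origin at the top-left and Y increasing downward."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    length = math.hypot(dx, dy)
    # Negate dy so the angle is measured in conventional math orientation.
    angle = math.degrees(math.atan2(-dy, dx))
    return angle, length
```

For A = (10, 10) and B = (14, 7) this yields a track length of 5 pixels and an angle of about 36.87°, matching the example above.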
Optionally, to move the contact identifier the track length along the sliding direction, the terminal may compute a target movement distance from the track length using a preset scaling ratio between the track length and the identifier's movement distance, and then move the contact identifier that target distance along the sliding direction.
That is, the identifier's movement distance and the track length are related by a preset scaling ratio. When the ratio is 1:1, the target movement distance equals the track length of the slide input, and the terminal moves the contact identifier by exactly the track length.
When the ratio of the identifier's movement distance to the track length is smaller than 1, the computed target movement distance is reduced relative to the slide's track length by the preset scaling. The identifier then moves less than the finger or stylus does, which makes fine movements easier to control: in particular, when the identifier's starting position is close to the position to be operated, a long slide of the finger or stylus moves the identifier only a short distance, improving accuracy. Of course, the ratio may also be greater than 1; this embodiment of the application does not limit it.
For example, assume the track length is P pixels. If the preset scaling of track length to identifier movement distance is 2:1, the target movement distance corresponding to the track length is P/2 pixels; if it is 1:1, the target movement distance is P pixels; and if it is 1:2, the target movement distance is 2P pixels.
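The scaling step can be sketched as a one-line mapping. The ratio is expressed here as track length : movement distance, matching the example above; the function name and default value are assumptions, not part of the patent.

```python
def target_move_distance(track_length, track_to_move_ratio=2.0):
    """Map a slide's track length (pixels) to the contact identifier's
    target movement distance, using the preset scaling ratio expressed
    as track length : movement distance. Ratios above 1 slow the
    identifier down for fine positioning."""
    return track_length / track_to_move_ratio
```

With a track length of P = 10 pixels, a 2:1 ratio moves the identifier 5 pixels, 1:1 moves it 10, and 1:2 (ratio 0.5) moves it 20.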
Optionally, the terminal may also acquire the sliding-speed information of the slide input. In that case, the terminal controls the contact identifier, according to the direction, track-length, and speed information, to move the track length along the sliding direction at the sliding speed. The contact identifier and the finger or stylus then move at the same speed along the same track, i.e., they move in parallel while keeping a constant separation. The user can therefore control the identifier's movement directly with the finger or stylus and see its position intuitively, steering it accurately to the position to be operated. This improves the visual feedback, the operating efficiency, and the user experience.
Optionally, the process by which the terminal acquires the sliding speed information of the sliding input may include: the terminal collects the start position, the end position, and the sliding time of the sliding input, and obtains the sliding speed from the length of the vector pointing from the start position to the end position and the sliding time, thereby obtaining the sliding speed information.
Illustratively, continuing with the above example, assume that the terminal collects the start position A, the end position B, and the sliding time T of the sliding input. The sliding speed is V = |AB| / T, where |AB| is the length of the vector pointing from A to B. The terminal then controls the contact identifier to move the track length along the sliding direction at the speed |AB| / T.
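The speed computation above can be sketched as follows; the function name and the tuple-based pixel positions are assumptions for illustration, not part of the specification.

```python
import math

def sliding_speed(start, end, slide_time):
    """V = |AB| / T: the length of the vector from start position A to
    end position B, divided by the sliding time T."""
    return math.hypot(end[0] - start[0], end[1] - start[1]) / slide_time

# A slide from (0, 0) to (3, 4) over 1 second gives a speed of 5 px/s.
```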
In a second alternative implementation, the second input is a click input. The process of the terminal controlling the movement of the contact identifier based on the second input may include: the terminal acquires the target distance between the click position of the click input and the display position of the contact identifier, as well as the moving direction, and controls the contact identifier to move the target distance in the moving direction. The moving direction may be the direction pointing from the display position to the click position.
Optionally, the process of the terminal controlling the contact identifier to move the target distance in the moving direction may include: the terminal calculates the target moving distance corresponding to the target distance of the click input according to the preset scaling of the target distance to the moving distance of the contact identifier, and controls the contact identifier to move the target moving distance along the moving direction. For the explanation and implementation of this calculation, reference may be made to the explanation and implementation, in the aforementioned first optional implementation, of the terminal calculating the target moving distance corresponding to the track length of the sliding input according to the preset scaling of the track length to the moving distance of the contact identifier, which is not described in detail in this embodiment of the application.
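A minimal sketch of this click-driven movement, assuming positions are 2-D pixel coordinates; the function and parameter names are illustrative, not from the specification.

```python
import math

def move_toward_click(display_pos, click_pos, ratio_target, ratio_move):
    """Move the contact identifier from its display position toward the
    click position by the scaled target distance
    (target distance : move distance = ratio_target : ratio_move)."""
    dx = click_pos[0] - display_pos[0]
    dy = click_pos[1] - display_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return display_pos  # already at the click position
    moved = dist * ratio_move / ratio_target
    # Step along the unit vector from display position to click position.
    return (display_pos[0] + dx / dist * moved,
            display_pos[1] + dy / dist * moved)
```

With a 2:1 scaling, clicking 10 pixels to the right of the identifier moves it 5 pixels toward the click.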
It should be noted that, before receiving the third input in the operation area, the terminal may receive the second input multiple times in the operation area, and control the movement of the contact identifier based on the second input in response to each second input received.
Optionally, the method further includes: the terminal receives another second input in the operation area and, in response to the other second input, controls the contact identifier to continue moving based on that input.

After receiving the other second input in the operation area, the terminal may control the contact identifier, based on that input, to continue moving from the termination position of its movement under the previous second input. That is, before the terminal receives the third input in the operation area, the user may lift the finger or stylus from the touch display screen after performing one second input, and then perform another second input in the operation area, so that the terminal receives the second input again and, in response, continues to move the contact identifier based on it.

For example, after receiving a first second input in the operation area, the terminal controls the contact identifier to move to position a in response to that input. After receiving another second input, the terminal controls the contact identifier to start moving from position a in response to that input. It should be noted that, for the explanation and implementation of the terminal controlling the contact identifier to continue moving based on the second input, reference may be made to the explanation and implementation of the terminal controlling the movement of the contact identifier based on the second input in step 104, which is not described in detail in this embodiment of the application.
Step 105, receiving a third input in the operation area.
In the embodiment of the present application, when the user considers that the contact identifier has reached the desired position to be operated, the user may perform the third input in the operation area, and the terminal may receive the third input there. The third input is of a different input type than the second input, which enables the terminal to distinguish whether the function corresponding to the third input needs to be executed at the termination position of the contact identifier's movement. Optionally, the third input may be a click, long press, slide, hover gesture, or voice input, among other input types.
And step 106, responding to the third input, and executing the function corresponding to the third input at the termination position of the contact identification movement.
In this embodiment, the function corresponding to the third input may be an operation function, or a function set according to the actual situation. When the third input is a different type of input, the function corresponding to the third input may differ, thereby implementing different types of accurate operation modes.
For example, when the third input is two click operations within a set duration, that is, a double-click operation, the function corresponding to the third input may be to trigger a click event at the termination position of the contact identifier's movement, so as to execute a real click operation at that position, thereby implementing the click operation mode of the accurate operation mode. When the third input is a long-press input, the function corresponding to the third input may be to trigger a long-press event at the termination position of the contact identifier's movement, so as to execute a real long-press operation at that position, implementing the long-press operation mode of the accurate operation mode.
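The mapping from the third input's type to the event triggered at the termination position can be sketched as a simple dispatch table; the string keys and event names here are invented for illustration and are not from the specification.

```python
def function_for_third_input(input_type: str) -> str:
    """Map the third input's type to the event triggered at the
    termination position of the contact identifier's movement."""
    events = {
        "double_click": "click_event",     # real click at the position
        "long_press": "long_press_event",  # real long press at the position
    }
    return events.get(input_type, "no_op")
```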
Alternatively, third inputs of the same type may correspond to different functions. Optionally, accurate operation identifiers corresponding one-to-one to the different accurate operation mode types are displayed on the setting page, and the function corresponding to the third input can be determined according to the accurate operation identifier corresponding to the first input.

For example, if the first input received by the terminal corresponds to a target accurate operation identifier, and the target accurate operation identifier corresponds to the click operation mode of the accurate operation mode, then the function corresponding to the third input is to trigger a click event at the termination position of the contact identifier's movement, so as to execute a real click operation at that position, thereby implementing the click operation mode of the accurate operation mode.
In summary, the touch operation method provided by the embodiment of the application displays the contact identifier on the display interface in response to the first input, controls the movement of the contact identifier based on the second input in response to the second input received in the operation area, and executes the function corresponding to the third input at the termination position of the contact identifier's movement in response to the third input received in the operation area. Because the operation area is the area of the display screen other than the area where the contact identifier is located, it is at a certain distance from the contact identifier. Therefore, when the finger or stylus moves the contact identifier through the second input in the operation area, it does not block the user's line of sight, and the user can accurately track the position to which the contact identifier moves. The contact identifier can thus be accurately moved to the position to be operated, and the finger or stylus can accurately trigger the function corresponding to the third input at that position. Compared with the related art, the method reduces invalid touch operations, reduces the rate of mistaken touches, improves touch operation efficiency, and improves the user experience.
Please refer to fig. 4, which shows a flowchart of another touch operation method according to an embodiment of the present application. This touch operation method can also be applied to a touch electronic device. As shown in fig. 4, the touch operation method may include:
step 401, receive a first input.
For the explanation and implementation of this step, reference may be made to the explanation and implementation of step 101, which is not described in detail in this embodiment of the application.
Step 402, responding to the first input, and displaying a contact point identification on a display interface.
For the explanation and implementation of this step, reference may be made to the explanation and implementation of step 102, which is not described in detail in this embodiment of the application.
And 403, receiving a second input in an operation area, wherein the operation area is an area except for the area where the contact mark is located in the display interface.
For the explanation and implementation of this step, reference may be made to the explanation and implementation of step 103, which is not described in detail in this embodiment of the application.
Step 404, in response to the second input, controlling the contact identification movement based on the second input.
For the explanation and implementation of this step, reference may be made to the explanation and implementation of step 104, which is not described in detail in this embodiment of the application.
Step 405, a third input is received within the operational area.
For the explanation and implementation of this step, reference may be made to the explanation and implementation of step 105, which is not described in detail in this embodiment of the application.
And step 406, responding to the third input, and executing a function corresponding to the third input at the termination position of the contact identification movement.
For the explanation and implementation of this step, reference may be made to the explanation and implementation of step 106, which is not described in detail in this embodiment of the application.
Optionally, the accurate operation mode may further include a trajectory drawing mode. A third input of a specific input type may correspond to the trajectory drawing mode. If the terminal receives a third input of the specific input type in step 405, the function of the third input in step 406 is to determine the drawing start point position.

Alternatively, the setting page may display an accurate operation identifier corresponding to the trajectory drawing mode, namely the trajectory drawing identifier. In the case where the first input responded to in step 401 is a first input for the trajectory drawing identifier, the function of the third input in step 406 is to determine the drawing start point position. The process of the terminal executing, in step 406, the function corresponding to the third input at the termination position of the contact identifier's movement may include: recording the termination position of the contact identifier's movement. After step 406, the method further includes:
step 407, a fourth input is received in the operation area.
Alternatively, the fourth input may be a click, long press, swipe, hover gesture, or voice input, among other types of input. Wherein the fourth input is of a different input type than the third input. For example, the fourth input may be the same input type as the second input.
And step 408, responding to the fourth input, controlling the contact identifier to move based on the fourth input, drawing a setting pattern with the termination position as the drawing start point according to the moving track of the contact identifier, and displaying the setting pattern.
In this embodiment of the application, for the explanation and implementation of the terminal controlling the movement of the contact identifier based on the fourth input, reference may be made to the explanation and implementation of the terminal controlling the movement of the contact identifier based on the second input in step 404, which is not described in detail in this embodiment of the application.
In this embodiment, in response to the fourth input, the terminal may draw the setting pattern with the termination position as the drawing start point according to the movement trajectory of the contact identifier movement controlled based on the fourth input, and display the setting pattern. Alternatively, the setting pattern may include a selection frame pattern, a line pattern, or other shapes or other types of patterns. The selection frame pattern may include a rectangular selection frame pattern or a circular selection frame pattern.
Optionally, before the terminal receives the fourth input in the operation area in step 407, the method further includes: the terminal displays pattern selection information, which may include pattern identifiers corresponding to a plurality of types of patterns. The terminal receives a seventh input on a target pattern identifier among the plurality of pattern identifiers and, in response to the seventh input, selects the type of the setting pattern. The terminal then draws a setting pattern of the selected type with the termination position as the drawing start point in response to the fourth input. The seventh input may be a click, long press, slide, hover gesture, or voice input.
For example, the pattern selection information may include a pattern identifier corresponding to the selection block pattern and a pattern identifier corresponding to the line pattern. And after receiving click input of the pattern identifier corresponding to the selection frame pattern, the terminal responds to the click input and selects the subsequently drawn set pattern as the selection frame pattern.
The present embodiment will be described below by taking a setting pattern as a selection frame pattern and a line pattern as examples.
In a first alternative implementation, the setting pattern is a selection frame pattern. The process of the terminal drawing the setting pattern with the termination position as the drawing start point according to the moving track of the contact identifier may include: drawing, with the termination position as the start point, a selection frame pattern whose diagonal is the line connecting the termination position and the current moving track point.
Optionally, in the case where the selection frame pattern is a rectangular selection frame pattern, the terminal may acquire the position of the moving track point of the contact identifier in real time while controlling the contact identifier to move based on the fourth input, calculate the positions of the four vertices forming the target rectangle from the termination position and the position of the moving track point acquired in real time, and connect the four vertices to draw the selection frame pattern. The target rectangle is the rectangle whose diagonal is the line connecting the termination position and the position of the moving track point. When the acquired position of the moving track point of the contact identifier changes, the selection frame pattern drawn according to the position before the change is deleted.
For example, assume that the pixel coordinate of the recorded termination position is (1, 1). While the terminal controls the contact identifier to move based on the fourth input, it acquires in real time the pixel coordinate of the first moving track point of the contact identifier as (2, 2). The pixel coordinates of the four vertices forming the target rectangle are calculated, in order, as (1,1), (1,2), (2,2), and (2,1), and the four vertices are connected to draw a first selection frame pattern. The pixel coordinate of the second moving track point of the contact identifier is then acquired in real time as (2, 3). The pixel coordinates of the four vertices forming the target rectangle are calculated, in order, as (1,1), (1,3), (2,3), and (2,1). The four vertices are connected to draw a second selection frame pattern, and the first selection frame pattern is deleted.
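The vertex computation in this example can be sketched as follows; the function name and point representation are assumptions for illustration.

```python
def rect_vertices(p_end, p_track):
    """Four vertices of the target rectangle whose diagonal runs from the
    recorded termination position p_end to the current moving track point
    p_track, listed in drawing order."""
    (x1, y1), (x2, y2) = p_end, p_track
    return [(x1, y1), (x1, y2), (x2, y2), (x2, y1)]

# As the track point changes, the previously drawn frame is deleted and
# rect_vertices is re-evaluated with the new point.
```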
In a second alternative implementation, the setting pattern is a line pattern. The process of the terminal drawing the setting pattern with the termination position as the drawing start point according to the moving track of the contact identifier may include: drawing a line pattern of the moving track.
Optionally, while controlling the movement of the contact identifier based on the fourth input, the terminal may acquire in real time the position of each moving track point in the moving track of the contact identifier, and connect the track points at these positions, starting from the termination position, to draw the line pattern.
For example, assume that the pixel coordinate of the recorded termination position is (1, 1). While the terminal controls the contact identifier to move based on the fourth input, it acquires in real time the pixel coordinate of the first moving track point in the contact identifier's moving track as (2, 2), and draws a line pattern connecting the track points at (1,1) and (2,2). The pixel coordinate of the second moving track point is then acquired in real time as (2, 3), and the line pattern is drawn connecting the track points at (1,1), (2,2), and (2,3).
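Accumulating the polyline can be sketched with a small helper class; the class and method names are hypothetical, not from the specification.

```python
class LineTrace:
    """Accumulate moving track points from the drawing start point and
    expose the polyline drawn so far."""
    def __init__(self, start):
        self.points = [start]

    def add_point(self, p):
        # Called each time a new track point position is acquired in
        # real time while the contact identifier moves.
        self.points.append(p)

    def polyline(self):
        return list(self.points)
```

Replaying the example above: starting at (1, 1), adding (2, 2) and then (2, 3) yields the three-point polyline that is drawn.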
It should be noted that, before receiving the fifth input in the operation area, the terminal may receive the fourth input multiple times in the operation area, and control the movement of the contact identifier based on the fourth input in response to each fourth input received.
Optionally, in the foregoing second optional implementation, before the terminal receives the fifth input in the operation area and executes the function corresponding to the setting pattern, the method further includes: the terminal receives another fourth input in the operation area and, in response to it, controls the contact identifier to continue moving based on that input, continuing to draw the line pattern of the moving track according to the track of the contact identifier's continued movement.

Here, the terminal controlling the contact identifier to continue moving based on the other fourth input may mean that, after the terminal has moved the contact identifier a certain distance based on one fourth input, it receives another fourth input and, in response, controls the contact identifier to continue moving from the stop position based on that input. The stop position is the position at which the contact identifier stopped after the terminal moved it the certain distance based on the earlier fourth input.

For example, assume that the pixel coordinate of the recorded termination position is (1, 1), and the terminal controls the contact identifier to move to the position with pixel coordinate (1, 2) based on one fourth input. The terminal then receives another fourth input and, in response, controls the contact identifier to continue moving from the position with pixel coordinate (1, 2).
Step 409, a fifth input is received within the operational area.
In the embodiment of the present application, when the user considers that the desired pattern end position has been reached, the user may perform the fifth input in the operation area, so that the terminal can receive the fifth input there. The fifth input is of a different input type than the fourth input, which enables the terminal to distinguish whether the function corresponding to the setting pattern needs to be executed. Optionally, the fifth input may be a click, long press, slide, hover gesture, or voice input, among other input types.
And step 410, responding to a fifth input, and executing a function corresponding to the set pattern.
In the embodiment of the present application, the function corresponding to the set pattern may be an operation function such as a box selection function or a drawing display function. Alternatively, the function corresponding to the setting pattern may be a function set according to actual conditions.
Optionally, different types of setting patterns may correspond to different functions. For example, when the setting pattern is a selection frame pattern, the function corresponding to the setting pattern may be a box selection function; when the setting pattern is a line pattern, the function corresponding to the setting pattern may be a drawing display function, that is, a function of displaying the drawn line pattern. Alternatively, different types of setting patterns may correspond to the same function. For example, the function corresponding to the setting pattern may be a selection function, that is, a function of using the setting pattern to select the information the user wants. In response to the fifth input, the content of the display-interface area where the setting pattern is located is selected, and the terminal may display executable operation options for the selected content, such as a delete option or a copy option, making it convenient for the user to operate on the selected content subsequently.
For example, when the setting pattern is a selection frame pattern, the function corresponding to the setting pattern may be a box selection function. Suppose the user wants to box-select the phrase "you are happy". After the terminal executes step 406, the recorded termination position of the contact identifier is at the upper left corner of the first word of the phrase. Assume that the fourth input is a slide input and the fifth input is a double-click input. The user slides the finger or stylus toward the lower right within the operation area. The terminal receives the slide input and, in response, controls the contact identifier to slide synchronously in the sliding direction of the slide input. When the user finds that the contact identifier has reached the lower right corner of the last word of the phrase, the phrase "you are happy" lies inside the selection frame pattern, and the user determines that the contact identifier has reached the desired position. The user then double-clicks the screen within the operation area. The terminal receives the double-click input and, in response, box-selects "you are happy".
In this embodiment of the present application, before step 405 and/or step 409, the method further includes: displaying prompt information when the contact identifier is detected to have stopped moving. The prompt information is used to prompt the user with the executable inputs and their corresponding functions. The executable inputs may include at least one of the second input and the third input.

For example, after step 402 and before step 405, assume that the second input is a slide input, the third input is a double-click input, and the function corresponding to the double-click input is a click operation. When the contact identifier is detected to have stopped moving, prompt information may be displayed, such as: "You can continue to slide the screen to control the movement of the contact identifier. Or, you can double-click the screen to end the contact identifier moving operation and execute the click operation."

Similarly, if the function of the third input is to determine the drawing start point position, then after step 406 and before step 409, assume that the fourth input is a slide input, the fifth input is a double-click input, and the function corresponding to the double-click input is the drawing display function. When the contact identifier is detected to have stopped moving, prompt information may be displayed, such as: "You can continue to slide the screen to control the movement of the contact identifier and draw the movement track. Or, you can double-click the screen to end the contact identifier moving operation and display the track."
For the convenience of the reader to understand, the embodiment of the present application explains the foregoing touch operation method again by using the following two examples.
In a first example, the user wants to select a phrase from a small-font article displayed on the terminal. Suppose the user wants to select the phrase "high-level security service" outlined in the display interface shown in FIG. 5, where selection box 501 is used to select the phrase. To do so, the user needs to click the starting position of the phrase, i.e., the position of the word "high", and then click the ending position of the phrase, i.e., the position of its last word, to complete the selection. Assume that the first input is a click input, the second input is a slide input, the third input is two click inputs within a set duration (i.e., a double-click input), the contact identifier is a red dot icon, and the function corresponding to the third input is to trigger a click event at the termination position of the contact identifier's movement, so as to execute a real click operation at that position.
The user can slide downwards from the top of the touch display screen, so that the terminal displays the shortcut operation interface. The shortcut operation interface comprises an accurate operation identifier corresponding to a click operation mode. And clicking the accurate operation identification corresponding to the click operation mode by the user, so that the terminal opens the click operation mode in the accurate operation mode.
The user can move a finger freely over the touch display screen and finally click the screen at an initial position A near the word "high" in the phrase "high-level security service" displayed on the display interface; that is, the finger first lands near the word "high". The terminal receives the click input and displays a red dot icon at the click position (initial position A). At this time, as shown in fig. 6, the user can see a red dot icon 601 displayed at the initial position A, to the upper right of the word "high". As shown in fig. 7 and 8, the user slides a finger toward the lower left corner of the touch display screen (in the direction indicated by the arrow in fig. 8), within the operation area to the lower right of the area where the red dot icon is located. After receiving the slide input, the terminal, in response, controls the red dot icon to move from the initial position A toward the lower left corner in synchronization with the finger's slide. The user can lift the finger at any time to end the slide operation, and the red dot icon stays displayed at the termination position of that operation. The user can then slide the finger within the operation area again, and the terminal, after receiving the slide input, controls the red dot icon to continue moving in response.
When the user finds that the red dot icon has moved to the position of the word "high", the user can double-click the screen in the operation area. The terminal receives the double-click input in the operation area and, in response, performs a click operation at the termination position of the contact identifier's movement, i.e., the position of the word "high" where the red dot icon is currently displayed. At this point, the user has completed clicking the starting position of the phrase "high-level security service". The user may repeat the above process so that the terminal performs the click operation again at the position of the phrase's last word, completing the selection of the phrase "high-level security service".
As a second example, the user wants to draw a circular arc pattern starting from position point B, and the function corresponding to the circular arc pattern is a drawing display function. Assume that the first input is a click input, the second input is a slide input, the third input is a double-click input, the contact identifier is a red dot icon, the function corresponding to the third input is to record the trajectory drawing start point, the fourth input is a slide input, and the fifth input is a double-click input.
The user can slide downwards from the top of the touch display screen, so that the terminal displays the shortcut operation interface. The shortcut operation interface comprises a track drawing identifier corresponding to a track drawing mode. And clicking the track drawing identification by the user, so that the terminal starts a track drawing mode in the accurate operation mode.
The user can move a finger freely over the touch display screen and click the screen at an initial position C near position point B. The terminal receives the click input and displays a red dot icon at the click position (initial position C). At this time, as shown in fig. 9, the user can see a red dot icon 901 displayed at the initial position C near position point B. The user slides a finger toward the lower right corner of the touch display screen, within the operation area to the lower right of the area where the red dot icon is located. After receiving the slide input, the terminal, in response, controls the red dot icon 901 to move from the initial position C toward the lower right corner in synchronization with the finger's slide.
The user finds that the red dot icon has moved to position point B and double-clicks the screen in the operation area. The terminal receives the double-click input in the operation area and, in response, records position point B. As shown in fig. 10, the user starts at any position within the operation area and slides a finger in the direction shown by the arrow in fig. 10, which is the arc direction of the circular arc pattern the user wants. After receiving the slide input, the terminal, in response, controls the red dot icon to move from position B in the arrow's direction in synchronization with the finger's slide, and draws and displays the line pattern of the moving track with position B as the drawing start point. The user can lift the finger at any time to end the slide operation, and the red dot icon stays displayed at the termination position of that operation. The user can then slide the finger within the operation area again; after receiving the slide input, the terminal, in response, controls the red dot icon to continue moving and continues drawing the line pattern of the moving track starting from the termination position of the previous slide operation.
When the user finds that the drawn line pattern is the desired circular arc pattern, the user decides to finish drawing the line pattern. The user can double-click the screen in the operation area, so that the terminal exits the track drawing mode of the accurate operation mode. The terminal receives the double-click input in the operation area and, in response, displays the drawn line pattern.
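The walkthrough above can be sketched as a minimal Python model of the track drawing mode. All class, method, and data-type names below are illustrative assumptions; the embodiment prescribes no concrete API. A double-click records the drawing start point, each slide appends a stroke that continues from the previous termination position, and a second double-click finishes the pattern:

```python
class TrackDrawingMode:
    """Hypothetical model of the track drawing mode described above."""

    def __init__(self, start_point):
        # Position point B, recorded when the first double-click is received.
        self.pen = start_point    # current position of the contact identifier
        self.strokes = []         # drawn line patterns, one polyline per slide
        self.finished = False

    def on_slide(self, deltas):
        """Handle one sliding input as a sequence of (dx, dy) finger moves.

        The identifier moves in synchronization with the finger, and the line
        pattern of the moving track starts where the previous stroke ended."""
        stroke = [self.pen]
        x, y = self.pen
        for dx, dy in deltas:
            x, y = x + dx, y + dy
            stroke.append((x, y))
        self.pen = (x, y)         # identifier stays at the termination position
        self.strokes.append(stroke)

    def on_double_click(self):
        """Second double-click: exit the drawing mode, keep the drawn pattern."""
        self.finished = True
        return self.strokes
```

For example, two successive slides starting from B = (120, 80) produce two strokes, the second beginning exactly at the first one's termination position, as in the walkthrough.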
In summary, the touch operation method provided by the embodiment of the application displays a contact identifier on the display interface in response to a first input, controls the contact identifier to move based on a second input received in the operation area, and executes a function corresponding to a third input, received in the operation area, at the termination position of the contact identifier's movement. Because the operation area is the area of the display screen other than the area where the contact identifier is located, the operation area keeps a certain distance from the contact identifier. Therefore, when a finger or a stylus performs the second input in the operation area to move the contact identifier, the finger or stylus does not block the user's line of sight, and the user can accurately see where the contact identifier moves. The contact identifier can thus be accurately moved to the position to be operated, and the finger or stylus can accurately trigger the function corresponding to the third input at that position. Compared with the related art, the method reduces invalid touch operations, lowers the mis-touch rate, improves touch operation efficiency, and improves user experience.
It should be noted that, in the touch operation method provided in the embodiment of the present application, the execution subject may be a touch operation device, or a control module in the touch operation device for executing the touch operation method. In the embodiment of the present application, the touch operation device executing the touch operation method is taken as an example to describe the touch operation device provided in the embodiment of the present application.
Referring to fig. 11, a block diagram of a touch operation device according to an embodiment of the present application is shown. As shown in fig. 11, the touch operation device 1100 includes: a receiving module 1101, a display module 1102, a control module 1103, and an execution module 1104.
a receiving module 1101, configured to receive a first input;
a display module 1102, configured to display a contact identifier on a display interface in response to the first input;
the receiving module 1101 is further configured to receive a second input in an operation area, where the operation area is the area of the display interface other than the area where the contact identifier is located;
a control module 1103, configured to control, in response to the second input, the contact identifier to move based on the second input;
the receiving module 1101 is further configured to receive a third input in the operation area;
and an execution module 1104, configured to execute, in response to the third input, a function corresponding to the third input at the termination position of the contact identifier's movement.
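As a purely illustrative sketch of how the receiving, display, control, and execution roles above could cooperate (the embodiment defines no API, so every name, the identifier radius, and the callback convention below are assumptions), the operation area is taken here as everything on the display outside a small region around the contact identifier:

```python
class PreciseTouchController:
    """Hypothetical model of the receive/display/control/execute flow."""

    IDENTIFIER_RADIUS = 40  # assumed pixel radius of the displayed identifier

    def __init__(self, perform_at):
        self.identifier = None        # (x, y) once the first input is received
        self.perform_at = perform_at  # function run at the termination position

    def on_first_input(self, pos):
        # Display role: show the contact identifier on the display interface.
        self.identifier = pos

    def _in_operation_area(self, touch):
        # The operation area is the display interface minus the identifier's
        # own region, so the finger or stylus never covers the identifier.
        dx = touch[0] - self.identifier[0]
        dy = touch[1] - self.identifier[1]
        return dx * dx + dy * dy > self.IDENTIFIER_RADIUS ** 2

    def on_second_input(self, touch, delta):
        # Control role: move the identifier based on the second input.
        if self.identifier is not None and self._in_operation_area(touch):
            self.identifier = (self.identifier[0] + delta[0],
                               self.identifier[1] + delta[1])

    def on_third_input(self, touch):
        # Execution role: run the function at the termination position.
        if self.identifier is not None and self._in_operation_area(touch):
            return self.perform_at(self.identifier)
```

For instance, touching at (300, 400), well inside the operation area, nudges an identifier shown at (100, 100) without the finger ever occluding it, while a touch on the identifier's own region is ignored.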
Optionally, the second input comprises a sliding input; the control module 1103 is further configured to:
acquire sliding direction information and track length information of the sliding input;
and control the contact identifier to move the track length along the sliding direction according to the sliding direction information and the track length information.
Optionally, the control module 1103 is further configured to:
calculate, according to a preset scaling between the track length and the moving distance of the contact identifier, a target moving distance corresponding to the track length of the sliding input;
and control the contact identifier to move the target moving distance along the sliding direction.
Optionally, the control module 1103 is further configured to:
acquire sliding speed information of the sliding input;
and control the contact identifier, according to the sliding direction information, the track length information and the sliding speed information, to move the track length along the sliding direction at the sliding speed.
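The movement computation described in the optional blocks above can be sketched numerically. This is a minimal illustration, not the embodiment's implementation: the function name, the default scaling, and the return shape are assumptions. The slide vector yields the sliding direction and track length information, the preset scaling maps track length to a target moving distance, and the sliding speed fixes how long the move takes so the identifier adopts the finger's speed:

```python
import math

def move_contact_identifier(position, slide_vector, scale=0.5, slide_speed=None):
    """Compute the contact identifier's move for one sliding input.

    position:     current (x, y) of the contact identifier
    slide_vector: (dx, dy) of the finger's slide inside the operation area
    scale:        preset scaling between slide track length and identifier
                  movement (scale < 1 gives finer positioning)
    slide_speed:  measured sliding speed in px/s, if available

    Returns (new_position, target_distance, duration).
    """
    dx, dy = slide_vector
    track_length = math.hypot(dx, dy)              # track length information
    if track_length == 0.0:
        return position, 0.0, 0.0
    target_distance = track_length * scale         # target moving distance
    ux, uy = dx / track_length, dy / track_length  # sliding direction information
    new_position = (position[0] + ux * target_distance,
                    position[1] + uy * target_distance)
    # Moving target_distance at the finger's sliding speed takes this long:
    duration = target_distance / slide_speed if slide_speed else 0.0
    return new_position, target_distance, duration
```

A 50 px slide at scale 0.5 moves the identifier 25 px in the same direction; at a sliding speed of 200 px/s the move takes 0.125 s.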
Optionally, the display module 1102 is further configured to:
when it is detected that the contact identifier stops moving, display prompt information, where the prompt information is used to prompt the user with the executable operations and the functions corresponding to those operations, and the executable operations comprise at least one of the following: the second input and the third input.
Optionally, a trajectory drawing identifier is displayed, and the first input comprises a first input for the trajectory drawing identifier; the execution module 1104 is further configured to record the termination position of the contact identifier's movement.
The receiving module 1101 is further configured to receive a fourth input in the operation area.
The control module 1103 is further configured to control, in response to the fourth input, the contact identifier to move based on the fourth input.
The device further includes: a drawing module, configured to draw a set pattern, with the termination position as a drawing starting point, according to the movement track of the contact identifier.
The display module 1102 is further configured to display the set pattern.
The receiving module 1101 is further configured to receive a fifth input in the operation area.
The execution module 1104 is further configured to execute, in response to the fifth input, a function corresponding to the set pattern.
Optionally, the drawing module is further configured to draw, with the termination position as a starting point, a selection box pattern whose diagonal is the movement track, or to draw a line pattern of the movement track.
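The selection box variant can be sketched as a small Python helper (the function name and the (x, y, width, height) return convention are assumptions): the recorded termination position and the end of the movement track are opposite corners of the box, i.e. the track is its diagonal:

```python
def selection_box_from_diagonal(start, end):
    """Axis-aligned selection box whose diagonal is the movement track.

    start: recorded termination position (the drawing start point)
    end:   end of the contact identifier's movement track
    Returns (x, y, width, height) of the selection box.
    """
    (x0, y0), (x1, y1) = start, end
    left, top = min(x0, x1), min(y0, y1)
    width, height = abs(x1 - x0), abs(y1 - y0)
    return (left, top, width, height)
```

Dragging from (50, 120) up and to the right to (200, 40) selects the 150 by 80 box anchored at (50, 40); the result is the same whichever corner the drag starts from.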
To sum up, in the touch operation device provided by the embodiment of the application, the display module displays a contact identifier on the display interface in response to a first input; the control module controls the contact identifier to move based on a second input received in the operation area; and the execution module executes a function corresponding to a third input, received in the operation area, at the termination position of the contact identifier's movement. Because the operation area is the area of the display screen other than the area where the contact identifier is located, the operation area keeps a certain distance from the contact identifier. Therefore, when a finger or a stylus performs the second input in the operation area to move the contact identifier, the finger or stylus does not block the user's line of sight, and the user can accurately see where the contact identifier moves. The contact identifier can thus be accurately moved to the position to be operated, and the finger or stylus can accurately trigger the function corresponding to the third input at that position. Compared with the related art, the device reduces invalid touch operations, lowers the mis-touch rate, improves touch operation efficiency, and improves user experience.
The touch operation device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. For example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the like. The non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like; the embodiments of the present application are not specifically limited in this respect.
The touch operation device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The touch operation device provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 and fig. 4, and is not described here again to avoid repetition.
Optionally, as shown in fig. 12, an electronic device 1200 is further provided in an embodiment of the present application, including a processor 1201, a memory 1202, and a program or instruction stored in the memory 1202 and executable on the processor 1201, where the program or instruction, when executed by the processor 1201, implements each process of the above touch operation method embodiment and can achieve the same technical effect, which is not described again here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Referring to fig. 13, fig. 13 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of the present application. As shown in fig. 13, the electronic device 1300 includes, but is not limited to: a radio frequency unit 1301, a network module 1302, an audio output unit 1303, an input unit 1304, a sensor 1305, a display unit 1306, a user input unit 1307, an interface unit 1308, a memory 1309, a processor 1310, and the like.
Those skilled in the art will appreciate that the electronic device 1300 may further comprise a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 1310 via a power management system, so that charging, discharging, and power-consumption management functions are handled by the power management system. The electronic device structure shown in fig. 13 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, combine some components, or arrange the components differently, which is not described again here.
The processor 1310 is configured to receive a first input.
The display unit 1306 is configured to display a contact identifier on the display interface in response to the first input.
The processor 1310 is further configured to receive a second input in an operation area, where the operation area is the area of the display interface other than the area where the contact identifier is located; control, in response to the second input, the contact identifier to move based on the second input; receive a third input in the operation area; and execute, in response to the third input, a function corresponding to the third input at the termination position of the contact identifier's movement.
The electronic device displays a contact identifier on the display interface in response to the first input, controls the contact identifier to move based on the second input received in the operation area, and executes the function corresponding to the third input, received in the operation area, at the termination position of the contact identifier's movement. Because the operation area is the area of the display screen other than the area where the contact identifier is located, the operation area keeps a certain distance from the contact identifier. Therefore, when a finger or a stylus performs the second input in the operation area to move the contact identifier, the finger or stylus does not block the user's line of sight, and the user can accurately see where the contact identifier moves. The contact identifier can thus be accurately moved to the position to be operated, and the finger or stylus can accurately trigger the function corresponding to the third input at that position. Compared with the related art, this reduces invalid touch operations, lowers the mis-touch rate, improves touch operation efficiency, and improves user experience.
Optionally, the second input comprises a sliding input; the processor 1310 is further configured to acquire sliding direction information and track length information of the sliding input, and to control the contact identifier to move the track length along the sliding direction according to the sliding direction information and the track length information.
Optionally, the processor 1310 is further configured to calculate, according to a preset scaling between the track length and the moving distance of the contact identifier, a target moving distance corresponding to the track length of the sliding input, and to control the contact identifier to move the target moving distance along the sliding direction.
Optionally, the processor 1310 is further configured to acquire sliding speed information of the sliding input, and to control the contact identifier, according to the sliding direction information, the track length information and the sliding speed information, to move the track length along the sliding direction at the sliding speed.
Optionally, the processor 1310 is further configured to display prompt information when it is detected that the contact identifier stops moving, where the prompt information is used to prompt the user with the executable operations and the functions corresponding to those operations, and the executable operations comprise at least one of the following: the second input and the third input.
Optionally, a trajectory drawing identifier is displayed, and the first input comprises a first input for the trajectory drawing identifier; the processor 1310 is further configured to record the termination position of the contact identifier's movement.
The processor 1310 is further configured to receive a fourth input in the operation area and, in response to the fourth input, control the contact identifier to move based on the fourth input and draw a set pattern, with the termination position as a drawing starting point, according to the movement track of the contact identifier.
The display unit 1306 is further configured to display the set pattern.
The processor 1310 is further configured to receive a fifth input in the operation area and, in response to the fifth input, execute a function corresponding to the set pattern.
Optionally, the processor 1310 is further configured to draw, with the termination position as a starting point, a selection box pattern whose diagonal is the movement track, or to draw a line pattern of the movement track.
It should be understood that, in the embodiment of the present application, the input unit 1304 may include a graphics processing unit (GPU) 13041 and a microphone 13042; the graphics processor 13041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The display unit 1306 may include a display panel 13061, and the display panel 13061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1307 includes a touch panel 13071, also referred to as a touch screen, and other input devices 13072. The touch panel 13071 may include two parts: a touch detection device and a touch controller. Other input devices 13072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 1309 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 1310 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and so on, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 1310.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the process of the embodiment of the touch operation method is implemented, and the same technical effect can be achieved, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the embodiment of the touch operation method, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may be performed in a substantially simultaneous manner or in a reverse order, the methods described may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (14)

1. A touch operation method is characterized by comprising the following steps:
receiving a first input;
displaying a contact point identification on a display interface in response to the first input;
receiving a second input in an operation area, wherein the operation area is an area except for the area where the contact mark is located in the display interface;
in response to the second input, controlling the contact identification movement based on the second input;
receiving a third input within the operational area;
and responding to the third input, and executing the function corresponding to the third input at the termination position of the contact identification movement.
2. The method of claim 1, wherein the second input comprises a slide input; the controlling the contact identification movement based on the second input includes:
acquiring sliding direction information and track length information of the sliding input;
and controlling the contact mark to move the track length along the sliding direction according to the sliding direction information and the track length information.
3. The method according to claim 2, wherein the controlling the contact point identifier to move along the sliding direction by the track length according to the sliding direction information and the track length information comprises:
calculating to obtain a target moving distance corresponding to the track length of the sliding input according to a preset scaling of the track length and the moving distance of the contact mark;
and controlling the contact mark to move the target moving distance along the sliding direction.
4. The method of claim 2, wherein the controlling the contact identification movement based on the second input further comprises: acquiring sliding speed information of the sliding input;
the controlling the contact mark to move the track length along the sliding direction according to the sliding direction information and the track length information includes: and controlling the contact mark to adopt a sliding speed according to the sliding direction information, the track length information and the sliding speed information, and moving the track length along the sliding direction.
5. The method of claim 1, wherein prior to receiving a third input within the operational area, the method further comprises:
when the contact mark is detected to stop moving, displaying prompt information, wherein the prompt information is used for prompting a user to execute operations and functions corresponding to the operations, and the executable operations comprise at least one of the following operations: the second input, the third input.
6. The method of claim 1, wherein a trajectory drawing identifier is displayed, and wherein the first input comprises a first input for the trajectory drawing identifier; the executing the function corresponding to the third input at the termination position of the contact identification movement comprises: recording the moving termination position of the contact point identification;
the method further comprises the following steps:
receiving a fourth input within the operational area;
responding to the fourth input, controlling the contact point identifier to move based on the fourth input, drawing a set pattern by taking the termination position as a drawing starting point according to the movement track of the contact point identifier, and displaying the set pattern;
receiving a fifth input within the operational area;
and responding to the fifth input, and executing a function corresponding to the set pattern.
7. The method according to claim 6, wherein the drawing a set pattern with the end position as a drawing start point according to the movement track of the contact point identifier comprises:
drawing a selection frame pattern taking the moving track as a diagonal line by taking the ending position as a starting point;
or drawing a line pattern of the moving track.
8. A touch-operated device, the device comprising:
a receiving module for receiving a first input;
the display module is used for responding to the first input and displaying the contact identification on a display interface;
the receiving module is further configured to receive a second input in an operation area, where the operation area is an area of the display interface other than the area where the contact identifier is located;
a control module for controlling the contact identifier movement based on the second input in response to the second input;
the receiving module is further used for receiving a third input in the operation area;
and the execution module is used for responding to the third input and executing the function corresponding to the third input at the termination position of the contact identification movement.
9. The apparatus of claim 8, wherein the second input comprises a slide input; the control module is further configured to:
acquiring sliding direction information and track length information of the sliding input;
and controlling the contact mark to move the track length along the sliding direction according to the sliding direction information and the track length information.
10. The apparatus of claim 9, wherein the control module is further configured to:
calculating to obtain a target moving distance corresponding to the track length of the sliding input according to a preset scaling of the track length and the moving distance of the contact mark;
and controlling the contact mark to move the target moving distance along the sliding direction.
11. The apparatus of claim 9, wherein the control module is further configured to:
acquiring sliding speed information of the sliding input;
and controlling the contact mark to adopt a sliding speed according to the sliding direction information, the track length information and the sliding speed information, and moving the track length along the sliding direction.
12. The apparatus of claim 8, wherein the display module is further configured to:
when the contact mark is detected to stop moving, displaying prompt information, wherein the prompt information is used for prompting a user to execute operations and functions corresponding to the operations, and the executable operations comprise at least one of the following operations: the second input, the third input.
13. The apparatus of claim 8, wherein a trajectory drawing identifier is displayed, and wherein the first input comprises a first input for the trajectory drawing identifier; the execution module is further configured to: recording the moving termination position of the contact point identification;
the receiving module is further used for receiving a fourth input in the operation area;
the control module is further used for responding to the fourth input and controlling the contact point identification to move based on the fourth input;
the device further comprises: the drawing module is used for drawing a set pattern by taking the termination position as a drawing starting point according to the movement track of the contact mark;
the display module is also used for displaying the set pattern;
the receiving module is further configured to receive a fifth input in the operation area;
the execution module is further configured to execute a function corresponding to the set pattern in response to the fifth input.
14. The apparatus of claim 13, wherein the rendering module is further configured to:
drawing a selection frame pattern taking the moving track as a diagonal line by taking the ending position as a starting point;
or drawing a line pattern of the moving track.
CN202110413329.0A 2021-04-16 2021-04-16 Touch operation method and device Pending CN113485590A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110413329.0A CN113485590A (en) 2021-04-16 2021-04-16 Touch operation method and device
PCT/CN2022/086662 WO2022218352A1 (en) 2021-04-16 2022-04-13 Method and apparatus for touch operation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110413329.0A CN113485590A (en) 2021-04-16 2021-04-16 Touch operation method and device

Publications (1)

Publication Number Publication Date
CN113485590A true CN113485590A (en) 2021-10-08

Family

ID=77932963

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110413329.0A Pending CN113485590A (en) 2021-04-16 2021-04-16 Touch operation method and device

Country Status (2)

Country Link
CN (1) CN113485590A (en)
WO (1) WO2022218352A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022218352A1 (en) * 2021-04-16 2022-10-20 维沃移动通信有限公司 Method and apparatus for touch operation

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019588A (en) * 2012-11-26 2013-04-03 中兴通讯股份有限公司 Touch positioning method, device and terminal
CN105892786A (en) * 2015-01-16 2016-08-24 张凯 Method for achieving text selection on touchscreen interface
JP6313395B1 (en) * 2016-10-17 2018-04-18 グリー株式会社 Drawing processing method, drawing processing program, and drawing processing apparatus
CN113791710A (en) * 2017-09-05 2021-12-14 华为终端有限公司 Method for controlling cursor movement, content selection method, method for controlling page scrolling and electronic equipment
CN108132752B (en) * 2017-12-21 2020-02-18 维沃移动通信有限公司 Text editing method and mobile terminal
CN113485590A (en) * 2021-04-16 2021-10-08 维沃移动通信有限公司 Touch operation method and device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022218352A1 (en) * 2021-04-16 2022-10-20 维沃移动通信有限公司 Method and apparatus for touch operation

Also Published As

Publication number Publication date
WO2022218352A1 (en) 2022-10-20

Similar Documents

Publication Publication Date Title
KR102230708B1 (en) User termincal device for supporting user interaxion and methods thereof
US11487426B2 (en) Apparatus and method for cursor control and text selection and editing based on gesture-based touch inputs received in a virtual keyboard display area
US10437360B2 (en) Method and apparatus for moving contents in terminal
US8654076B2 (en) Touch screen hover input handling
KR102367838B1 (en) Device, method, and graphical user interface for managing concurrently open software applications
US8276101B2 (en) Touch gestures for text-entry operations
KR102462364B1 (en) Method of displaying an image by using a scroll bar and apparatus thereof
US10579248B2 (en) Method and device for displaying image by using scroll bar
EP3002664A1 (en) Text processing method and touchscreen device
US20150154444A1 (en) Electronic device and method
US9836211B2 (en) Device, method, and graphical user interface for selection of views in a three-dimensional map based on gesture inputs
US10521101B2 (en) Scroll mode for touch/pointing control
JP2015007949A (en) Display device, display controlling method, and computer program
CN105824531A (en) Method and device for adjusting numbers
US20140079317A1 (en) Electronic apparatus and handwritten document processing method
CN108475172B (en) Information display method and device and terminal equipment
US20150205483A1 (en) Object operation system, recording medium recorded with object operation control program, and object operation control method
WO2022218352A1 (en) Method and apparatus for touch operation
CN108021313B (en) Picture browsing method and terminal
CN107613077A (en) A kind of method for controlling mobile phone screen
JP2015022524A (en) Terminal device and system
CN110262747B (en) Method and device for controlling terminal, terminal and storage medium
CN112764623B (en) Content editing method and device
CN113360035B (en) Icon sorting method and device and electronic equipment
CN115358251A (en) Graphic code identification method, graphic code identification device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination