CN114816213A - Operation identification method and device, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN114816213A
CN114816213A (application number CN202210467561.7A)
Authority
CN
China
Prior art keywords
touch
information
touch point
edge
point
Prior art date
Legal status
Pending
Application number
CN202210467561.7A
Other languages
Chinese (zh)
Inventor
潘山
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202210467561.7A priority Critical patent/CN114816213A/en
Publication of CN114816213A publication Critical patent/CN114816213A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412 Digitisers structurally integrated in a display
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment


Abstract

The application discloses an operation identification method and apparatus, an electronic device, and a readable storage medium, belonging to the field of electronic technology. The method comprises the following steps: acquiring touch information of a touch input, where the touch information includes touch position information and the acquisition time of the touch position information; in a case where the touch information satisfies a preset condition, determining position information of an additional touch point on the touch screen according to the touch information; and identifying the operation corresponding to the touch input according to the touch information and the position information of the additional touch point.

Description

Operation identification method and device, electronic equipment and readable storage medium
Technical Field
The application belongs to the field of electronic technology and in particular relates to an operation identification method and apparatus, an electronic device, and a readable storage medium.
Background
With the development of electronic devices, gesture navigation functions are now supported in electronic devices to make them easier to use. Gesture navigation, as the name implies, realizes the function corresponding to each operation gesture a user performs on the touch screen. For example, when an operation gesture sliding upward from the bottom edge of the touch screen is detected, the electronic device can be triggered to display the desktop. With gesture navigation, operation becomes more convenient.
However, when a finger slides inward from the edge of the touch screen, the sensors near the edge may register only a low sensing value because the slide is fast; the value may not reach the touch-point threshold, so no touch point is detected at the edge. Alternatively, the initial touch point of the touch input may simply lie a certain distance from the edge of the touch screen. In either case, the actually identified initial touch point may be far from the edge of the touch screen, and the user operation may not be accurately identified.
Disclosure of Invention
The embodiments of the present application aim to provide an operation identification method that can solve the problem of low operation-identification accuracy in the prior art.
In a first aspect, an embodiment of the present application provides an operation identification method, which is applied to an electronic device, where the electronic device includes a touch screen, and the method includes:
acquiring touch information of touch input; the touch information comprises touch position information and acquisition time of the touch position information;
determining position information of an additional touch point in the touch screen according to the touch information under the condition that the touch information meets a preset condition;
and identifying the operation corresponding to the touch input according to the touch information and the position information of the additional touch point.
In a second aspect, an embodiment of the present application provides an operation recognition apparatus, which is applied to an electronic device, where the electronic device includes a touch screen, and the apparatus includes:
the acquisition module is used for acquiring touch information of touch input; the touch information comprises touch position information and acquisition time of the touch position information;
the determining module is used for determining the position information of the additional touch point in the touch screen according to the touch information under the condition that the touch information meets the preset condition;
and the identification module is used for identifying the operation corresponding to the touch input according to the touch information and the position information of the additional touch point.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, stored on a storage medium, for execution by at least one processor to implement the method according to the first aspect.
In this way, in the embodiments of the application, touch information of the touch input is acquired. When the touch information satisfies the preset condition, the touch input is an operation gesture that, when executed by the user, can trigger a preset function. Position information of an additional touch point is then determined on the touch screen according to the touch information: the user's actual operation intention is inferred from the touch information, and an additional touch point is supplemented on the touch screen. The operation determined jointly from the touch information and the position information of the additional touch point is more accurate and better matches the operation the user expected the touch input to realize, so the accuracy of operation identification can be improved.
Drawings
Fig. 1 is a flowchart of an operation identification method according to an embodiment of the present application;
fig. 2 is a schematic diagram of an additional touch point provided in the embodiment of the present application;
fig. 3 is a block diagram of an operation recognition apparatus according to an embodiment of the present application;
fig. 4 is one of the hardware configuration diagrams of the electronic device according to the embodiment of the present application;
fig. 5 is a second schematic diagram of a hardware structure of the electronic device according to the embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be described clearly below with reference to the drawings of the embodiments. Obviously, the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments that a person of ordinary skill in the art can derive from the embodiments of the present application fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish similar elements and do not necessarily describe a particular sequence or chronological order. It should be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, "first", "second", and the like do not limit the number of objects; for example, a first object can be one object or more than one. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and following objects.
With the popularization of electronic devices and the development of touch screen technology, users generally interact with electronic devices through touch screens in the actual use process.
The touch screen may be an inductive display device capable of receiving touch input from a finger or a contact object. Currently, touch screens generally adopt capacitive-screen technology. When a user performs a touch input on the touch screen, the capacitance value of the capacitive sensor arranged in the touch screen changes, and the processor compares the change in capacitance with a reporting threshold to detect which point is touched. This process is called reporting a touch point, or reporting a point for short.
In order to be more convenient for users to use, the electronic equipment supports the function of gesture navigation. And gesture navigation, namely realizing a preset function corresponding to an operation gesture according to different operation gestures of a user on the touch screen.
For example: when an operation gesture sliding upward from the bottom edge of the touch screen is detected, the electronic device can be triggered to display the desktop; when an operation gesture sliding upward from the bottom edge of the touch screen and then staying is detected, the electronic device can be triggered to display background applications; when an operation gesture sliding inward from the left edge of the touch screen is detected, the electronic device can be triggered to display the previous-level interface. With gesture navigation, the navigation bar is omitted, operation is more convenient, and the user experience is improved.
However, when a finger slides inward from the edge of the touch screen, the sensors in the edge area detect only a low sensing value, which may not reach the touch threshold. As a result, the initial touch point of the touch input is far from the edge of the touch screen, and the user operation cannot be accurately identified.
For example, suppose the user's intention is to trigger the electronic device to display the desktop, and the user performs an operation gesture of sliding upward from the bottom edge of the touch screen. The touch information reported after the sensor detects the finger touching the touch screen may include five touch points P1, P2, P3, P4, and P5, where P1 to P5 form the trajectory of a touch input that slides quickly up from the bottom of the touch screen. Because the user slides up quickly, the sensors in the edge area of the touch screen detect only a low sensing value and fail to report a point there. The initial touch point P1 actually detected by the sensor is therefore far from the bottom edge of the touch screen, so the desktop display is not triggered. The user operation thus cannot be identified quickly and accurately, and when using gesture navigation the user may have to repeat the operation several times before the intended operation is realized, which greatly affects the user experience.
Based on the application scenario, the operation identification method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Fig. 1 is a flowchart of an operation identification method according to an embodiment of the present application.
As shown in fig. 1, the operation recognition method may include steps 110 to 130. The method is applied to an operation recognition apparatus, as follows:
according to the operation identification method, touch information of touch input is acquired, wherein the touch information comprises touch position information and acquisition time of the touch position information. The touch position information includes abscissa information and ordinate information, such as: the position information of the first touch point may be (400, 1500), and a unit of the abscissa information and the ordinate information is a pixel. The touch input is composed of a plurality of touch points, and the touch information of the touch input is the touch information of the touch points. After the touch information is acquired, position information of a start touch point of the touch input may be acquired from the touch information, and a sliding speed of the touch input may be determined according to the touch position information and an acquisition time of the touch position information. Therefore, whether the user executes the quick sliding action close to the edge of the touch screen can be judged according to whether the touch information meets the preset condition. Under the condition that the touch information meets the preset condition, the touch input is an operation gesture which can trigger a preset function when a user executes, the actual operation intention of the user can be judged according to the touch information, and then additional touch points are supplemented in the touch screen, the operation determined together according to the touch information and the position information of the additional touch points is more accurate, and the operation expected to be realized by the touch input of the user is more met. 
Therefore, identifying the operation corresponding to the touch input according to the touch information and the position information of the additional touch point ensures the accuracy of operation identification, allows the operation matching the user's real intention to be executed accurately, and improves the user experience.
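To make the flow concrete, the three steps can be sketched in Python as follows. This is an illustration only, not the patented implementation: the threshold values, the bottom-edge desktop gesture, and all function names are assumptions.

```python
# Minimal sketch of the flow: acquire touch info, supplement an additional
# touch point when a preset condition holds, then identify the operation.
# All thresholds and names are illustrative assumptions.

EDGE_DISTANCE_THRESHOLD = 50  # px; plays the role of the "first threshold"
SPEED_THRESHOLD = 2.0         # px/ms; plays the role of the "third threshold"

def identify_operation(touch_points, screen_height):
    """touch_points: list of (x, y, t_ms) tuples ordered by acquisition time."""
    if len(touch_points) < 2:
        return "tap"                          # single point: report directly
    x0, y0, t0 = touch_points[0]
    x1, y1, t1 = touch_points[1]
    dist_to_bottom = screen_height - y0       # distance from start to bottom edge
    speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / max(t1 - t0, 1)
    if dist_to_bottom < EDGE_DISTANCE_THRESHOLD and speed > SPEED_THRESHOLD:
        # Preset condition met: supplement an additional touch point on the edge
        extra = (x0, screen_height, t0)
        touch_points = [extra] + touch_points
    # An edge-anchored upward swipe is recognised as the desktop gesture
    if touch_points[0][1] == screen_height and touch_points[-1][1] < touch_points[0][1]:
        return "show_desktop"
    return "slide"
```

With the supplemented edge point, a fast swipe that was first sensed 10 px above the bottom edge is still identified as the desktop gesture, while a slow mid-screen slide is not.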
Step 110, acquiring touch information of the touch input, where the touch information includes touch position information and acquisition time of the touch position information.
Touch screens are broadly classified into infrared type, resistive type, surface acoustic wave type, and capacitive type touch screens.
Taking a capacitive touch screen as an example, the touch information of a touch input detected by the capacitive sensor can be acquired. When the user's finger contacts the touch screen, that is, performs a touch input on it, the finger draws away part of the charge at the corresponding electrode in the touch screen, producing a change in capacitance. From the capacitance change at each electrode in the touch screen, the coordinates of the finger's touch point can be located precisely; the position of the touch input can thus be determined and the corresponding operation executed. Therefore, the touch position information of the touch input and its acquisition time can be determined from the capacitance values acquired by the capacitive sensor.
In a possible embodiment, in a case that a distance between the first touch point and the first edge is not less than a first threshold, an operation corresponding to the touch input is identified according to the first touch point. The first touch point is a starting point of touch input, the touch screen comprises a first edge and a second edge, the second edge is an edge of the touch screen except the first edge, and the touch input can point to the second edge from the first edge.
A distance between the first touch point and the first edge that is not less than the first threshold indicates that the first touch point is relatively far from the edge of the touch screen; that is, the touch input is not an operation gesture for triggering a preset function but is, for example, a tap or a page slide.
Wherein, the operation gesture for triggering the preset function may include: an operation gesture of sliding up from the bottom edge of the touch screen, an operation gesture of sliding up from the bottom edge of the touch screen and staying, and an operation gesture of sliding inward from the left side edge or the right side edge of the touch screen.
When a user performs touch input on the touch screen, the change of the capacitance value is compared with a preset threshold value to detect which point is touched and report the point, so as to determine the touch point.
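The comparison step described above can be sketched as follows; the grid representation of the electrodes and the threshold value are assumptions, since the patent does not specify them.

```python
# Sketch: compare each electrode's capacitance change against a reporting
# threshold and report the strongest cell as the touch point. Values are
# illustrative only.

REPORT_THRESHOLD = 30  # capacitance-change units (assumed value)

def report_touch_point(delta_cap):
    """delta_cap: 2-D list of capacitance changes per electrode cell.
    Returns (row, col) of the touched cell, or None if nothing crosses
    the reporting threshold (the "failure to report a point" case)."""
    best, best_val = None, REPORT_THRESHOLD
    for r, row in enumerate(delta_cap):
        for c, v in enumerate(row):
            if v >= best_val:
                best, best_val = (r, c), v
    return best
```

Returning `None` when every cell is below the threshold models exactly the edge-area failure described in the background: a fast slide leaves too little charge change for the edge cell to be reported.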
Generally, the touch screen is scanned at preset time intervals, so multiple touch images can be obtained, each touch image containing one touch point. Taking the touch input as a sliding input, as shown in FIG. 2, P1 to P5 are the touch points reported after the touch screen sensor detects the finger touch.
In another possible embodiment, the touch points corresponding to the touch input mentioned above only include the first touch point, and after step 110, the following steps may be further included:
and identifying the operation corresponding to the touch input according to the first touch point.
The touch points only include the first touch point, which indicates that the finger performing the touch input leaves the touch screen after the first touch point is detected, and at this time, the operation corresponding to the touch input can be directly identified according to the first touch point.
Specifically, if, after the first touch point of the touch input with Identity Document (ID) 0 is detected, no further touch point with ID 0 is detected, the touch input with ID 0 has left the touch screen, and the first touch point can be reported quickly. The ID is used to distinguish fingers; one finger corresponds to one ID.
Alternatively, after step 110, the following steps may be further included:
determining a sliding distance according to the touch information; and under the condition that the sliding distance is smaller than the preset sliding distance, identifying the operation corresponding to the touch input according to the touch information.
The touch information includes touch position information of the initial touch point and touch position information of the termination touch point, and the sliding distance of the touch input can be determined according to the touch position information of the initial touch point and the touch position information of the termination touch point. If the sliding distance is smaller than the preset sliding distance, it is indicated that the touch input is not an operation gesture for triggering a preset function.
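A small sketch of this sliding-distance check, assuming a Euclidean metric and a hypothetical preset distance value:

```python
import math

PRESET_SLIDE_DISTANCE = 100  # px (hypothetical value)

def slide_distance(start, end):
    """start/end: (x, y) of the initial and terminating touch points."""
    return math.hypot(end[0] - start[0], end[1] - start[1])

def is_gesture_candidate(start, end):
    # Below the preset distance, the input is treated as an ordinary
    # touch rather than a preset-function gesture.
    return slide_distance(start, end) >= PRESET_SLIDE_DISTANCE
```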
In yet another possible embodiment, after step 110, the following steps may be further included:
and under the condition that the sliding speed of the touch input is not greater than the third threshold, identifying the operation corresponding to the touch input according to the touch information.
If the sliding speed of the touch input is not greater than the third threshold, it indicates that the touch input is not a fast sliding motion, and an operation corresponding to the touch input, for example, an operation of sliding a page, may be identified according to the detected touch information.
The touch input may include a first touch point and a second touch point, and the sliding speed of the touch input may be specifically determined according to the difference between the acquisition times of the first touch point and the second touch point and the distance between the first touch point and the second touch point.
If the first touch point and the second touch point are touch points on two consecutively detected frames of touch images, the difference between their acquisition times is fixed. In that case, when the distance between the first and second touch points is not greater than a preset threshold, the operation corresponding to the touch input can be identified according to the touch information (that is, the first and second touch points).
Specifically, as shown in fig. 2, the first touch point is P1 and the second touch point is P2. If the distance between P1 and P2 is not more than 80 pixels, the two points P1 and P2 can be reported directly. That is, no additional touch point needs to be supplemented; the first touch point is the initial touch point of the touch input, and the sliding trajectory is P1 to P5.
Therefore, if the sliding speed of the touch input is not greater than the third threshold, it can be determined that the touch input is not a quick slide, and thus it is determined that the operation intention of the user is not to trigger the preset function. Therefore, under the condition that the sliding speed of the touch input is not larger than the third threshold, the operation corresponding to the touch input can be directly identified according to the touch information, and the accuracy of operation identification is ensured.
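The speed check can be sketched as below. Since the scan interval between touch images is fixed, comparing the sliding speed against the third threshold reduces to comparing the distance between two consecutive touch points against a pixel threshold; the 80-pixel figure comes from the example above, while the frame interval is an assumed value.

```python
import math

FRAME_INTERVAL_MS = 8    # fixed scan interval between touch images (assumed)
DISTANCE_THRESHOLD = 80  # px between consecutive points, per the example

def sliding_speed(p1, p2, dt_ms=FRAME_INTERVAL_MS):
    """p1, p2: (x, y) of two consecutively detected touch points."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1]) / dt_ms

def is_fast_slide(p1, p2):
    # With a fixed acquisition interval, the speed test is equivalent to
    # testing whether the two points are more than 80 px apart.
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1]) > DISTANCE_THRESHOLD
```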
And step 120, determining position information of the additional touch point in the touch screen according to the touch information under the condition that the touch information meets the preset condition.
The touch input may include at least one touch point, so the touch information includes the position information of the touch points. The touch position information and sliding speed of the initial touch point of the touch input can be determined from the touch information, so it can be determined whether the user performed an operation gesture for triggering a preset function. When the touch information satisfies the preset condition, the touch input is an operation gesture that can trigger a preset function when executed by the user; the user's actual operation intention can be determined from the touch information, and an additional touch point is supplemented on the touch screen, so that the operation can subsequently be determined more accurately from the touch information together with the touch position information of the additional touch point.
In another possible embodiment, the determining the position information of the additional touch point in the touch screen according to the touch information when the touch information satisfies the preset condition includes:
under the condition that the distance between the first touch point and the first edge is smaller than a first threshold value, determining position information of an additional touch point in the touch screen according to touch information;
the distance between the additional touch point and the first edge is smaller than the distance between the first touch point and the first edge, and the distance between the additional touch point and the first edge is smaller than a second threshold value.
The touch screen comprises a first edge and a second edge, and the second edge is an edge of the touch screen except the first edge. In the case that the first edge is the upper edge of the touch screen, the second edge may be the lower edge of the touch screen, the left edge of the touch screen, or the right edge of the touch screen, and the same applies to other cases.
The touch input comprises an initial touch point and an end touch point, the touch input points to the second edge from the first edge, the initial touch point is close to the first edge, and the distance between the initial touch point and the first edge is smaller than a first threshold value. The ending touch point is close to the second edge, that is, the distance between the ending touch point and the second edge is smaller than the distance between the ending touch point and any other edge.
When the touch input points from the first side to the second side, if the trajectory of the touch input is a straight line, the intersection points of the trajectory of the touch input and the edge of the touch screen are located on the first side and the second side, respectively. If the track of the touch input is a curve, the intersection points of the straight line obtained by fitting the track and the edge of the touch screen are respectively positioned on the first edge and the second edge.
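One way to realize the fitted-line intersection described above is a least-squares fit; the patent only states that a curved trajectory is fitted to a straight line, so the choice of fitting method here is an assumption.

```python
# Sketch: fit the touch trajectory to a straight line and find where that
# line meets the bottom edge of the screen. Least-squares fitting is an
# assumed choice.

def fit_line(points):
    """Least-squares fit x = a*y + b (x as a function of y, which is
    convenient for intersecting horizontal edges)."""
    n = len(points)
    sy = sum(p[1] for p in points)
    sx = sum(p[0] for p in points)
    syy = sum(p[1] * p[1] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    denom = n * syy - sy * sy
    a = 0.0 if denom == 0 else (n * sxy - sx * sy) / denom
    b = (sx - a * sy) / n
    return a, b

def intersect_bottom_edge(points, screen_height):
    """Return the (x, y) where the fitted trajectory crosses the bottom edge."""
    a, b = fit_line(points)
    return (a * screen_height + b, screen_height)
```

For a straight trajectory the fitted line coincides with it, so the two cases in the paragraph above reduce to the same computation.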
The touch input includes a first touch point, and the first touch point is an initial touch point of the touch input, that is, the first touch point is a touch point at which the touch input is first detected.
If the distance between the first touch point and the first edge is smaller than the first threshold, the initial touch point of the touch operation is close to the edge of the touch screen, and an additional touch point needs to be supplemented. The starting point of the touch input is extended toward the edge of the touch screen: the distance between the additional touch point and the first edge is smaller than the distance between the first touch point and the first edge, and is also smaller than the second threshold. By determining the position information of the additional touch point at the edge of the touch screen, an initial touch point matching the user's actual operation intention is supplemented, increasing the success rate of operation recognition.
Therefore, when the distance between the first touch point and the first edge is smaller than the first threshold, the position information of the additional touch point is determined on the touch screen according to the touch information. The user's actual operation intention can be judged from the touch information, and the additional touch point is supplemented on the touch screen; the operation determined jointly from the touch information and the position information of the additional touch point is more accurate, better matches the operation the user expected, and increases the success rate of operation identification.
The above step of determining, according to the touch information, the position information of the additional touch point in the touch screen when the distance between the first touch point and the first edge is smaller than the first threshold may specifically include the following step:
determining, when the distance between the first touch point and the first edge is smaller than the first threshold and the sliding speed of the touch input is greater than a third threshold, the position information of the additional touch point in the touch screen according to the touch information.
If the distance between the first touch point and the first edge is smaller than the first threshold and the sliding speed of the touch input is greater than the third threshold, the initial touch point of the touch input is close to the edge of the touch screen and the sliding speed is fast. It can therefore be determined that the user's operation intention is to trigger the preset function, and an additional touch point needs to be supplemented, that is, the position information of the additional touch point is determined in the touch screen according to the touch information.
The distance between the additional touch point and the first edge is smaller than the distance between the first touch point and the first edge, that is, the additional touch point is closer to the first edge than the first touch point is. The distance between the additional touch point and the first edge is also smaller than the second threshold, that is, the additional touch point lies close to the edge of the touch screen.
If the sliding speed of the touch input is greater than the third threshold, it can be determined that the user's operation intention is to trigger the preset function. In this case, a point needs to be added in the direction of the first edge: as shown in fig. 2, a point P0 may be supplemented as the initial touch point of the current touch input, where P0 denotes the additional touch point and P1 denotes the first touch point. The distance between P0 and the first edge is smaller than the distance between P1 and the first edge. By determining the position information of the additional touch point in the touch screen, the touch point actually intended by the user can be supplemented, increasing the success rate of operation recognition.
Therefore, when the distance between the first touch point and the first edge is smaller than the first threshold and the sliding speed of the touch input is greater than the third threshold, the actual operation intention of the user can be judged from the touch information and an additional touch point supplemented in the touch screen. Determining the position information of the additional touch point in the touch screen makes it convenient to recognize the operation of the touch input from the touch information together with the position information of the additional touch point, increasing the success rate of operation recognition.
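The edge-distance and sliding-speed checks described above can be sketched as follows for a bottom-to-top swipe. The threshold values, millisecond timestamps, and the 1600 px bottom-edge coordinate are illustrative assumptions, not values fixed by this embodiment:

```python
import math

def supplement_edge_swipe(points, times, edge_y=1600,
                          first_threshold=50, snap_px=1, third_threshold=1.0):
    """points: sampled (x, y) touch points of a bottom-to-top swipe,
    origin at the top-left corner; times: acquisition times in ms.
    Prepends an additional start point P0 when the first touch point P1
    lies near the bottom edge and the sliding speed exceeds the third
    threshold; otherwise returns the points unchanged."""
    (x1, y1), (x2, y2) = points[0], points[1]
    speed = math.hypot(x2 - x1, y2 - y1) / (times[1] - times[0])
    if (edge_y - y1) < first_threshold and speed > third_threshold:
        p0 = (x1, edge_y - snap_px)  # same abscissa as P1, nearly on the edge
        return [p0] + list(points)
    return list(points)
```

A slow or non-edge swipe falls through unchanged, matching the later passage where the operation is recognized from the touch information alone when the sliding speed is not greater than the third threshold.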
The touch input includes a first touch point and a second touch point, and the time difference between the acquisition time of the first touch point and the acquisition time of the second touch point is smaller than a fourth threshold. Before determining the position information of the additional touch point in the touch screen according to the touch information when the distance between the first touch point and the first edge is smaller than the first threshold and the sliding speed of the touch input is greater than the third threshold, the method may further include the following steps:
determining the distance between the first touch point and the second touch point according to the touch information;
and determining the sliding speed of the touch input according to the distance between the first touch point and the second touch point and the time difference.
The sliding speed of the touch input can be determined from the sliding distance and the sliding time: the sliding speed equals the sliding distance divided by the sliding time. When the touch input includes the first touch point and the second touch point, the distance between the first touch point and the second touch point can be determined according to the touch information.
The time difference between the acquisition time of the first touch point and the acquisition time of the second touch point being smaller than the fourth threshold may indicate that the first touch point and the second touch point are touch points on two consecutively detected frames of touch images. The acquisition time difference between the two points is therefore known, and the sliding speed of the touch input can be determined from the distance and the time difference.
Therefore, the sliding speed of touch input can be determined quickly and accurately according to the distance between the first touch point and the second touch point and the time difference value between the acquisition time of the first touch point and the acquisition time of the second touch point.
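The two steps above amount to a distance-over-time computation. A minimal sketch, assuming pixel coordinates and millisecond timestamps (units not specified in this embodiment):

```python
import math

def sliding_speed(p1, p2, t1, t2):
    """Sliding speed of a touch input from two consecutively detected
    touch points (e.g. touch points on two successive frames of touch
    images). Coordinates in px, times in ms -> speed in px/ms."""
    distance = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return distance / (t2 - t1)
```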
Step 130: identifying the operation corresponding to the touch input according to the touch information and the position information of the additional touch point.
Taking the operation gesture of sliding upward from the bottom edge of the touch screen as an example, the additional touch point P0 may be as shown in fig. 2, that is, a point P0 is added below P1, where P1 denotes the first touch point and P0 is located between P1 and the first edge of the touch screen. The touch points after supplementing the additional touch point thus include P0 to P5, and the user's operation is accurately recognized from the touch points P0 to P5 so as to accurately realize the user's operation intention.
It should be noted that the embodiment shown in fig. 2 takes "a touch input sliding upward from the bottom edge of the touch screen" as an example. The preset function may likewise be triggered by a touch input sliding downward from the top edge of the touch screen, a touch input sliding rightward from the left edge of the touch screen, or a touch input sliding leftward from the right edge of the touch screen.
The touch information comprises abscissa information of the touch point and ordinate information of the touch point, and the abscissa information of the additional touch point is consistent with the abscissa information of the first touch point;
or the ordinate information of the additional touch point is consistent with the ordinate information of the first touch point.
Specifically, the abscissa information of the additional touch point coincides with the abscissa information of the first touch point, that is, the abscissa of P0 equals the abscissa of P1.
Usually, taking the upper-left corner of the touch screen as the origin of coordinates and sliding from bottom to top as an example, the distance between the additional touch point and the first edge being smaller than the second threshold means that the absolute difference between the ordinate of the additional touch point P0 and the ordinate of any point on the first edge is smaller than the second threshold.
For example, suppose the first edge is the bottom edge of the touch screen, the second edge is the top edge of the touch screen, the distance between the first edge and the second edge is 1600, and the second threshold is 1. The ordinate of any point on the first edge is then 1600, and the ordinate of P0 may be selected as 1599.
Assuming that the touch position information of the first touch point is (200, 1500), the position information of the additional touch point may be (200, 1599): the abscissa of the additional touch point and that of the first touch point are both 200, and the distance between the additional touch point and the first edge (1600 − 1599 = 1) meets the second-threshold requirement.
The additional touch point for a touch input sliding downward from the top edge of the touch screen, a touch input sliding rightward from the left edge of the touch screen, or a touch input sliding leftward from the right edge of the touch screen can be obtained in the same way.
For example, when the touch input is a touch input that slides to the right from the left edge of the touch screen or a touch input that slides to the left from the right edge of the touch screen, the ordinate information of the additional touch point matches the ordinate information of the first touch point.
Therefore, the abscissa information of the first touch point is used as the abscissa information of the additional touch point, and the touch position information of the additional touch point can be determined quickly and effectively.
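The coordinate rules for all four swipe directions can be summarized in one helper. The 720×1600 screen size and the 1 px snap distance mirror the worked example above and are assumptions for illustration:

```python
def additional_point(p1, direction, width=720, height=1600, snap=1):
    """Position of the additional touch point for each edge-swipe
    direction, with the origin at the top-left corner of the screen.
    p1 is the (x, y) of the first touch point P1."""
    x1, y1 = p1
    if direction == "up":      # swipe up from the bottom edge: keep abscissa
        return (x1, height - snap)
    if direction == "down":    # swipe down from the top edge: keep abscissa
        return (x1, snap)
    if direction == "right":   # swipe right from the left edge: keep ordinate
        return (snap, y1)
    if direction == "left":    # swipe left from the right edge: keep ordinate
        return (width - snap, y1)
    raise ValueError(f"unknown direction: {direction}")

# e.g. additional_point((200, 1500), "up") -> (200, 1599), as in the text
```

Vertical swipes preserve the abscissa of P1 and horizontal swipes preserve its ordinate, exactly as the two cases above describe.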
To sum up, in the embodiment of the present application, by obtaining the touch information of a touch input, when the touch information meets the preset condition it is known that the touch input is an operation gesture with which the user executes a preset function. The position information of an additional touch point is then determined in the touch screen according to the touch information: the actual operation intention of the user is judged from the touch information, and the additional touch point is supplemented in the touch screen.
In the operation identification method provided by the embodiment of the present application, the execution subject may be an operation identification device. In the embodiment of the present application, an operation identification device executing the operation identification method is taken as an example to describe the operation identification device provided in the embodiment of the present application.
Fig. 3 is a block diagram of an operation recognition device according to an embodiment of the present application, where the operation recognition device 300 includes:
the obtaining module 310 is configured to obtain touch information of the touch input, where the touch information includes touch position information and obtaining time of the touch position information.
The determining module 320 is configured to determine, according to the touch information, position information of an additional touch point in the touch screen when the touch information meets a preset condition.
The identifying module 330 is configured to identify an operation corresponding to the touch input according to the touch information and the position information of the additional touch point.
In this way, in the embodiment of the present application, by acquiring the touch information of the touch input, when the touch information meets the preset condition it is known that the touch input is an operation gesture with which the user triggers a preset function. The position information of the additional touch point is then determined in the touch screen according to the touch information, the actual operation intention of the user is judged from the touch information, and the additional touch point is supplemented in the touch screen. The operation determined jointly from the touch information and the position information of the additional touch point is more accurate and better matches the operation the user intends the touch input to achieve, so the accuracy of operation recognition can be improved.
Optionally, the touch screen includes a first edge and a second edge, the second edge is an edge of the touch screen other than the first edge, the touch input points to the second edge from the first edge, the touch input includes a first touch point, and the first touch point is an initial touch point of the touch input, and the determining module 320 is specifically configured to:
under the condition that the distance between the first touch point and the first edge is smaller than a first threshold value, determining position information of an additional touch point in the touch screen according to touch information;
the distance between the additional touch point and the first edge is smaller than the distance between the first touch point and the first edge, and the distance between the additional touch point and the first edge is smaller than a second threshold value.
Optionally, the determining module 320 is specifically configured to:
determining, when the distance between the first touch point and the first edge is smaller than the first threshold and the sliding speed of the touch input is greater than a third threshold, the position information of the additional touch point in the touch screen according to the touch information.
Optionally, the touch input includes a first touch point and a second touch point, and a time difference between the acquisition time of the first touch point and the acquisition time of the second touch point is smaller than a fourth threshold, and the determining module 320 is further configured to:
determining the distance between the first touch point and the second touch point according to the touch information;
and determining the sliding speed of the touch input according to the distance between the first touch point and the second touch point and the time difference.
Optionally, the touch information includes abscissa information of the touch point and ordinate information of the touch point, and the abscissa information of the additional touch point is consistent with the abscissa information of the first touch point; or the ordinate information of the additional touch point is consistent with the ordinate information of the first touch point.
Optionally, the identifying module 330 is further configured to:
and under the condition that the sliding speed of the touch input is not greater than the third threshold, identifying the operation corresponding to the touch input according to the touch information.
The operation identification device in the embodiment of the present application may be an electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. The electronic device may be, for example, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a mobile internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); it may also be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The operation recognition device of the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, and the embodiment of the present application is not particularly limited.
The operation identification device provided in the embodiment of the present application can implement each process implemented by the above method embodiment, and is not described here again to avoid repetition.
Optionally, as shown in fig. 4, an electronic device 410 is further provided in this embodiment of the present application, and includes a processor 411, a memory 412, and a program or an instruction stored in the memory 412 and capable of being executed on the processor 411, where the program or the instruction is executed by the processor 411 to implement each step of any one of the above operation identification method embodiments, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic device according to the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 5 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and the like.
Those skilled in the art will appreciate that the electronic device 500 may further comprise a power supply (e.g., a battery) for supplying power to various components, and the power supply may be logically connected to the processor 510 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system. The electronic device structure shown in fig. 5 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
The processor 510 is configured to obtain touch information of the touch input, where the touch information includes touch position information and obtaining time of the touch position information.
The processor 510 is further configured to determine, according to the touch information, position information of an additional touch point in the touch screen when the touch information meets a preset condition.
The processor 510 is further configured to identify an operation corresponding to the touch input according to the touch information and the position information of the additional touch point.
In this way, in the embodiment of the present application, by acquiring the touch information of the touch input, when the touch information meets the preset condition it is known that the touch input is an operation gesture with which the user triggers a preset function. The position information of the additional touch point is then determined in the touch screen according to the touch information, the actual operation intention of the user is judged from the touch information, and the additional touch point is supplemented in the touch screen. The operation determined jointly from the touch information and the position information of the additional touch point is more accurate and better matches the operation the user intends the touch input to achieve, so the accuracy of operation recognition can be improved.
Optionally, the processor 510 is further configured to determine, according to the touch information, position information of an additional touch point in the touch screen when a distance between the first touch point and the first edge is smaller than a first threshold;
the distance between the additional touch point and the first edge is smaller than the distance between the first touch point and the first edge, and the distance between the additional touch point and the first edge is smaller than a second threshold value.
Optionally, the processor 510 is further configured to determine, according to the touch information, position information of the additional touch point in the touch screen when a distance between the first touch point and the first edge is smaller than a first threshold and a sliding speed of the touch input is greater than a third threshold.
Optionally, the touch input includes a first touch point and a second touch point, a time difference between an acquisition time of the first touch point and an acquisition time of the second touch point is smaller than a fourth threshold, and the processor 510 is further configured to determine a distance between the first touch point and the second touch point according to the touch information when a distance between the first touch point and the first edge is smaller than the first threshold and a sliding speed of the touch input is greater than the third threshold;
and determine the sliding speed of the touch input according to the distance between the first touch point and the second touch point and the time difference.
Optionally, the processor 510 is further configured to identify, according to the touch information, an operation corresponding to the touch input, when the sliding speed of the touch input is not greater than the third threshold.
In summary, by acquiring the touch information of the touch input, when the touch information meets the preset condition it is known that the touch input is an operation gesture with which the user triggers a preset function. The position information of the additional touch point is then determined in the touch screen according to the touch information, the actual operation intention of the user is judged from the touch information, and the additional touch point is supplemented in the touch screen. The operation determined jointly from the touch information and the position information of the additional touch point is more accurate and better matches the operation the user intends the touch input to achieve, so the accuracy of operation recognition can be improved.
It should be understood that, in the embodiment of the present application, the input unit 504 may include a graphics processing unit (GPU) 5041 and a microphone 5042, and the graphics processor 5041 processes image data of a still picture or a video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 507 includes at least one of a touch panel 5071 and other input devices 5072. The touch panel 5071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. Other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in further detail here. The memory 509 may be used to store software programs as well as various data, including but not limited to applications and the operating system. The processor 510 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor may also not be integrated into the processor 510.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, an application program or instructions required for at least one function (such as a sound playing function, an image playing function, etc.), and the like. Further, the memory 509 may include volatile memory or non-volatile memory, or the memory 509 may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 509 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 510 may include one or more processing units; optionally, the processor 510 integrates an application processor, which mainly handles operations related to the operating system, user interface, and applications, and a modem processor, which mainly handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above operation identification method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a computer read only memory ROM, a random access memory RAM, a magnetic or optical disk, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above operation identification method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
Embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes in the foregoing operation identification method embodiments, and can achieve the same technical effects, and in order to avoid repetition, details are not described here again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application or portions thereof that contribute to the prior art may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (which may be a mobile phone, a computer, a server, or a network device, etc.) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (14)

1. An operation identification method, characterized in that the method comprises:
acquiring touch information of touch input, wherein the touch information comprises touch position information and acquisition time of the touch position information;
determining position information of an additional touch point in a touch screen according to the touch information under the condition that the touch information meets a preset condition;
and identifying the operation corresponding to the touch input according to the touch information and the position information of the additional touch point.
2. The method according to claim 1, wherein the touch screen includes a first edge and a second edge, the second edge is an edge of the touch screen other than the first edge, the touch input is directed from the first edge to the second edge, the touch input includes a first touch point, the first touch point is a start touch point of the touch input, and determining position information of an additional touch point in the touch screen according to the touch information when the touch information satisfies a preset condition comprises:
under the condition that the distance between the first touch point and the first edge is smaller than a first threshold value, determining position information of an additional touch point in the touch screen according to the touch information;
wherein a distance between the additional touch point and the first edge is smaller than a distance between the first touch point and the first edge, and a distance between the additional touch point and the first edge is smaller than a second threshold.
3. The method according to claim 2, wherein determining position information of additional touch points in the touch screen according to the touch information in a case that a distance between the first touch point and the first edge is smaller than a first threshold value comprises:
and under the condition that the distance between the first touch point and the first edge is smaller than the first threshold value and the sliding speed of the touch input is larger than a third threshold value, determining the position information of an additional touch point in the touch screen according to the touch information.
4. The method according to claim 3, wherein the touch input includes the first touch point and a second touch point, a time difference between an acquisition time of the first touch point and an acquisition time of the second touch point is smaller than a fourth threshold, and in a case that a distance between the first touch point and the first edge is smaller than the first threshold and a sliding speed of the touch input is greater than a third threshold, before determining position information of an additional touch point in the touch screen according to the touch information, the method further comprises:
determining the distance between the first touch point and the second touch point according to the touch information;
and determining the sliding speed of the touch input according to the distance between the first touch point and the second touch point and the time difference value.
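The sliding-speed computation of claims 3 and 4 can be sketched as follows; the `TouchPoint` structure and its fields are illustrative assumptions, not names from the patent. The speed is simply the straight-line distance between the two sampled touch points divided by the difference of their acquisition times.

```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    x: float  # abscissa of the touch point, e.g. in pixels
    y: float  # ordinate of the touch point
    t: float  # acquisition time of the touch point, in seconds

def sliding_speed(p1: TouchPoint, p2: TouchPoint) -> float:
    """Sliding speed of the touch input from two touch points whose
    acquisition-time difference is below the fourth threshold (claim 4)."""
    # Distance between the first and second touch points (claim 4, step 1).
    dist = ((p2.x - p1.x) ** 2 + (p2.y - p1.y) ** 2) ** 0.5
    # Time difference between the two acquisition times (claim 4, step 2).
    dt = abs(p2.t - p1.t)
    return dist / dt if dt > 0 else 0.0
```

In claim 3 this speed would then be compared against the third threshold before any additional touch point is generated.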
5. The method of claim 2, wherein the touch information comprises abscissa information of the touch point and ordinate information of the touch point, and the abscissa information of the additional touch point is consistent with the abscissa information of the first touch point;
or the ordinate information of the additional touch point is consistent with the ordinate information of the first touch point.
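Claims 2 and 5 together constrain where the additional touch point may be placed: closer to the first edge than the first touch point, within the second threshold of that edge, and sharing one coordinate with the first touch point. A minimal sketch, assuming the first edge is the left edge of the screen at x = 0 (so the distance to the edge is just the x coordinate):

```python
def additional_touch_point(first_xy, second_threshold):
    """Place an additional touch point between the first touch point and
    the first edge (claim 2), keeping the ordinate of the first touch
    point unchanged (claim 5). Assumes the first edge is x = 0."""
    fx, fy = first_xy
    # Any x strictly below both the first point's x and the second
    # threshold satisfies claim 2; halving the smaller bound is one
    # arbitrary choice among many valid placements.
    new_x = min(fx, second_threshold) / 2.0
    return (new_x, fy)
```

For a first edge that is the top edge instead, the same construction would apply to the y coordinate, with the abscissa held fixed as in the first branch of claim 5.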
6. The method according to claim 1 or 2, wherein after the obtaining of the touch information of the touch input, the method further comprises:
under the condition that the sliding speed of the touch input is not greater than a third threshold, identifying the operation corresponding to the touch input according to the touch information.
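The method claims 1–6 taken together describe one decision flow: if the touch input starts close enough to the first edge and slides fast enough, an additional touch point is generated nearer the edge before recognition; otherwise the raw touch information is recognized directly. A self-contained sketch, again assuming the first edge is the left edge at x = 0 (all parameter names are illustrative):

```python
def preprocess_touch(points, times, first_threshold, second_threshold,
                     third_threshold):
    """Sketch of the overall flow of claims 1-6: optionally prepend an
    additional touch point before handing the list to the recognizer."""
    (x1, y1), (x2, y2) = points[0], points[1]
    # Sliding speed from the first two sampled points (claim 4).
    dt = times[1] - times[0]
    speed = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5 / dt
    # Predetermined condition of claims 2-3: start near the edge, fast slide.
    if x1 < first_threshold and speed > third_threshold:
        extra = (min(x1, second_threshold) / 2.0, y1)  # claims 2 and 5
        points = [extra] + points
    return points  # recognized as a gesture downstream (claim 1 / claim 6)
```

The effect is that a fast swipe whose first reported point falls slightly short of the screen edge is still recognized as an edge gesture, because the recognizer sees a synthetic point near the edge.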
7. An operation recognition apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring touch information of touch input; the touch information comprises touch position information and acquisition time of the touch position information;
the determining module is used for determining the position information of the additional touch point in the touch screen according to the touch information under the condition that the touch information meets the preset condition;
and the identification module is used for identifying the operation corresponding to the touch input according to the touch information and the position information of the additional touch point.
8. The apparatus of claim 7, wherein the touch screen comprises a first edge and a second edge, the second edge is an edge of the touch screen other than the first edge, the touch input is directed from the first edge to the second edge, the touch input comprises a first touch point, the first touch point is a starting touch point of the touch input, and the determining module is specifically configured to:
under the condition that the distance between the first touch point and the first edge is smaller than a first threshold value, determining position information of an additional touch point in the touch screen according to the touch information;
wherein a distance between the additional touch point and the first edge is smaller than a distance between the first touch point and the first edge, and a distance between the additional touch point and the first edge is smaller than a second threshold.
9. The apparatus of claim 8, wherein the determining module is specifically configured to:
under the condition that the distance between the first touch point and the first edge is smaller than the first threshold and the sliding speed of the touch input is greater than a third threshold, determining the position information of an additional touch point in the touch screen according to the touch information.
10. The apparatus of claim 9, wherein the touch input comprises the first touch point and a second touch point, a time difference between an acquisition time of the first touch point and an acquisition time of the second touch point is smaller than a fourth threshold, and the determining module is further configured to:
determining the distance between the first touch point and the second touch point according to the touch information;
and determining the sliding speed of the touch input according to the distance between the first touch point and the second touch point and the time difference value.
11. The apparatus of claim 8, wherein the touch information comprises abscissa information of the touch point and ordinate information of the touch point, and the abscissa information of the additional touch point is consistent with the abscissa information of the first touch point;
or the ordinate information of the additional touch point is consistent with the ordinate information of the first touch point.
12. The apparatus of claim 7 or 8, wherein the identification module is further configured to:
under the condition that the sliding speed of the touch input is not greater than a third threshold, identifying the operation corresponding to the touch input according to the touch information.
13. An electronic device comprising a processor and a memory, the memory storing a program or instructions executable on the processor, the program or instructions when executed by the processor implementing the steps of the operation identification method of any one of claims 1 to 6.
14. A readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the operation identification method according to any one of claims 1 to 6.
CN202210467561.7A 2022-04-29 2022-04-29 Operation identification method and device, electronic equipment and readable storage medium Pending CN114816213A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210467561.7A CN114816213A (en) 2022-04-29 2022-04-29 Operation identification method and device, electronic equipment and readable storage medium


Publications (1)

Publication Number Publication Date
CN114816213A true CN114816213A (en) 2022-07-29

Family

ID=82510465

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210467561.7A Pending CN114816213A (en) 2022-04-29 2022-04-29 Operation identification method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN114816213A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104731514A (en) * 2015-04-09 2015-06-24 努比亚技术有限公司 Method and device for recognizing single-hand-holding touch operation in touch operation area
CN106201309A (en) * 2016-06-29 2016-12-07 维沃移动通信有限公司 A kind of status bar processing method and mobile terminal


Similar Documents

Publication Publication Date Title
US11592980B2 (en) Techniques for image-based search using touch controls
US9189152B2 (en) Touch device and method for dynamically setting touch inactive area, and non-transitory recording medium
US9025878B2 (en) Electronic apparatus and handwritten document processing method
CN106572207B (en) Device and method for identifying single-hand mode of terminal
CN114327726A (en) Display control method, display control device, electronic equipment and storage medium
CN107577404B (en) Information processing method and device and electronic equipment
CN104750401A (en) Touch method and related device as well as terminal equipment
CN114089868A (en) Touch operation method and device and electronic equipment
CN106201078B (en) Track completion method and terminal
CN108021313B (en) Picture browsing method and terminal
CN114816213A (en) Operation identification method and device, electronic equipment and readable storage medium
CN114115639A (en) Interface control method and device, electronic equipment and storage medium
CN114518859A (en) Display control method, display control device, electronic equipment and storage medium
CN113791725A (en) Touch pen operation identification method, intelligent terminal and computer readable storage medium
CN108595091B (en) Screen control display method and device and computer readable storage medium
CN110568989A (en) service processing method, service processing device, terminal and medium
CN106502515B (en) Picture input method and mobile terminal
KR20160109238A (en) Method of browsing digital content using gesture and computing device operating with the method
CN111324273A (en) Media display implementation method and device
CN104007886A (en) Information processing method and electronic device
JP2018185569A (en) Information processing apparatus, display control program and display control method
CN117555458A (en) Icon moving method and device, electronic equipment and readable storage medium
CN115480664A (en) Touch response method and device, electronic equipment and storage medium
CN114415929A (en) Control method and device of electronic equipment, electronic equipment and readable storage medium
CN115358251A (en) Graphic code identification method, graphic code identification device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination