CN113138693B - Operation identification method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113138693B
CN113138693B (application CN202110396825.XA)
Authority
CN
China
Prior art keywords
touch
area
divided areas
parameters
divided
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110396825.XA
Other languages
Chinese (zh)
Other versions
CN113138693A (en)
Inventor
蒋飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110396825.XA
Publication of CN113138693A
Application granted
Publication of CN113138693B
Legal status: Active
Anticipated expiration: (date not listed)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses an operation identification method and device, electronic equipment and a storage medium. The method includes the following steps: if a touch is detected, acquiring a touch position of the touch; acquiring a touch area corresponding to the touch position; and identifying the touch category corresponding to the touch based on the touch parameter corresponding to the touch area. By this method, different touch areas use different touch parameters to identify touch operations, so that different touch requirements in different touch areas can be met, and at the same time the accuracy of identifying touch operations can be improved.

Description

Operation identification method and device, electronic equipment and storage medium
Technical Field
The application belongs to the technical field of terminals, and particularly relates to an operation identification method and device, an electronic device and a storage medium.
Background
With the increasing popularity of electronic devices, and in particular the rapid iteration of intelligent electronic products such as mobile phones and notebook computers of various sizes, user demand for these products keeps growing. However, the entire touch area of such a product is generally controlled with the same set of control parameters, so the touch area cannot respond well to some touch operations.
Disclosure of Invention
In view of the above problems, the present application provides an operation recognition method, apparatus, electronic device, and storage medium to achieve an improvement of the above problems.
In a first aspect, an embodiment of the present application provides an operation identification method, where the method includes: if touch control is detected, acquiring a touch control position of the touch control; acquiring a touch area corresponding to the touch position; and identifying the touch category corresponding to the touch based on the touch parameter corresponding to the touch area.
In a second aspect, an embodiment of the present application provides an operation identification apparatus, including: a position acquisition unit, configured to acquire a touch position of the touch if the touch is detected; the area acquisition unit is used for acquiring a touch area corresponding to the touch position; and the identification unit is used for identifying the touch category corresponding to the touch based on the touch parameter corresponding to the touch area.
In a third aspect, an embodiment of the present application provides an electronic device, including one or more processors and a memory; one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the methods described above.
In a fourth aspect, the present application provides a computer-readable storage medium, in which a program code is stored, wherein the program code performs the above-mentioned method when running.
The embodiment of the application provides an operation identification method and device, electronic equipment and a storage medium. Firstly, if touch is detected, acquiring a touch position of the touch, then acquiring a touch area corresponding to the touch position, and identifying a touch category corresponding to the touch based on a touch parameter corresponding to the touch area. By the method, different touch areas use different touch parameters to identify the touch operation, different touch requirements in different touch areas can be met, and meanwhile, the accuracy of identifying the touch operation can be improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of a game scenario proposed in an embodiment of the present application;
FIG. 2 is a flow chart illustrating an operation recognition method according to an embodiment of the present application;
FIG. 3 is a flow chart illustrating a method of operation identification according to another embodiment of the present application;
fig. 4 is a flowchart illustrating a method for acquiring a touch position in an operation recognition method according to another embodiment of the present application;
fig. 5 is a schematic diagram illustrating obtaining a touch position in an operation recognition method according to another embodiment of the present application;
fig. 6 is a schematic diagram illustrating a filtering process performed in an operation recognition method according to another embodiment of the present application;
FIG. 7 is a flow chart illustrating a method of operation identification according to yet another embodiment of the present application;
FIG. 8 is a flow chart illustrating a method of operation identification according to yet another embodiment of the present application;
fig. 9 is a flowchart illustrating a configuration of touch parameters according to yet another embodiment of the present application;
fig. 10 is a schematic diagram illustrating a region division scenario according to yet another embodiment of the present application;
fig. 11 is a schematic view illustrating a scene of a display screen rotation according to still another embodiment of the present application;
fig. 12 is a block diagram illustrating a structure of an operation recognition apparatus according to an embodiment of the present application;
fig. 13 is a block diagram showing another structure of an operation recognition apparatus according to an embodiment of the present application;
fig. 14 is a block diagram showing a structure of still another operation recognition apparatus according to an embodiment of the present application;
FIG. 15 is a block diagram of an electronic device for executing an operation recognition method according to an embodiment of the present application;
fig. 16 illustrates a storage unit for storing or carrying program code implementing an operation recognition method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
With the development of society, mobile phones have become indispensable to people, and mobile phone games have become a first choice for entertainment. A typical mobile game executes the corresponding game operation based on touch operations that the user performs on the touch screen of the electronic device.
However, the inventor found, in research on related operation recognition methods, that when the entire touch area of an electronic device is controlled, the same control parameters are usually used throughout, so the touch area may not respond well to some touch operations. When a user performs touch operations on the touch screen to execute game operations, in some scenarios (e.g., a game, as shown in fig. 1) the functions of different touch areas differ significantly: for example, areas A and B in fig. 1 are operated mainly by sliding, while areas C and D are operated mainly by clicking. In this case, the user expects click responses to be fast and accurate, long slides to be smooth, and short slides to be sensitive. When the entire touch area of the electronic device is controlled with the same touch parameters, these expectations cannot all be met: if the touch parameters are biased toward click operations, sliding operations may not be responded to well; if the touch parameters are biased toward sliding operations, the response to click operations may be poor.
Therefore, the inventor proposes an operation identification method, an apparatus, an electronic device, and a storage medium in the present application, in which a touch position of touch is obtained first if touch is detected, then a touch area corresponding to the touch position is obtained, a touch category corresponding to the touch is identified based on a touch parameter corresponding to the touch area, and different touch areas use different touch parameters to identify touch operation, so that different touch requirements in different touch areas can be met, and meanwhile, accuracy of identifying the touch operation can be improved.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Referring to fig. 2, an operation identification method provided in the embodiment of the present application includes:
step S110: and if the touch control is detected, acquiring the touch control position of the touch control.
In this embodiment of the present application, when the electronic device detects that a certain area is touched, the touch position of the touch is obtained. The touch position may include all touch positions of the touch, and may include a start position and an end position; in this embodiment, the start position and the end position may be represented by the position coordinates of a touch point. The start position of a touch can be understood as the position coordinates of the contact point that first touches the touch area of the display screen when the user performs the touch; the end position can be understood as the position coordinates of the contact point that is in contact with the touch area when the user releases it. For example, if the touch is a slide, the touch position of the slide may be the start point and end point of the sliding track: the start point is the position coordinate of the contact point that first touches the display screen, and the end point is the position coordinate of the contact point in contact with the display screen when the user releases it.
As a way of doing this, corresponding sensor devices, such as pressure sensors, acceleration sensors, gyroscopes, etc., may be integrated inside the electronic device. Then, whether touch is detected or not can be determined through a sensor arranged in the electronic device, and meanwhile, a touch position of the touch can be acquired through the sensor arranged in the electronic device. Alternatively, whether touch is detected may be determined by a pressure sensor provided inside the electronic device. Specifically, a pressure threshold may be preset, and when it is detected that a pressure value of a touch performed by a user in a touch area of the display screen exceeds the preset pressure threshold, it is determined that the touch is detected. After it is determined that touch is detected, a touch position of the touch may be obtained through a sensor provided inside the electronic device, and the touch position may include a start position and an end position.
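The pressure-threshold detection described above can be sketched as follows. This is an illustrative sketch only: the threshold value and the function names (`PRESSURE_THRESHOLD`, `detect_touch`) are assumptions, not part of the patent.

```python
# Hypothetical sketch of step S110: a touch is considered detected when the
# pressure value reported by the pressure sensor exceeds a preset threshold.

PRESSURE_THRESHOLD = 0.3  # assumed preset threshold, in arbitrary sensor units

def detect_touch(pressure, position):
    """Return the touch position (x, y) if pressure exceeds the threshold, else None."""
    if pressure > PRESSURE_THRESHOLD:
        return position  # touch detected: report the contact point coordinates
    return None
```

In a real device the pressure and position would come from the built-in sensors rather than being passed in directly.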
Step S120: and acquiring a touch area corresponding to the touch position.
In this embodiment of the present application, the touch area of the display screen may be divided into a plurality of areas in advance. After the division, each area may be represented by its contour information, where the contour information may be the lengths of the contour lines of the area or the coordinates of the intersections of those contour lines. For example, if the touch area of the display screen is divided into a plurality of rectangular areas, each rectangular area may be represented by its length and width, by its vertex coordinates, or by a combination of the two representations.
As one manner, after the touch position corresponding to the touch is obtained as described above, the touch area corresponding to the touch position may be determined from the position coordinates of the touch position. The touch area corresponding to the touch position may be one or more of the plurality of touch areas of the display screen. Specifically, whether the position coordinates of the touch position fall in a certain touch area may be detected to determine the corresponding touch area; if the touch position falls entirely within one of the touch areas, that area is the touch area corresponding to the touch position.
Optionally, there may be a plurality of touch areas corresponding to touch positions, and if the position coordinate of the start position of touch is in one touch area and the position coordinate of the end position of touch is in the other touch area, the two touch areas are the touch areas corresponding to the touch positions of touch. For example, if the position coordinate of the start position of the touch is in the area a and the position coordinate of the end position of the touch is in the area B, then both the area a and the area B are the touch areas corresponding to the touch position of the touch.
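The area lookup described in the two paragraphs above can be sketched as a point-in-rectangle test over the pre-divided areas. The region names and rectangle coordinates below are invented for illustration; each area is represented here by `(x, y, width, height)`, one of the contour representations the embodiment mentions.

```python
# Illustrative sketch of step S120: map a touch position (or a start/end
# pair) to the divided area(s) it falls in.

REGIONS = {
    "A": (0, 0, 500, 500),    # hypothetical divided area A
    "B": (500, 0, 500, 500),  # hypothetical divided area B
}

def region_of(point):
    """Return the name of the divided area containing the point, or None."""
    x, y = point
    for name, (rx, ry, w, h) in REGIONS.items():
        if rx <= x < rx + w and ry <= y < ry + h:
            return name
    return None

def regions_for_touch(start, end):
    """A touch maps to one area, or to two if start and end fall in different areas."""
    return {r for r in (region_of(start), region_of(end)) if r is not None}
```

For example, a slide starting in area A and ending in area B maps to both areas, matching the A/B example in the text.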
Step S130: and identifying the operation category corresponding to the touch based on the touch parameter corresponding to the touch area.
In the embodiment of the application, the touch parameter may be set by a manufacturer when the electronic device leaves a factory, or may be set by a user in a self-defined manner. After the touch area of the display screen is divided into a plurality of touch areas, different touch parameters can be set for the different touch areas in advance, and then after the touch area corresponding to the touch position of touch is obtained, the touch parameter corresponding to the touch area can be directly obtained, so that the touch category corresponding to touch can be identified based on the corresponding touch parameter, and the identified touch category corresponding to touch can be press operation, slide operation, click operation and the like of a user on the touch area of the display screen.
Specifically, when different touch parameters are set for different touch areas, the touch parameters may be set for the main touch corresponding to the area, for example, if the main touch corresponding to the area a is a click operation, the touch parameters corresponding to the area a may be set as the touch parameters suitable for responding to the click operation. Further, when the touch area corresponding to the touch position corresponding to the current touch is acquired as the area a, the acquired corresponding touch parameter is a touch parameter suitable for responding to the click operation, and whether the touch category of the current touch is the click operation can be identified based on the corresponding touch parameter.
By the method, the corresponding relation among the touch area, the touch parameters and the touch can be established, and further the touch corresponding to the touch area and the touch parameters can be determined through the corresponding relation among the touch area, the touch parameters and the touch.
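The correspondence among touch area, touch parameters, and touch category can be sketched with a simple lookup table. The parameter values and the classification rule below are assumptions for illustration only; the patent does not fix a concrete rule here.

```python
# Hypothetical sketch of step S130: each divided area carries its own touch
# parameters, and the touch category is identified using those parameters.

TOUCH_PARAMS = {
    "A": {"lock_point_distance": 20, "bias": "slide"},  # slide-oriented area
    "C": {"lock_point_distance": 40, "bias": "click"},  # click-oriented area
}

def classify(region, distance_moved):
    """Classify a touch in the given region by its total movement distance.

    A movement shorter than the region's lock-point distance is treated as a
    click; otherwise as a slide (one plausible reading of the description).
    """
    params = TOUCH_PARAMS[region]
    return "click" if distance_moved < params["lock_point_distance"] else "slide"
```

Because area C uses a larger lock-point distance, the same 30-pixel movement is identified as a click in C but as a slide in A, illustrating how per-area parameters change the recognition result.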
According to the operation identification method, first, if a touch is detected, the touch position of the touch is obtained; then the touch area corresponding to the touch position is obtained, and the touch category corresponding to the touch is identified based on the touch parameter corresponding to the touch area. By this method, different touch areas use different touch parameters to identify touch operations, so that different touch requirements in different touch areas can be met, and at the same time the accuracy of identifying touch operations can be improved.
Referring to fig. 3, an operation identification method provided in the embodiment of the present application includes:
step S210: when the occurrence of a specified event is detected, a plurality of divided areas of the display screen corresponding to the specified event are acquired.
In the embodiment of the application, the touch area of the display screen can be divided into a plurality of corresponding divided areas in advance according to different specified events, the corresponding relation between different specified events and the plurality of divided areas of the display screen is established, and then when the specified event is detected to occur, the plurality of divided areas of the display screen corresponding to the specified event can be directly obtained. The corresponding relationship may be stored in the cloud server, or may be stored in a local storage area of the electronic device.
As one mode, the specified event may be a preset touch, or may be the starting of a particular game.
Specifically, when a certain game is detected to be started or a preset touch is detected, a plurality of divided areas of the display screen corresponding to the game or the touch may be acquired from a local storage area of the cloud server or the electronic device. For example, for game a, the touch area of the display screen may be preset to be divided into area 1, area 2 and area 3; for game B, the touch area of the display screen may be divided into an area a and an area B in advance. Establishing a corresponding relation between the game A and the area 1, the area 2 and the area 3, establishing a corresponding relation between the game B and the area a and the area B, and storing the corresponding relation in a local storage area of the electronic equipment or a cloud server, and further, when the game A is detected to be started, directly searching the area 1, the area 2 and the area 3 corresponding to the game A from the local storage area of the electronic equipment or the cloud server.
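The game-to-areas correspondence in the example above can be sketched as a simple mapping. The keys and area names mirror the games A and B in the text; in practice, as the text notes, the lookup could go to a local storage area or to a cloud server rather than an in-memory dict.

```python
# Hypothetical sketch of step S210: map a specified event (e.g. a game being
# started) to the pre-divided areas of the display screen for that event.

EVENT_REGIONS = {
    "game_A": ["area_1", "area_2", "area_3"],
    "game_B": ["area_a", "area_b"],
}

def regions_for_event(event):
    """Return the divided areas registered for this event, or an empty list."""
    return EVENT_REGIONS.get(event, [])
```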
Step S220: and acquiring touch parameters corresponding to the plurality of divided areas.
In the embodiment of the application, different touch parameters may be preset for a plurality of divided areas of the display screen corresponding to the specified event, and a corresponding relationship among the specified event, the plurality of divided areas, and the touch parameters may be established and stored. Illustratively, as described above, the plurality of divided areas of the display screen corresponding to the game a include the area 1, the area 2, and the area 3, the touch parameter of the area 1 may be set as the parameter 1 in advance, the touch parameter of the area 2 may be set as the parameter 2, the touch parameter of the area 3 may be set as the parameter 3, a correspondence relationship between the game a, the area 1, the area 2, the area 3, the parameter 1, the parameter 2, and the parameter 3 may be established, and the correspondence relationship may be stored in a local storage area of the electronic device or in a cloud server. When the plurality of divided areas corresponding to the game a including the area 1, the area 2, and the area 3 are obtained, the touch parameter corresponding to the area 1 may be the parameter 1, the touch parameter corresponding to the area 2 may be the parameter 2, and the touch parameter corresponding to the area 3 may be the parameter 3, which are searched from a local storage area of the electronic device or a cloud server. 
If the correspondence is stored in the cloud server, then after the starting of game A is detected, the cloud server can be queried, according to the application name of game A, for the corresponding configuration. Specifically, the electronic device can send a query request to the cloud server; after receiving the query request, the cloud server returns the corresponding divided areas and the touch parameters corresponding to those divided areas to the electronic device. After receiving this information, the electronic device stores it and updates the touch parameters of the divided areas corresponding to game A to the newly acquired touch parameters.
Step S230: and updating the touch parameters of the plurality of divided areas into the touch parameters corresponding to the divided areas.
As one mode, after the touch parameter corresponding to each of the plurality of divided areas is acquired, the touch parameter of each of the plurality of divided areas is updated to the touch parameter corresponding to each of the plurality of divided areas.
Step S240: and if the touch control is detected, acquiring the touch control position of the touch control.
As one mode, as shown in fig. 4, the acquiring a touch position of the touch if the touch is detected includes:
step S241: and acquiring the distance between the touch position and the touch position of the previous touch.
Specifically, in the embodiment of the present application, in order to avoid small touch jitters being recognized as movement, when a touch is detected, the distance between the touch position of the touch and the touch position of the previous touch is obtained, and the distance is compared with a preset lock point distance to determine the specific category of the touch.
The lock point distance is a preset distance between two touch points which can distinguish click operation and sliding operation, and the setting of the lock point distance can affect the recognition of the click operation and the sliding operation, so that the lock point distance can be set to be a proper distance through multiple tests, which is not specifically limited herein. It will be appreciated that the greater the value of the lock point distance, the more advantageous the recognition of a click operation.
When a user touches a touch area of the display screen, whether touch is detected or not can be determined through a sensor arranged in the electronic device, if touch is detected, a touch position of the touch is obtained, and then a distance between the touch position of the touch and a touch position of a previous touch can be obtained. Specifically, the distance may be calculated by using the position coordinates of the touch point corresponding to the touch position of the touch and the position coordinates of the touch point corresponding to the touch position of the previous touch.
Step S242: and if the distance is smaller than the preset lock point distance, taking the touch position of the previous touch as the touch position of the current touch.
As one manner, after the distance between the touch position of the current touch and the touch position of the previous touch is calculated as described above, the distance is compared with the preset lock point distance. If the distance is smaller than the preset lock point distance, it is considered that the user has not moved the finger; at this time, the touch position of the previous touch may be used as the touch position of the current touch and reported to the cloud server or the background server. As shown in fig. 5, the lock point distance is preset as touch area parameter R1, and the filter parameter is set as touch area parameter R2. If the touch position of the current touch is P1, the touch parameters of the divided area where the current touch is located are acquired, and the distance between the current touch position P1 and the touch position P0 of the previous touch is calculated; if the distance between P1 and P0 is smaller than the touch area parameter R1 (the lock point distance), P0 is reported as the touch position of the current touch to the cloud server or the background server.
Step S243: and if the distance is greater than the lock point distance, filtering the touch position of the touch to obtain an estimated touch position, and taking the estimated touch position as the current touch position.
Specifically, if the distance is greater than the preset distance, the touch will be recognized as a sliding operation, and after the sliding operation is recognized, filtering processing will be performed on the current touch position, and the touch position of the touch obtained after processing will be used as the current touch position. As shown in fig. 5, if the distance from the previous touch position P0 is greater than the touch area parameter R1 (lock point distance), the touch position P1 is filtered to obtain an estimated position P2, and the estimated position P2 is reported to the cloud server or the background server as the touch position of the current touch.
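Steps S241 through S243 together form a small dispatch: keep the previous position when the movement is within the lock-point distance, otherwise filter the new position. A minimal sketch, with the filter left as a pluggable function since the concrete filter is discussed separately:

```python
# Combined sketch of steps S241-S243: compare the distance between the
# current and previous touch positions with the lock-point distance R1.
import math

def report_position(current, previous, lock_point_distance, filter_fn):
    """Return the position to report for the current touch.

    - Distance below the lock-point distance: treated as jitter, the previous
      touch position is reported unchanged (step S242).
    - Otherwise: treated as a slide, the filtered (estimated) position is
      reported (step S243).
    """
    d = math.dist(current, previous)
    if d < lock_point_distance:
        return previous
    return filter_fn(current)
```

With the fig. 5 notation, `current` is P1, `previous` is P0, `lock_point_distance` is R1, and `filter_fn(current)` yields the estimated position P2.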
In order to make the current touch position closer to the actual touch position, the current touch position needs to be filtered. As shown in fig. 6, the curve in fig. 6 is the curve of the user's actual sliding touch, but the touch points detected by the electronic device are the black square points in fig. 6. As can be seen from fig. 6, the detected touch points do not fall on the curve but are distributed on both sides of it, at some distance from the curve; this distance is called noise. If these touch point positions were reported directly to the cloud server or the background server, the user might perceive significant jitter in the current touch. To reduce this jitter, the touch position corresponding to the current touch may first be filtered to obtain the estimated position of the current touch, as shown by the broken line in fig. 6.
When the filtering process is performed on the current touch position, a filtering algorithm with preset filter parameters may be used. Different filter parameter settings achieve different filtering effects: the higher the filter parameter is set, the closer the estimated touch position can be made to the actual sliding curve.
Step S250: and acquiring a touch area corresponding to the touch position.
Step S260: and identifying the touch category corresponding to the touch based on the touch parameter corresponding to the touch area.
For the step S250 and the step S260, reference may be made to the detailed explanation in the above embodiments, which is not described herein in detail in the embodiments of the present application.
The operation identification method includes the steps of firstly, when a specified event is detected to occur, obtaining a plurality of divided areas of a display screen corresponding to the specified event, obtaining touch parameters corresponding to the divided areas, updating the touch parameters of the divided areas into the touch parameters corresponding to the divided areas, then obtaining a touch position of touch if touch is detected, obtaining a touch area corresponding to the touch position, and identifying a touch type corresponding to the touch based on the touch parameters corresponding to the touch area. By the method, different touch areas use different touch parameters to identify the touch operation, different touch requirements in different touch areas can be met, and meanwhile, the accuracy of identifying the touch operation can be improved.
Referring to fig. 7, an operation identification method provided in the embodiment of the present application includes:
step S310: and acquiring historical touch data.
As one mode, the historical touch data is the data collected from touches performed on the touch area of the display screen by all users before the current time, where "all users" means every user who has performed a touch on the touch area of the display screen of the electronic device; this may include the user corresponding to the current touch as well as other users.
After a touch is detected, the electronic device may upload the touch and its corresponding touch position to a background server or a cloud server for storage and analysis, and may later acquire the historical touch data directly from that server. Optionally, when the user's touches and their corresponding touch positions are stored, the touch data of different users may be stored separately, and the time at which each touch was detected may also be stored.
Accordingly, when acquiring the historical touch data, the historical touch data of a designated user, or of a designated user within a designated time period, may be acquired. The usage habits of the designated user when touching the touch area of the display screen can then be analyzed in a targeted manner based on the acquired historical touch data.
Optionally, when the touch data of different users are stored separately, the data may also be stored by application scenario category. For example, the touches performed by a user on the touch area of the display screen may be stored per game scenario, so that the usage habits of different users in a specific game scenario can be analyzed.
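The storage scheme described above (records kept per user and per scenario, with timestamps, queryable for a designated user and time period) can be sketched as follows; the class and field names are hypothetical and only illustrate the structure, not the patent's actual server-side implementation:

```python
import time
from collections import defaultdict


class TouchHistoryStore:
    """Stores touch records separately per user and per scenario category."""

    def __init__(self):
        # (user_id, scene) -> list of touch records
        self._records = defaultdict(list)

    def record(self, user_id, scene, touch_type, position, timestamp=None):
        """Store one touch together with its position and detection time."""
        self._records[(user_id, scene)].append({
            "type": touch_type,
            "position": position,
            "time": timestamp if timestamp is not None else time.time(),
        })

    def query(self, user_id, scene, start=0.0, end=float("inf")):
        """Historical touch data of a designated user in a designated period."""
        return [r for r in self._records[(user_id, scene)]
                if start <= r["time"] <= end]
```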
Step S320: and performing area division on the touch area of the display screen based on the historical touch data to obtain a plurality of divided areas.
As one mode, after the historical touch data is obtained, statistical analysis may be performed on it, and the touch area of the display screen may then be divided according to the result of the statistical analysis to obtain a plurality of divided areas. Specifically, the main touch of each area may be determined from the statistical analysis result, and the area may be defined as the touch area corresponding to that main touch. The main touch is the designated touch with the largest counted number of touches performed on the area.
Step S330: and configuring touch parameters corresponding to the plurality of divided areas.
As one way, when the plurality of divided areas of the touch area of the display screen corresponding to the historical touch data are determined by the above method, touch parameters suitable for responding to the main touch of each area may be configured for the plurality of divided areas. For example, if the touches performed by users on an area of the display screen are mainly click operations, a set of touch parameters suitable for responding to click operations can be set for that area.
Optionally, when configuring the touch parameters corresponding to the plurality of divided regions, a plurality of sets of touch parameters may be set for each divided region in advance, where each set of touch parameters is the touch parameters corresponding to the plurality of divided regions when the electronic device is detected to be in different postures.
When a change in the posture of the electronic device is detected, that is, when the display screen rotates, the corresponding touch parameters can be obtained directly, and the touch category corresponding to the touch can be identified based on those parameters.
Optionally, the touch parameters corresponding to the plurality of divided areas may be configured in real time.
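The per-posture parameter sets described above can be sketched as a lookup table keyed by region and posture. The parameter names follow the quantities discussed in this document (lock point distance, filtering parameter, sliding threshold distance), but the region names and all values are invented for illustration:

```python
# Illustrative parameter tables: one set per (region, posture) pair.
# Values are assumptions for the sketch, not from the patent.
TOUCH_PARAMS = {
    ("region_1", "portrait"):  {"lock_point_px": 10, "filter_alpha": 0.80, "slide_threshold_px": 120},
    ("region_1", "landscape"): {"lock_point_px": 14, "filter_alpha": 0.70, "slide_threshold_px": 160},
    ("region_2", "portrait"):  {"lock_point_px": 6,  "filter_alpha": 0.90, "slide_threshold_px": 100},
    ("region_2", "landscape"): {"lock_point_px": 8,  "filter_alpha": 0.85, "slide_threshold_px": 140},
}


def params_for(region, posture):
    """Select the parameter set matching the current device posture."""
    return TOUCH_PARAMS[(region, posture)]
```

When the display screen rotates, selecting parameters reduces to a fresh lookup with the new posture key.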
Step S340: when the occurrence of a specified event is detected, a plurality of divided areas of the display screen corresponding to the specified event are acquired.
Step S350: and acquiring touch parameters corresponding to the plurality of divided areas.
Step S360: and updating the touch parameters of the plurality of divided areas into the touch parameters corresponding to the divided areas.
Step S370: and if the touch is detected, acquiring the touch position of the touch.
Step S380: and acquiring a touch area corresponding to the touch position.
Step S390: and identifying the touch category corresponding to the touch based on the touch parameter corresponding to the touch area.
The steps S340, S350, S360, S370, S380, and S390 may specifically refer to the detailed explanation in the foregoing embodiments, and are not repeated herein in this embodiment.
The operation identification method includes: firstly, acquiring historical touch data, dividing the touch area of the display screen based on the historical touch data to obtain a plurality of divided areas, and configuring the touch parameters corresponding to the divided areas; then, when a specified event is detected, obtaining the plurality of divided areas of the display screen corresponding to the specified event, obtaining the touch parameters corresponding to the divided areas, and updating the touch parameters of the divided areas to the touch parameters corresponding to each area; finally, if a touch is detected, obtaining the touch position of the touch, obtaining the touch area corresponding to the touch position, and identifying the touch category corresponding to the touch based on the touch parameter corresponding to that touch area. In this way, different touch parameters are configured for different touch areas, so that different touch areas can identify touch operations based on different touch parameters, which satisfies different touch requirements in different touch areas and at the same time improves the accuracy of identifying touch operations.
Referring to fig. 8, an operation identification method provided in the embodiment of the present application includes:
step S410: and acquiring historical touch data.
For step S410, specific reference may be made to the detailed explanation in the foregoing embodiments, which are not described herein again in this embodiment.
Step S420: and sending the historical touch data to a cloud server so that the cloud server can perform area division on the touch area of the display screen based on the historical touch data to obtain a plurality of divided areas.
As one mode, after the historical touch data is acquired, it can be sent to the cloud server, and the cloud server can then perform statistical analysis on the historical touch data to divide the touch area of the display screen into a plurality of divided areas.
Step S430: and configuring touch parameters corresponding to the plurality of divided areas.
As one way, as shown in fig. 9, the configuring the touch parameter corresponding to each of the plurality of divided areas includes:
step S431: and respectively counting the historical touch data of each divided area in the plurality of divided areas to obtain a plurality of touch types corresponding to each divided area and the touch times of each touch type in the plurality of touch types.
The touch types may include long slide, short slide, click, and the like. After the historical touch data is acquired, statistical analysis is performed on it, and the touch area of the display screen can then be divided according to the statistical analysis result to obtain a plurality of divided areas. Specifically, the main touch of each area may be determined from the statistical analysis result, and the touch type of the area may then be defined as the type of that main touch. The main touch is the designated touch with the largest counted number of touches performed on the area.
When performing statistical analysis on touches of different touch types in each area, the number of touches of each touch type can be counted, and the main touch of each area can then be determined from the counted numbers.
Step S432: and taking the touch type corresponding to the touch with the largest touch frequency in each divided area as the touch type of the divided area so as to obtain the touch types corresponding to the plurality of divided areas.
As one mode, the touch type corresponding to the touch with the largest counted number of touches in each divided area is taken as the touch type of that divided area. For example, as shown in fig. 10, statistical analysis of the acquired historical touch data may show that in area 1, 80% of the touches are short slide operations; in area 2, 70% are long slide operations; and in area 3, 65% are click operations. Therefore, the touch type corresponding to area 1 may be defined as short slide, that of area 2 as long slide, and that of area 3 as click; further, area 1 may be defined as a short slide operation area, area 2 as a long slide operation area, and area 3 as a click operation area. Long slide and short slide operations can be distinguished by whether the sliding distance exceeds a set sliding threshold distance. Specifically, the moving distance of a touch is first compared with the lock point distance to judge whether the touch is a sliding operation; once the touch is judged to be a sliding operation, its sliding distance is compared with the sliding threshold distance: if the sliding distance is greater than the sliding threshold distance, the touch is a long slide operation, and if it is less than or equal to the sliding threshold distance, the touch is a short slide operation.
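The classification rules and the majority-vote region typing above can be sketched as follows. This is a minimal illustration: the function names and threshold values are assumptions, and the patent does not specify how a distance exactly equal to the lock point distance is handled (treated as a click here):

```python
from collections import Counter


def classify_touch(distance, lock_point_px, slide_threshold_px):
    """Classify one touch from its travel distance, per the rules above."""
    if distance <= lock_point_px:        # within the lock point: a click
        return "click"
    if distance > slide_threshold_px:    # beyond the slide threshold: long slide
        return "long_slide"
    return "short_slide"                 # a slide, but below the threshold


def region_touch_type(touch_types):
    """Majority vote: the most frequent type becomes the region's touch type."""
    return Counter(touch_types).most_common(1)[0][0]
```

For example, a region whose history is 80% short slides would be typed `"short_slide"` and defined as a short slide operation area.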
Step S433: and configuring the touch parameters corresponding to the plurality of divided areas according to the touch types corresponding to the plurality of divided areas.
As one way, after the touch types corresponding to the plurality of divided areas of the display screen are determined in the above manner, the touch area of the display screen can be divided according to the different touch types, and different touch parameters can then be configured for the different divided areas according to their touch types, so that different touch requirements in the different divided areas can be met simultaneously.
Step S440: when the occurrence of a specified event is detected, a plurality of divided areas of the display screen corresponding to the specified event are acquired.
Step S450: and acquiring touch parameters corresponding to the plurality of divided areas.
For the step S440 and the step S450, reference may be made to the detailed explanation in the above embodiments, which is not described herein in detail in the embodiments of the present application.
Step S460: and updating the touch parameters of the plurality of divided areas into the touch parameters corresponding to the divided areas.
As one mode, when it is detected that the display screen is rotated, area description information corresponding to each of the plurality of divided areas is updated.
Specifically, the area description information is contour information describing an area, for example, the intersection coordinates of its contour lines. In the embodiment of the present application, whether the display screen rotates can be determined by detecting the posture of the electronic device: when the electronic device is detected to change from the landscape posture to the portrait posture, or from the portrait posture to the landscape posture, the display screen is determined to have rotated. When the display screen of the electronic device is detected to rotate, the area description of the touch area is updated according to the rotation direction of the display screen. As shown in fig. 11, when the display screen is detected to rotate from fig. 11a to fig. 11b, the positions and sizes of area A, area B, area C, and area D in the display screen change, so the corresponding rotation calculation needs to be performed on the area description information of area A, area B, area C, and area D in fig. 11a to obtain their arrangement in fig. 11b.
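The rotation calculation on the area description information can be sketched for one common case, a 90-degree clockwise rotation of contour coordinates; the function name and pixel-coordinate convention (origin top-left, y downward) are assumptions for the sketch, and the patent does not fix a particular formula:

```python
def rotate_region_90_cw(contour, screen_w, screen_h):
    """Rotate an area's contour coordinates 90 degrees clockwise.

    `contour` is a list of (x, y) intersection coordinates of the
    area's contour lines in the pre-rotation screen frame of size
    screen_w x screen_h.  After rotation the frame becomes
    screen_h x screen_w, and each point (x, y) maps to
    (screen_h - 1 - y, x).
    """
    return [(screen_h - 1 - y, x) for x, y in contour]
```

Applying this to every area's contour points yields the new arrangement of areas A through D after the portrait-to-landscape change.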
Step S470: and if the touch is detected, acquiring the touch position of the touch.
Step S480: and acquiring a touch area corresponding to the touch position.
Step S490: and identifying the touch category corresponding to the touch based on the touch parameter corresponding to the touch area.
Step S470, step S480, and step S490 may specifically refer to the detailed explanation in the foregoing embodiments, and are not described herein in detail in this embodiment.
According to the operation identification method, historical touch data is first acquired and sent to a cloud server, so that the cloud server divides the touch area of the display screen based on the historical touch data to obtain a plurality of divided areas, and the touch parameters corresponding to the divided areas are configured; then, when a specified event is detected, the plurality of divided areas of the display screen corresponding to the specified event are obtained, the touch parameters corresponding to the divided areas are obtained, and the touch parameters of the divided areas are updated to the touch parameters corresponding to each area; finally, if a touch is detected, the touch position of the touch is obtained, the touch area corresponding to the touch position is obtained, and the touch category corresponding to the touch is identified based on the touch parameter corresponding to that touch area. In this method, the touch area of the display screen is divided into a plurality of different touch areas according to the historical touch data, and different touch parameters are configured for the different touch areas, so that the different touch areas can identify touch operations based on different touch parameters, which satisfies different touch requirements in the different touch areas and at the same time improves the accuracy of identifying touch operations.
Referring to fig. 12, an operation recognition apparatus 500 according to an embodiment of the present application includes:
a position obtaining unit 510, configured to obtain a touch position of the touch if the touch is detected.
As a manner, the position obtaining unit 510 may be configured to obtain a distance between the touch position and a touch position of a previous touch; if the distance is smaller than a preset lock point distance, taking the touch position of the previous touch as the touch position of the current touch; and if the distance is greater than the lock point distance, filtering the touch position of the touch to obtain an estimated touch position, and taking the estimated touch position as the current touch position.
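The position resolution logic of the position obtaining unit can be sketched as follows; `smooth` stands in for whatever filtering function is used (the patent leaves the filter open), and the function name is illustrative:

```python
import math


def resolve_position(current, previous, lock_point_px, smooth):
    """Resolve the reported touch position, per the position unit above.

    `smooth` is any filtering function mapping a raw touch position to
    an estimated one (an assumption here, since the filter is unspecified).
    """
    if previous is None:
        return current           # no previous touch to compare against
    dist = math.dist(current, previous)
    if dist < lock_point_px:
        return previous          # within the lock point: reuse last position
    return smooth(current)       # beyond it: report the filtered estimate
```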
An area obtaining unit 520, configured to obtain a touch area corresponding to the touch position.
An identifying unit 530, configured to identify the touch based on a touch parameter corresponding to the touch area.
Referring to fig. 13, the apparatus 500 further includes:
a parameter updating unit 540, configured to, when it is detected that a specified event occurs, acquire a plurality of divided regions of the display screen corresponding to the specified event; acquiring touch parameters corresponding to the plurality of divided areas; and updating the touch parameters of the plurality of divided areas into the touch parameters corresponding to the divided areas.
As one way, the parameter updating unit 540 may be configured to update the area description information corresponding to each of the plurality of divided areas when the rotation of the display screen is detected.
Referring to fig. 14, the apparatus 500 further includes:
a parameter configuration unit 550, configured to obtain historical touch data; based on the historical touch data, performing area division on the touch area of the display screen to obtain a plurality of divided areas; and configuring touch parameters corresponding to the plurality of divided areas.
As a manner, the parameter configuration unit 550 is further configured to send the historical touch data to a cloud server, so that the cloud server performs area division on the touch area of the display screen based on the historical touch data, so as to obtain the plurality of divided areas.
As another mode, the parameter configuration unit 550 is further configured to count the historical touch data of each of the plurality of divided areas respectively to obtain a plurality of touch types corresponding to each of the divided areas and the number of touches of each of the plurality of touch types; taking the touch type corresponding to the touch with the largest touch frequency in each divided area as the touch type of the divided area to obtain the touch types corresponding to the divided areas; and configuring touch parameters corresponding to the plurality of divided areas according to the touch types corresponding to the plurality of divided areas.
It should be noted that the device embodiment and the method embodiment in the present application correspond to each other, and specific principles in the device embodiment may refer to the contents in the method embodiment, which is not described herein again.
An electronic device provided by the present application will be described below with reference to fig. 15.
Referring to fig. 15, based on the operation recognition method and apparatus, another electronic device 800 capable of executing the operation recognition method is provided in the embodiment of the present application. The electronic device 800 includes one or more processors 802 (only one shown), a memory 804, a touch sensitive display 805, an image processor 806, a camera 807, and a network module 808 coupled to each other. The memory 804 stores programs that can execute the content of the foregoing embodiments, and the processor 802 can execute the programs stored in the memory 804.
Processor 802 may include one or more processing cores. The processor 802 connects various parts of the entire electronic device 800 using various interfaces and lines, and performs the various functions of the electronic device 800 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 804 and invoking data stored in the memory 804. Alternatively, the processor 802 may be implemented in hardware in at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA) form. The processor 802 may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and the like; the GPU is responsible for rendering and drawing display content; and the modem handles wireless communication. It is understood that the modem may also not be integrated into the processor 802 but implemented by a separate communication chip.
The memory 804 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 804 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 804 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing the operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the foregoing method embodiments, and the like. The data storage area may store data created by the terminal 800 during use (such as phone books, audio and video data, and chat log data).
The touch display screen 805 is configured to receive a user's touch operation and respond to it. Touch screens may be mainly classified into capacitive touch screens, resistive touch screens, and surface acoustic wave touch screens.
The camera 807 is used for image acquisition and transmitting the acquired image to the image processor 806 for processing.
The image processor 806 is used to improve picture quality through image enhancement, restoration, and the like; processing includes noise removal, wide-angle distortion correction of digital photos, contrast improvement, red-eye removal, and so on. The image processor 806 can also be used for picture composition, that is, combining multiple pictures to change their content. Picture composition usually requires matting; its main uses are replacing a background via matting or directly merging multiple pictures, and the various fusion modes are all forms of picture composition.
The network module 808 is configured to receive and transmit electromagnetic waves and to convert between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices, for example, an audio playback device. The network module 808 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, and memory. The network module 808 may communicate with various networks such as the internet, an intranet, or a wireless network, or communicate with other devices via a wireless network. The wireless network may be a cellular telephone network, a wireless local area network, or a metropolitan area network. For example, the network module 808 may exchange information with a base station.
Referring to fig. 16, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable medium 900 has stored therein program code that can be called by a processor to perform the method described in the method embodiments above.
The computer-readable storage medium 900 may be an electronic memory such as a flash memory, an EEPROM (Electrically Erasable Programmable Read-Only Memory), an EPROM, a hard disk, or a ROM. Optionally, the computer-readable storage medium 900 includes a non-volatile computer-readable storage medium. The computer-readable storage medium 900 has storage space for program code 910 that performs any of the method steps described above. The program code can be read from or written into one or more computer program products. The program code 910 may be compressed, for example, in a suitable form.
According to the operation identification method, the operation identification device, the electronic equipment and the storage medium, firstly, if touch is detected, a touch position of the touch is obtained, then, a touch area corresponding to the touch position is obtained, and the touch category corresponding to the touch is identified based on the touch parameter corresponding to the touch area. By the method, different touch areas use different touch parameters to identify the touch operation, different touch requirements in different touch areas can be met, and meanwhile, the accuracy of identifying the touch operation can be improved.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. An operation identification method, characterized in that the method comprises:
when a specified event is detected to occur, acquiring a plurality of divided areas of a display screen corresponding to the specified event, wherein the specified event is different game starting events;
acquiring touch parameters corresponding to the plurality of divided areas;
updating the touch parameters of the plurality of divided areas into respective corresponding touch parameters, wherein each divided area corresponds to a plurality of sets of touch parameters, and each set of touch parameters is the touch parameters corresponding to the plurality of divided areas when the electronic equipment is detected to be in different postures;
if touch control is detected, acquiring a touch control position of the touch control;
acquiring the distance between the touch position and the touch position of the previous touch;
if the distance is smaller than a preset lock point distance, taking the touch position of the previous touch as the touch position of the current touch, wherein the lock point distance is a preset distance between two touch points which can distinguish click operation and sliding operation;
if the distance is larger than the lock point distance, filtering the touch position of the touch to obtain an estimated touch position, and taking the estimated touch position as the current touch position;
acquiring a touch area corresponding to the touch position;
and identifying the touch category corresponding to the touch based on the touch parameter corresponding to the touch area.
2. The method of claim 1, wherein before the obtaining the plurality of divided regions of the display screen corresponding to the specific event when the specific event is detected to occur, further comprises:
acquiring historical touch data;
based on the historical touch data, performing area division on the touch area of the display screen to obtain a plurality of divided areas;
and configuring touch parameters corresponding to the plurality of divided areas.
3. The method of claim 2, wherein the performing area division on the touch area of the display screen based on the historical touch data to obtain a plurality of divided areas comprises:
and sending the historical touch data to a cloud server so that the cloud server performs area division on the touch area of the display screen based on the historical touch data to obtain a plurality of divided areas.
4. The method according to claim 2, wherein the configuring the touch parameter corresponding to each of the plurality of divided areas comprises:
respectively counting historical touch data of each divided area in the plurality of divided areas to obtain a plurality of touch types corresponding to each divided area and touch times of each touch type in the plurality of touch types;
taking the touch type corresponding to the touch with the largest number of times of touch in each divided area as the touch type of the divided area to obtain the touch types corresponding to the divided areas;
and configuring touch parameters corresponding to the plurality of divided areas according to the touch types corresponding to the plurality of divided areas.
5. The method according to any one of claims 2-4, further comprising:
and when the display screen is detected to rotate, updating the area description information corresponding to the plurality of divided areas.
6. An operation recognition apparatus, characterized in that the apparatus comprises:
the device comprises a parameter updating unit, a parameter updating unit and a parameter updating unit, wherein the parameter updating unit is used for acquiring a plurality of divided areas of a display screen corresponding to a specified event when the specified event is detected to occur, and the specified event is different game starting events; acquiring touch parameters corresponding to the plurality of divided areas; updating the touch parameters of the plurality of divided areas into corresponding touch parameters, wherein each divided area corresponds to a plurality of sets of touch parameters, and each set of touch parameters is the touch parameters corresponding to the plurality of divided areas when the electronic equipment is detected to be in different postures;
a position acquisition unit, configured to acquire a touch position of the touch if the touch is detected; acquiring the distance between the touch position and the touch position of the previous touch; if the distance is smaller than a preset lock point distance, taking the touch position of the previous touch as the touch position of the current touch, wherein the lock point distance is a preset distance between two touch points which can distinguish click operation and sliding operation; if the distance is larger than the lock point distance, filtering the touch position of the touch to obtain an estimated touch position, and taking the estimated touch position as the current touch position;
the area acquisition unit is used for acquiring a touch area corresponding to the touch position;
and the identification unit is used for identifying the touch category corresponding to the touch based on the touch parameter corresponding to the touch area.
7. An electronic device comprising one or more processors and memory; one or more programs stored in the memory and configured to be executed by the one or more processors to perform the method of any of claims 1-5.
8. A computer-readable storage medium, having program code stored therein, wherein the program code when executed by a processor performs the method of any of claims 1-5.
CN202110396825.XA 2021-04-13 2021-04-13 Operation identification method and device, electronic equipment and storage medium Active CN113138693B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110396825.XA CN113138693B (en) 2021-04-13 2021-04-13 Operation identification method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110396825.XA CN113138693B (en) 2021-04-13 2021-04-13 Operation identification method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113138693A CN113138693A (en) 2021-07-20
CN113138693B true CN113138693B (en) 2023-04-07

Family

ID=76811998

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110396825.XA Active CN113138693B (en) 2021-04-13 2021-04-13 Operation identification method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113138693B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113778255B (en) * 2021-07-29 2023-11-10 荣耀终端有限公司 Touch recognition method and device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011158475A1 (en) * 2010-06-16 2011-12-22 パナソニック株式会社 Information input device, information input method and programme
US10007290B2 (en) * 2010-09-15 2018-06-26 Inventus Engineering Gmbh Haptic operating device with a rotating element and method
US10698533B2 (en) * 2017-09-11 2020-06-30 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method for responding to touch operation and electronic device
CN109126127B (en) * 2018-07-25 2022-07-26 南昌努比亚技术有限公司 Game control method, dual-screen mobile terminal and computer-readable storage medium
CN110362225B (en) * 2019-06-18 2024-01-30 腾讯科技(成都)有限公司 Touch screen sampling frequency control method and device, medium and electronic equipment
CN110502153B (en) * 2019-08-30 2022-11-11 Oppo(重庆)智能科技有限公司 Touch screen point reporting rate adjusting method, touch screen, electronic device and storage medium
CN111459823B (en) * 2020-04-01 2023-05-26 网易(杭州)网络有限公司 Method and device for testing game software, storage medium and processor

Also Published As

Publication number Publication date
CN113138693A (en) 2021-07-20

Similar Documents

Publication Publication Date Title
CN108985212B (en) Face recognition method and device
CN109241859B (en) Fingerprint identification method and related product
WO2019233212A1 (en) Text identification method and device, mobile terminal, and storage medium
CN107590461B (en) Face recognition method and related product
CN109614865B (en) Fingerprint identification method and related product
CN109145800B (en) Fingerprint template updating method and related product
CN104239879B (en) The method and device of separating character
CN108875667B (en) Target identification method and device, terminal equipment and storage medium
CN109376700B (en) Fingerprint identification method and related product
CN111259907B (en) Content identification method and device and electronic equipment
CN112463016B (en) Display control method and device, electronic equipment and wearable display equipment
US20160202948A1 (en) Causation of stylistic mimicry on a companion apparatus
CN113409342A (en) Training method and device for image style migration model and electronic equipment
CN113138693B (en) Operation identification method and device, electronic equipment and storage medium
CN110198421B (en) Video processing method and related product
CN112703534A (en) Image processing method and related product
CN112083858A (en) Method and device for adjusting display position of control
CN109194810B (en) Display control method and related product
CN110597480B (en) Custom voice instruction implementation method and terminal
CN112328895A (en) User portrait generation method, device, server and storage medium
CN110019883A (en) Obtain the method and device of expression picture
CN109376701B (en) Fingerprint identification method and related product
CN113473215B (en) Screen recording method, device, terminal and storage medium
CN110753159B (en) Incoming call processing method and related product
CN113900519A (en) Method and device for acquiring fixation point and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant