CN117406905A - Touch area classification method and device, electronic equipment and computer readable medium


Info

Publication number
CN117406905A
Authority
CN
China
Prior art keywords
touch
area
operation area
type
target
Legal status
Pending
Application number
CN202311433331.XA
Other languages
Chinese (zh)
Inventor
徐洪伟 (Xu Hongwei)
王涛 (Wang Tao)
王慧 (Wang Hui)
魏海军 (Wei Haijun)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202311433331.XA
Publication of CN117406905A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials


Abstract

The application discloses a touch area classification method and device, an electronic device, and a computer readable medium, relating to the technical field of touch screens. The method includes: when the electronic device displays a target application program through a touch screen, acquiring a target image corresponding to a target interface of the target application program; determining a first operation area of the target interface based on a recognition operation on the target image, the first operation area corresponding to a first operation area position; determining a first touch type corresponding to the first operation area; acquiring gesture information corresponding to a touch operation on the target interface; and determining, based on the gesture information, a second operation area corresponding to the touch operation, as well as a second operation area position and a second touch type corresponding to the second operation area. In this way, the positions and types of the operable areas of the target interface are determined by combining image recognition and gesture recognition, making recognition of the operable areas of the target interface more reasonable and accurate.

Description

Touch area classification method and device, electronic equipment and computer readable medium
Technical Field
The present disclosure relates to the field of touch screens, and in particular, to a touch area classification method, a touch area classification device, an electronic device, and a computer readable medium.
Background
With the development of electronic technologies, electronic devices have increasingly rich functions. For example, an electronic device is provided with a touch screen, and a user controls the electronic device by performing touch events on the touch screen, including a press (down) event, a lift (up) event, a slide (move) event, and the like.
Currently, some applications allow a user to adjust the touch parameters of the touch screen while the application is running, thereby adjusting the touch response within the application's interface. However, the current adjustment is global, that is, the touch parameters of the entire interface are adjusted together, which is not sufficiently reasonable.
Disclosure of Invention
The application provides a touch area classification method, a touch area classification device, an electronic device, and a computer readable medium, so as to remedy the above defects.
In a first aspect, the present application provides a touch area classification method applied to an electronic device having a touch screen, the method including: acquiring a target image corresponding to a target interface of a target application program when the electronic device displays the target application program through the touch screen; determining a first operation area of the target interface based on a recognition operation on the target image, wherein the first operation area corresponds to a first operation area position; determining a first touch type corresponding to the first operation area; acquiring gesture information corresponding to a touch operation on the target interface; determining, based on the gesture information, a second operation area corresponding to the touch operation, as well as a second operation area position and a second touch type corresponding to the second operation area; and determining the positions and types of the operable areas corresponding to the target interface based on the first operation area position, the first touch type, the second operation area position, and the second touch type.
In a second aspect, the present application further provides a touch area classification device applied to an electronic device having a touch screen, the device including: a first acquisition unit, a first determination unit, a second determination unit, a second acquisition unit, a third determination unit, and an identification unit. The first acquisition unit is configured to acquire a target image corresponding to a target interface of the target application program when the electronic device displays the target application program through the touch screen. The first determination unit is configured to determine a first operation area of the target interface based on a recognition operation on the target image, wherein the first operation area corresponds to a first operation area position. The second determination unit is configured to determine a first touch type corresponding to the first operation area. The second acquisition unit is configured to acquire gesture information corresponding to a touch operation on the target interface. The third determination unit is configured to determine, based on the gesture information, a second operation area corresponding to the touch operation, as well as a second operation area position and a second touch type corresponding to the second operation area. The identification unit is configured to determine the positions and types of the operable areas corresponding to the target interface based on the first operation area position, the first touch type, the second operation area position, and the second touch type.
In a third aspect, the present application further provides an electronic device, including: one or more processors; a memory; one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications configured to perform the above-described method.
In a fourth aspect, the present application also provides a computer readable storage medium storing program code executable by a processor, the program code when executed by the processor causing the processor to perform the above method.
According to the touch area classification method and device, the electronic device, and the computer readable medium provided by the present application, when the electronic device displays a target application program through the touch screen, a target image corresponding to a target interface of the target application program is acquired; a first operation area of the target interface is determined based on a recognition operation on the target image, the first operation area corresponding to a first operation area position; a first touch type corresponding to the first operation area is determined; gesture information corresponding to a touch operation on the target interface is acquired; a second operation area corresponding to the touch operation, as well as a second operation area position and a second touch type corresponding to the second operation area, are determined based on the gesture information; and the positions and types of the operable areas corresponding to the target interface are determined based on the first operation area position, the first touch type, the second operation area position, and the second touch type. In this way, the positions and types of the operable areas of the target interface are determined by combining image recognition and gesture recognition, making recognition of the operable areas of the target interface more reasonable and accurate.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of operation areas provided by an embodiment of the present application;
FIG. 2 is a flowchart of a touch area classification method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of the detection process of a detection model according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a touch operation on a target interface according to an embodiment of the present application;
FIG. 5 is a schematic diagram of gesture information for the touch operation of FIG. 4;
FIG. 6 is a schematic diagram of a touch operation on a target interface according to another embodiment of the present application;
FIG. 7 is a schematic diagram of gesture information for the touch operation of FIG. 6;
FIG. 8 is a schematic diagram of a touch operation on a target interface according to another embodiment of the present application;
FIG. 9 is a schematic diagram of gesture information for the touch operation of FIG. 8;
FIG. 10 is a schematic diagram of touch types according to an embodiment of the present application;
FIG. 11 is a flowchart of a touch area classification method according to another embodiment of the present application;
FIG. 12 is an overall flow diagram provided by an embodiment of the present application;
FIG. 13 is a schematic diagram of S1180 in FIG. 11;
FIG. 14 is a flowchart of a touch area classification method according to another embodiment of the present application;
FIG. 15 is a schematic flowchart of touch optimization according to an embodiment of the present application;
FIG. 16 is a block diagram of a touch area classification device according to an embodiment of the present application;
FIG. 17 is a block diagram of an electronic device according to an embodiment of the present application;
FIG. 18 illustrates a storage unit for storing or carrying program code implementing the touch area classification method according to an embodiment of the present application.
Detailed Description
In order to better understand the embodiments of the present application, the following description will clearly and completely describe the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
With the development of electronic technologies, electronic devices with touch panels (TP), such as mobile phones and tablet computers, have become widely popularized, and touch technologies have evolved considerably across games and various other touch scenarios. Users therefore have ever stronger requirements for touch performance. Related technologies try to meet these demands by increasing the touch report rate, but the report stability is poor, and the pictures displayed to the user on the touch panel may jitter.
A user controls an electronic device by performing touch events on the touch screen, including a press (down) event, a lift (up) event, a slide (move) event, and the like.
Typically, an application program, such as a game application, needs to set certain operation controls to implement corresponding functions, and different controls trigger different responses when the user touches them on the touch screen. For example, a game scene is typically divided into different areas, the user operates with different gestures in different areas, and these areas often correspond to different operation controls. Different areas also place different operation experience requirements on the operation gestures: some areas emphasize finger-following responsiveness, while others require smoothness, low click latency, accuracy, and so on.
For example, the operation controls present in a game scene interface of a game application include a direction wheel key, a direction auxiliary key, a skill button, a skill auxiliary button, a minimap control, a toolbox control, a view control, and the like, whose corresponding operation areas are, in order, a direction wheel area, a direction auxiliary area, a skill area, a skill auxiliary area, a minimap area, a toolbox area, and a view area. It should be noted that an operation area obtained from an operation control may correspond to an area control type, which characterizes the type of the operation control in the operation area; for example, the direction wheel area, the direction auxiliary area, the skill area, the skill auxiliary area, the minimap area, the toolbox area, the view area, and the like all belong to area control types.
Illustratively, as shown in FIG. 1, a user can operate a game character 300 in a game scene interface. The operation areas within the interface include a direction wheel area 201, a direction auxiliary area 202, a skill area 203, a skill auxiliary area 204, a toolbox area 205, a view area 206, a minimap area 207, and a smooth hot zone 208. The direction auxiliary key corresponding to the direction auxiliary area 202 may follow the touch position of the user's finger; that is, the user touches the direction auxiliary key and slides it within the direction wheel area 201 to control the game character 300 to move in the game scene. The user can trigger the game character to perform operations by clicking keys in the skill area 203, the skill auxiliary area 204, the toolbox area 205, and the view area 206. For example, the skill button in the skill area 203, when triggered, controls the game character 300 to release the skill corresponding to that button, and the skill auxiliary area 204 is used to update a skill in its corresponding skill area, for example, to upgrade the skill. The various toolbox controls within the toolbox area 205 provide auxiliary functions that are generally independent of the game character 300 and are generic functions of the target interface, such as sending voice messages or downloading the contents of the target interface; they are therefore not regarded as skill keys.
The smooth hot zone 208 displays the scene itself, which can be understood as the background relative to the controls in the scene, where the controls are the aforementioned keys and the display frame of the minimap. It should be noted that the smooth hot zone 208 shown in FIG. 1 is only a partial area of the full smooth hot zone, and the size of the smooth hot zone is not limited.
The inventor found in research that existing adjustment has limited applicability in some special areas. For example, for the direction wheel, the position of the game control is offset from the position of the hot zone that is actually active in the game; and the smooth area in a game contains no control area information, so a control-based adjustment approach is flawed. Therefore, the existing way of determining operation areas is not sufficiently reasonable or accurate, which makes optimization based on such area determination too limited.
Referring to fig. 2, a touch area classification method provided in an embodiment of the present application is applied to an electronic device having a touch screen, and includes: s201 to S206.
S201: when the electronic device displays the target application program through the touch screen, acquire a target image corresponding to a target interface of the target application program.
Displaying the target application program on the electronic device through the touch screen means that the target application program is installed in the electronic device, the target application program runs in the foreground of the electronic device, and the interface of the target application program is displayed on the touch screen of the electronic device. The target application program has a touch area in an interface displayed on a touch screen of the electronic device, that is, when the target application program is displayed on the touch screen, a user can input a touch gesture through the touch screen, and the touch gesture triggers the target application program to execute an operation corresponding to the touch gesture.
In one embodiment, whether the target application program is displayed on the touch screen can be determined by identifying the application identifier running in the foreground, or by image recognition of the content displayed on the touch screen of the electronic device. In the latter case, the content displayed on the touch screen is captured at a preset time interval and compared with a reference interface image of the target application program obtained in advance; if the similarity between the displayed content and the reference interface image is greater than a specified threshold, it is determined that the target application program is displayed on the touch screen. The reference interface image may be a representative interface of the target application program, for example, the home page, or the interface with the longest display duration on the touch screen when users operate the target application program, as obtained by statistics.
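A minimal sketch of such a similarity check (the histogram-correlation measure, threshold value, and function name are illustrative assumptions; the embodiment does not prescribe a specific similarity metric):

```python
import cv2
import numpy as np

def is_target_app_displayed(screenshot: np.ndarray,
                            reference: np.ndarray,
                            threshold: float = 0.8) -> bool:
    """Compare the current screen content against a pre-stored
    reference interface image of the target application."""
    # Resize the reference so both images are comparable.
    ref = cv2.resize(reference, (screenshot.shape[1], screenshot.shape[0]))
    # Compare grayscale histograms as a cheap similarity measure.
    h1 = cv2.calcHist([cv2.cvtColor(screenshot, cv2.COLOR_BGR2GRAY)],
                      [0], None, [64], [0, 256])
    h2 = cv2.calcHist([cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY)],
                      [0], None, [64], [0, 256])
    similarity = cv2.compareHist(h1, h2, cv2.HISTCMP_CORREL)
    return similarity > threshold
```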
In another embodiment, the method provided by the embodiment of the present application may be performed by a setting module in the electronic device. When the target application is displayed on the touch screen, the target application may send a specific message to the setting module, and the setting module determines from that message that the target application is displayed on the touch screen.
It should be noted that the target image corresponding to the target interface of the target application program may be acquired as follows: when it is detected that the electronic device displays the target application program through the touch screen, the interface that the target application program displays on the touch screen is examined, and if that interface is the target interface, the target image of the target interface is determined. The target interface may be the interface, among the multiple interfaces of the target application program, that places higher requirements on touch operations, and may be a pre-selected interface. For example, an interface of a specified type of the target application program may serve as the target interface; the specified type may be a frequently-operated type, and interfaces of the specified type may be predefined by the user or by the developer of the target application program, which is not limited here. Taking a game application as an example, the target interface may be the game scene interface of the game application: because the user needs to operate a character in the game scene interface, touch operations there have higher requirements, and the touch response of the game scene interface can therefore be optimized. Of course, the target interface may also be the current interface of the target application, which is not limited here.
The target application may be an application that matches an entry in a preset whitelist, which may be user-defined or predefined by the developer of the electronic device. After a user of the electronic device updates the whitelist, a server may synchronize the whitelist to other electronic devices. Generally, the applications in the whitelist are those on which users perform touch operations frequently, which is not limited here. In the embodiment of the present application, the target application may be a game application.
S202: determine a first operation area of the target interface based on a recognition operation on the target image, wherein the first operation area corresponds to a first operation area position.
As described above, different operation controls are distributed on the target interface, and through the image recognition mode, the operation areas corresponding to the different operation controls can be determined and recorded as the first operation area.
For example, the images in different operation regions of the target interface are generally different for the target interface, and as shown in fig. 1, the graphics or displayed contents of the keys in the respective regions are different from each other, so that a plurality of operation regions in the target image can be determined by image recognition of the respective regions. Of course, the respective operation areas may also be determined by recognizing the respective keys in the target image.
It should be noted that the target image may be a screenshot of the target interface, where the screenshot includes at least part of an operation area of the target interface. For example, the target image may be a full screenshot of the target interface, that is, an image containing the entire target interface; this full screenshot may be referred to as the display image. The target image may also be the part of the display image other than the background image.
As an embodiment, the operation area of the target interface may be determined by identifying the control of the target image, and the obtained operation area may be named as a first operation area, that is, the operation area obtained by the image identification operation may be named as a first operation area, and after the first operation area is identified, the area position (named as a first operation area position) and the touch type (named as a first touch type) corresponding to the first operation area may be determined.
As an embodiment, a detection model may be pre-trained that is capable of identifying controls within each interface. For example, a sample set may be obtained in advance, where the sample set includes images corresponding to the interfaces, and controls in the images are labeled, and training the detection model by the sample set enables the detection model to have the capability of identifying the controls in the images.
In addition, in order to increase the rate of model detection, the detection model may be deployed in an electronic device, and in order to reduce power consumption of the electronic device while taking into account processing speed and memory occupation, the detection model may be a lightweight model, for example, YOLO.
YOLO (You Only Look Once) is a target detection algorithm that achieves real-time target detection through a single neural network model. In YOLO, a candidate box (bounding box) refers to a rectangular box used to identify and locate a target object. In the target detection process of YOLO, an input image is first divided into a plurality of grids, and each grid is responsible for detecting whether a target object exists in the grid. For each grid, a plurality of candidate boxes are predicted, each candidate box consisting of a bounding box and a corresponding confidence score. A candidate box is generally defined as a bounding box that encloses an object and is represented by its coordinates in the upper left and lower right corners. The candidate boxes, in addition to providing location information, also estimate a probability score for containing the target object. The probability score is a confidence level, that is, the confidence level of the detection frame is used to characterize the probability that the target control exists in the detection frame.
In this embodiment, the detection model is taken as a YOLO model as an example, and the identification manner is shown in fig. 3.
First, a screenshot of the game screen, that is, the target image, is input to the detection model. The resolution of a screenshot is large, typically around 2K. To reduce algorithm complexity and processing time, the image is scaled and its short side is padded, ultimately yielding a 768×768 image. The image is then normalized, which preserves the RGB information of the image, makes differences between icons easier to distinguish, and facilitates network inference.
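A minimal preprocessing sketch under these assumptions (the 768×768 target size comes from the text; the gray padding value, top-left placement, and [0, 1] normalization are assumptions borrowed from common YOLO pipelines, not details fixed by this description):

```python
import cv2
import numpy as np

def preprocess(screenshot: np.ndarray, size: int = 768):
    """Scale the long side to `size`, pad the short side, normalize to [0, 1]."""
    h, w = screenshot.shape[:2]
    scale = size / max(h, w)
    resized = cv2.resize(screenshot, (int(round(w * scale)), int(round(h * scale))))
    # Pad the short side with a neutral gray so the input is square.
    canvas = np.full((size, size, 3), 114, dtype=np.uint8)
    canvas[:resized.shape[0], :resized.shape[1]] = resized
    # Normalize while keeping the RGB channel information.
    tensor = canvas[:, :, ::-1].astype(np.float32) / 255.0  # BGR -> RGB
    # `scale` is kept so predicted boxes can be remapped to the original image.
    return tensor, scale
```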
A deep learning network for game control detection is trained, which typically includes a feature extraction network and a detection network. The feature extraction network performs deep feature extraction on the normalized image to obtain feature maps at different downsampling scales. The detection network then detects target boxes and performs category regression on specific feature maps. To ensure real-time performance, a lightweight detection network, such as YOLO-V7, may be selected.
As shown in FIG. 3, the head (detection network) of YOLO-V7 predicts a plurality of candidate boxes, each corresponding to a confidence as described above, and candidate boxes whose confidence exceeds a confidence threshold are then sought. Illustratively, the identified candidate boxes typically overlap, so each detection box is first screened by confidence to eliminate falsely identified candidate boxes. All remaining candidate boxes are then processed with non-maximum suppression to ensure that multiple candidate boxes do not appear for the same target. The candidate boxes are then remapped back to the original screenshot, that is, the target image, and finally the coordinates and category of each control icon on the target image, namely the first operation area position, are output.
Non-maximum suppression (NMS) is a technique used in YOLO to filter out overlapping candidate boxes and obtain the final detection result. The goal of NMS is to select the candidate box most likely to contain the real object and filter out other candidate boxes that overlap it to a high degree. Typically, non-maximum suppression in YOLO proceeds as follows. For an input image, the YOLO model predicts a plurality of candidate boxes together with their confidence scores and categories. A confidence threshold is set, and only candidate boxes whose confidence score exceeds the threshold are retained; the threshold is usually determined according to actual requirements and model performance. Then, for the candidate detection boxes corresponding to each control, the following steps are executed: sort the boxes by confidence score and take the highest-scoring box as the initial selection; compute the overlap between the initial selection and each remaining candidate box; filter out, that is, remove from the candidate list, every box whose overlap with the initial selection exceeds a preset threshold; and repeat these steps on the remaining candidate boxes until all candidate boxes corresponding to the target control have been processed. Finally, a unique detection box, the target detection box, is obtained for each screened target control. By using non-maximum suppression, redundant detection boxes can be effectively reduced in the YOLO algorithm, improving the accuracy and efficiency of target detection.
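A self-contained sketch of the confidence filtering and non-maximum suppression steps described above (the array layout and threshold values are assumptions for illustration):

```python
import numpy as np

def nms(boxes: np.ndarray, scores: np.ndarray,
        conf_thresh: float = 0.5, iou_thresh: float = 0.45) -> list:
    """boxes: (N, 4) as [x1, y1, x2, y2]; scores: (N,) confidence scores.
    Returns indices of the kept candidate boxes."""
    # Step 1: drop candidate boxes below the confidence threshold.
    idxs = np.where(scores > conf_thresh)[0]
    # Step 2: sort the remaining boxes by confidence, highest first.
    idxs = idxs[np.argsort(-scores[idxs])]
    kept = []
    while idxs.size > 0:
        best = idxs[0]          # highest-scoring box becomes the initial selection
        kept.append(int(best))
        if idxs.size == 1:
            break
        rest = idxs[1:]
        # Intersection-over-union between the chosen box and the rest.
        x1 = np.maximum(boxes[best, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[best, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[best, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[best, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_best = (boxes[best, 2] - boxes[best, 0]) * (boxes[best, 3] - boxes[best, 1])
        area_rest = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_best + area_rest - inter)
        # Keep only boxes whose overlap with the chosen box is low enough.
        idxs = rest[iou <= iou_thresh]
    return kept
```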
S203: determine a first touch type corresponding to the first operation area.
As an implementation, each first operation area of the target interface can be obtained through the aforementioned image recognition; then, based on the control type in each first operation area and a mapping relationship between control types and touch types, the first touch type corresponding to each first operation area is determined. It should be noted that the operation control in a first operation area may support multiple operation manners; for example, the minimap control in the minimap area may support both clicking and sliding, in which case the corresponding first touch type also includes both click and slide. In addition, the mapping relationship may be determined by analyzing the touch manners of different types of controls in advance, as sketched below, which is not limited here.
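As an illustrative aside, a sketch of such a mapping (the control names and supported touch types here are assumptions drawn from the examples in this description, not values fixed by the method):

```python
# Hypothetical mapping from recognized control type to supported touch types.
CONTROL_TO_TOUCH_TYPES = {
    "direction_wheel":  {"directional_slide"},
    "direction_assist": {"directional_slide"},
    "skill_button":     {"click", "short_slide"},
    "skill_assist":     {"click"},
    "minimap":          {"click", "short_slide"},  # may support both, per game type
    "toolbox":          {"click"},
    "view_control":     {"click", "short_slide"},
}

def first_touch_type(control_type: str) -> set:
    """Look up the first touch type(s) for a recognized first operation area."""
    return CONTROL_TO_TOUCH_TYPES.get(control_type, {"click"})
```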
S204: acquire gesture information corresponding to a touch operation on the target interface.
When the electronic device displays the target application program through the touch screen, a touch operation by the user on the target interface of the target application program is detected. After it is determined that the electronic device displays the target application program through the touch screen, and when the target interface of the target application program is determined to be displayed on the touch screen, touch operations acting on the target interface are detected. When a touch operation is detected, the touch information corresponding to it is recorded; the touch information may include a plurality of pieces of touch position information of the touch operation and a timestamp corresponding to each piece of touch position information. For example, the touch position information and corresponding timestamps are sampled at a preset sampling frequency, so that the touch information of the touch operation is recorded for subsequent operations.
As one embodiment, the gesture information is used to characterize gesture characteristics of the touch operation, where the gesture characteristics may include touch duration, trajectory, direction change, and the like. As described above, after the electronic device detects the touch operation, the touch information of the touch operation is recorded, so that information such as the touch duration and the coordinates of the touch operation can be obtained, and the gesture information of the touch operation can be derived from the touch information. In this embodiment of the present application, the gesture information corresponding to a touch operation on the target interface includes the touch duration of the touch operation and a plurality of pieces of touch position information within that duration. The touch duration may be accumulated from the moment the user's finger presses on the target interface until the finger lifts off the target interface; that is, a start time and an end time are recorded, and the time difference between them is taken as the touch duration. Therefore, in this embodiment, the gesture information corresponding to the touch operation may be obtained when the touch operation is detected to end, that is, when the finger corresponding to the finger ID of the touch operation lifts off the touch screen: at that point the touch duration and the plurality of pieces of touch position information within the duration are determined.
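A minimal sketch of how such per-finger touch information might be accumulated between press and lift (the class and method names are illustrative assumptions, not the patent's implementation):

```python
import time
from collections import defaultdict

class TouchRecorder:
    """Accumulates per-finger touch samples; emits gesture info on lift."""
    def __init__(self):
        self.tracks = defaultdict(list)  # finger_id -> [(t, x, y), ...]

    def on_down(self, finger_id, x, y):
        # Press (down) event starts a fresh track for this finger ID.
        self.tracks[finger_id] = [(time.monotonic(), x, y)]

    def on_move(self, finger_id, x, y):
        # Slide (move) events append sampled touch positions.
        self.tracks[finger_id].append((time.monotonic(), x, y))

    def on_up(self, finger_id, x, y):
        # Lift (up) event closes the track and yields the gesture information.
        track = self.tracks.pop(finger_id)
        track.append((time.monotonic(), x, y))
        duration = track[-1][0] - track[0][0]       # press-to-lift duration
        positions = [(px, py) for _, px, py in track]
        return {"finger_id": finger_id, "duration": duration,
                "positions": positions}
```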
It should be noted that, after the touch operation is detected, an identity identifier may be assigned to the touch operation, and the gesture information of the touch operation is stored in association with this identifier; the identifier of the touch operation is the finger ID corresponding to the touch operation.
In one embodiment, the gesture information may be a time-series feature vector containing a plurality of touch coordinates, each corresponding to a time sequence number; the vector is obtained by sorting the touch coordinates by sequence number. In addition, the time-series feature vector corresponds to the finger ID of the touch operation, so the gesture information corresponding to a given finger ID can be obtained.
S205: determine, based on the gesture information, a second operation area corresponding to the touch operation, as well as a second operation area position and a second touch type corresponding to the second operation area.
As shown in FIG. 1, the target interface contains different operation controls: some controls need to be clicked and others need to be slid, and the sliding manners may also differ, for example in sliding distance, direction, trajectory, and touch duration. The control characteristics of controls of different operation types on the target interface can therefore be analyzed in advance, and the touch characteristics of the operation areas corresponding to the operation controls can be summarized. When a user operates an operation control, the gesture information of the touch operation generally matches that control; for example, if the control is a click control, the gesture information of a touch operation on it generally matches the gesture characteristics of a click operation. Therefore, the target interface can be analyzed in advance to obtain the distinct operation characteristics of each type of control, and each operation characteristic serves as a reference characteristic of the target interface. By finding the reference characteristic that matches the gesture information of a touch operation, the operation type of the operation area corresponding to that reference characteristic is determined, yielding the touch type of the target operation area corresponding to the touch operation.
It should be noted that the first touch type described above is determined from the types of the operation controls in each operation area of the target interface and characterizes the type of touch manner generally supported by the operation controls in the first operation area, whereas the second touch type described here characterizes the touch-gesture features of the operation area corresponding to each control and reflects the touch manner when the user actually operates the target interface.
As an embodiment, as shown in FIG. 4, taking the target application program as a game application and the target interface as a game scene interface, suppose the user clicks a control, for example a skill key: clicking the skill key once controls the game character to release the corresponding skill. The action area of the clicking operation shown in FIG. 4 is the click hot zone 103 in FIG. 1. Referring to FIG. 5, which shows the gesture information, that is, the coordinate change information, of the user's touch operation on the click hot zone, the operation area type of the control corresponding to the clicking operation may be a click type. It can be seen that the operation characteristic of a click operation, that is, of a click-type area, is a fixed pressing position: the coordinates are relatively fixed and their variation range is small, meaning that among the multiple touch points collected during one click operation, the distances between touch coordinates vary little. In addition, a click operation has the characteristic of a short operation time: the duration of one click (the period from finger press to lift) is usually several tens of milliseconds, much shorter than that of a slide or long-press operation.
As shown in FIG. 6, again taking a game application and its game scene interface as an example, suppose the user slides in the minimap area to change the visible content of the game scene. An operation in the minimap area tends to be a short slide in one direction to switch the field of view, for example to look at a certain direction around the game character. As can be seen from FIG. 7, the sliding operation on the minimap area is a sliding gesture whose track is generally close to a line and whose sliding direction is generally fixed; the touch type of the operation area corresponding to this control may therefore be named the short-slide type, that is, the touch type of the minimap area, the jitter hot zone, is the short-slide type. The operation characteristics of a short-slide-type area are strong directionality, a sliding track generally close to a straight line, and a touch time longer than that of the click type. It should be noted that the touch type of the jitter hot zone may also include the click type, depending on the type of the target application: for example, the minimap area of a first game type supports both sliding and clicking, that of a second game type supports clicking but not sliding, and that of a third game type supports sliding but not clicking, which is not limited here.
In addition to the jitter hot zone of the minimap illustrated in FIG. 6, other areas also belong to the short-slide type, for example the smooth hot zone and the operation areas corresponding to some skill keys. It can be understood that the user may also switch the field of view by inputting a sliding gesture in the smooth hot zone, so sliding gestures in the smooth hot zone and in the minimap's jitter hot zone share the short-slide characteristic type. Likewise, the user may press a skill key and slide to an avatar to release the skill on the character corresponding to that avatar; such an operation has the characteristics of the short-slide type, so the operation areas corresponding to these skill keys also belong to the short-slide type.
It will be appreciated that the short-slide type is directional: the manner of the short-slide gesture is relatively fixed. For skill release, for example, the user typically first presses a skill key, then slides a short distance in the target direction and lifts the finger. As another example, a slide on the minimap is generally a left-right or up-down slide.
As shown in FIG. 8, again taking a game application and its game scene interface as an example, suppose the user slides in the operation area corresponding to the direction key, that is, inputs a sliding gesture in the sliding hot zone, so as to control the movement of the game character. In general, when controlling a character's movement, the user controls it for a long time (relative to the aforementioned short-slide type); that is, the period from the moment the sliding gesture is pressed down to the moment it is lifted is longer than for the short-slide type, and because the movement direction is generally not fixed, the sliding track is also relatively irregular. For example, as shown in FIG. 9, the sliding direction of a sliding gesture in the operation area corresponding to the direction key is generally not fixed. The touch type of this operation area may be named the directional-slide type; that is, the touch type of the operation area corresponding to the direction key (the sliding hot zone) is the directional-slide type, whose operation characteristics are a long operation time, a large action track, and no fixed direction. In other words, the touch time of the directional-slide type is generally longer than that of the short-slide type, the sliding range of its track is large, and the sliding direction is not fixed.
Thus, as can be seen from the above examples, the operation characteristics of the operation gestures for different touch types are different, and such operation characteristics often correspond to the operation areas of different controls, so the operation characteristics can reflect the characteristics of the touch operations for different controls. Based on different operation characteristics expressed by different touch types, the operation characteristics of the touch operation can be determined through gesture information of the touch operation, and then the touch type matched with the touch operation can be determined, so that the touch type of a target operation area corresponding to the touch operation can be determined.
For example, if the gesture information includes a touch duration corresponding to the touch operation and a plurality of pieces of touch position information within the touch duration, the implementation of determining the second touch type corresponding to the touch operation is:
If the touch duration is less than or equal to a first time threshold and the touch position change range is smaller than a first range threshold, the second touch type of the second operation area corresponding to the touch operation is determined to be the click type. The touch position change range may be determined from the touch position information within the touch duration. As described above, the operation characteristics of a click operation, that is, of a click-type operation area, are a fixed pressing position and a short operation time; therefore, after acquiring the touch duration and the touch position change range of the target touch operation, it is determined whether the touch duration matches the short-operation-time characteristic and whether the touch position change range matches the fixed-pressing-position characteristic.
If the touch duration is greater than the first time threshold and the touch position change range is greater than the first range threshold, the touch type of the target operation area corresponding to the target touch operation is determined, based on target information, to be a preset sliding type, where the preset sliding types include the short-slide type and the directional-slide type. Determining the preset sliding type based on the target information may be implemented as follows:
in the first manner, there is a large difference between the touch duration of the short slide type and the touch duration of the direction slide type, for example, the touch duration of the direction slide type is greater than the touch duration of the short slide type. Then, if the target information includes a touch duration, an implementation of determining, based on the target information, that the second touch type of the second operation area corresponding to the touch operation is the preset sliding type is: if the touch duration is greater than a first time threshold and less than a second time threshold, determining that a second touch type of a second operation area corresponding to the touch operation is a short slide type; and if the touch duration is greater than or equal to a second time threshold, determining that a second touch type of a second operation area corresponding to the touch operation is a direction sliding type. The first time threshold may be determined by referring to the foregoing description, and the second time threshold may be determined by a touch duration based on an operation gesture of the user on the short-slide type operation area, for example, the second time threshold may be a maximum touch duration of the operation gesture of the user on the short-slide type operation area. Therefore, the short slide type and the directional slide type can be distinguished by the second time threshold.
In the second manner, it is assumed that the short-slide type and the directional-slide type differ greatly in at least one of sliding direction, sliding track, and touch position change range; that is, other information related to the change of touch coordinates may be used to distinguish them. This at least one of sliding direction, sliding track, and touch position change range is named the sliding information. Determining, based on the target information, that the second touch type of the second operation area corresponding to the touch operation is a preset sliding type may then be implemented as follows: if the sliding information satisfies a first preset sliding condition, the second touch type is determined to be the short-slide type; if the sliding information satisfies a second preset sliding condition, the second touch type is determined to be the directional-slide type. The first and second preset sliding conditions may be set based on the actual operation characteristics of the short-slide type and the directional-slide type, and their settings differ for different embodiments of the sliding information. Accordingly, the manner of determining whether the sliding information satisfies the first or second preset sliding condition differs with the embodiment of the sliding information; specifically, the second manner may be divided into the 2.1st, 2.2nd, 2.3rd, and 2.4th manners.
In an exemplary 2.1st manner, the sliding information includes the sliding direction. The sliding information is determined to satisfy the first preset sliding condition if the sliding direction is single, and the second preset sliding condition if the sliding direction is changeable. The sliding direction can be determined by curve-fitting the plurality of pieces of touch position information within the touch duration. As described above, the sliding direction of the short-slide type tends to be relatively single, while that of the directional-slide type tends to be relatively complex, so the two can be distinguished by the sliding direction.
In an exemplary 2.2nd manner, the sliding information includes the sliding track. The sliding information is determined to satisfy the first preset sliding condition if the sliding track matches a first track, and the second preset sliding condition if the sliding track matches a second track. The sliding track can also be determined by curve-fitting the touch position information within the touch duration. As described above, the sliding track of the short-slide type differs greatly from that of the directional-slide type: the former is often a straight line, while the latter is often a disordered line segment. The sliding tracks corresponding to the short-slide type may therefore be counted in advance as first tracks; for example, the sliding tracks of users' sliding gestures in short-slide-type operation areas can be gathered from historical operation data within a preset period, and the N most frequent tracks taken as the first tracks. The second tracks may be obtained in the same way.
In an exemplary 2.3rd manner, the sliding information includes the touch position change range. The sliding information is determined to satisfy the first preset sliding condition if the touch position change range is greater than the first range threshold and less than a second range threshold, and the second preset sliding condition if the touch position change range is greater than the second range threshold. As described above, the touch position change range of the directional-slide type differs greatly from, and is larger than, that of the short-slide type.
Since the sliding direction of the directional slide type is not fixed, the Euclidean distance between the start position information and the end position information may be small for both the short slide type and the directional slide type. Therefore, to distinguish the two more easily, the touch position change range may be determined from the coverage area of the plurality of pieces of touch position information described above, that is, the area of the bounding rectangle.
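A minimal Python sketch of manner 2.3, under the assumption above that the touch position change range is taken as the area of the axis-aligned bounding rectangle of the sampled touch coordinates; the threshold parameters are assumed, device-dependent tuning values.

```python
def touch_change_range(points):
    # Area of the axis-aligned rectangle covering all sampled (x, y)
    # touch positions, used as the touch position change range.
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

def classify_by_range(points, first_range, second_range):
    # Manner 2.3: between the two range thresholds -> first preset
    # sliding condition (short slide); above the second threshold ->
    # second preset sliding condition (directional slide).
    area = touch_change_range(points)
    if first_range < area < second_range:
        return "short_slide"
    if area > second_range:
        return "directional_slide"
    return "undetermined"
```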
Illustratively, manner 2.4 combines at least two of manners 2.1, 2.2 and 2.3. For example, if manner 2.4 combines manners 2.1 and 2.2, whether the sliding information satisfies the first or second preset sliding condition is determined as follows: if the sliding direction is single and the sliding track matches the first track, the sliding information satisfies the first preset sliding condition; and if the sliding direction changes and the sliding track matches the second track, the sliding information is determined to satisfy the second preset sliding condition. Other combinations follow the same pattern as the combination of manners 2.1 and 2.2 and are not described again here.
The third mode is a combination of the first mode and the second mode; that is, the target information includes the touch duration and the sliding information, and the preset sliding type includes the short slide type and the directional slide type. The implementation of determining, based on the target information, that the second touch type of the second operation area corresponding to the touch operation is the preset sliding type is: if the touch duration is greater than a first time threshold and less than a second time threshold and the sliding information satisfies the first preset sliding condition, determining that the second touch type of the second operation area corresponding to the touch operation is the short slide type; and if the touch duration is greater than or equal to the second time threshold and the sliding information satisfies the second preset sliding condition, determining that the second touch type of the second operation area corresponding to the touch operation is the directional slide type.
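The combined third mode might be sketched as follows in Python, where the two boolean flags stand for "the sliding information satisfies the first/second preset sliding condition" as judged by any of manners 2.1 to 2.4; the time thresholds are assumed parameters.

```python
def classify_slide(duration_ms, sliding_ok_first, sliding_ok_second,
                   first_time_ms, second_time_ms):
    # Third mode: the touch duration and the sliding information must
    # both agree before a slide type is assigned.
    if first_time_ms < duration_ms < second_time_ms and sliding_ok_first:
        return "short_slide"
    if duration_ms >= second_time_ms and sliding_ok_second:
        return "directional_slide"
    return None  # fall through to other checks, e.g. click detection
```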
In addition, as described above, the gesture information may be a time-series feature vector, which includes a plurality of pieces of touch position information corresponding to the target touch operation.
Illustratively, the touch position information refers to touch coordinates. Feature information of each frame is extracted, including, without limitation, the finger ID, the touch coordinates, and the game frame setting, to construct the time-series feature vector. It will be appreciated that while a touch acts on the screen, the system continuously collects and updates information about the touch point at a certain frequency. These successively sampled data form one frame of the touch signal, also referred to as one frame of touch events, which represents the state and properties of the touch points over the continuous time of that frame. Each touch frame typically contains the following information: a frame time stamp recording the acquisition time of the frame data, and the number of touch points in the current frame, that is, the number of fingers touching the screen simultaneously. In this embodiment, the target touch operation corresponds to the touch operation of one finger ID. The game frame setting may be relevant setting information extracted from the current game frame, such as the game scene, game state, game character, and game interface layout. The feature information is then combined into a time-series signal feature. The feature sequence may be partitioned using time windows or fixed time intervals; for example, features may be extracted every 100 milliseconds. The finger ID and the touch coordinates may be taken as part of the feature sequence and arranged in time order, thereby obtaining the time-series feature vector.
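A sketch, in Python, of assembling such a per-frame feature sequence for one finger ID; the exact feature layout, the field names, and the 100-millisecond window are illustrative assumptions consistent with the description above rather than a fixed format.

```python
from dataclasses import dataclass

@dataclass
class TouchFrame:
    timestamp_ms: int   # frame time stamp
    finger_id: int      # finger the sample belongs to
    x: float            # touch coordinates
    y: float
    scene_id: int       # coarse game-frame setting, e.g. a scene label

def build_sequence(frames, finger_id, window_ms=100):
    """Arrange the per-frame features of one finger ID into a
    time-ordered feature sequence, keeping one sample per window."""
    frames = sorted((f for f in frames if f.finger_id == finger_id),
                    key=lambda f: f.timestamp_ms)
    sequence, next_t = [], None
    for f in frames:
        if next_t is None or f.timestamp_ms >= next_t:
            sequence.append([f.x, f.y, float(f.scene_id)])
            next_t = f.timestamp_ms + window_ms
    return sequence  # shape: (time steps, features per step)
```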
As an implementation manner, the implementation manner of determining the touch type of the target operation area corresponding to the target touch operation based on the gesture information may be that the time sequence feature vector is input into a pre-trained classification model, and the touch type of the target operation area corresponding to the target touch operation is determined through the classification model.
It should be noted that the time-series signal formed by the feature information collected in each frame is directly input into the LSTM algorithm module. The LSTM algorithm has long-term memory capability; that is, by introducing a gating mechanism, it can effectively capture and store long-term dependencies. Through the forget gate, input gate and output gate, information can be selectively forgotten, added or output, so that relevant historical information is better preserved and utilized when processing sequence data. Therefore, even the time-series feature vector of the currently processed frame may refer to historical time-series feature vectors. In addition, the LSTM algorithm is not affected by an overlong time-series signal caused by the user pressing for a long time, because it can process the signal frame by frame.
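For illustration, a minimal classifier of this kind could look as follows using the PyTorch library; the layer sizes, the three-dimensional per-frame feature, and the three output classes (click, short slide, directional slide) are assumptions, not the trained model of this embodiment.

```python
import torch
import torch.nn as nn

class GestureLSTM(nn.Module):
    """Minimal LSTM gesture classifier over a per-frame feature
    sequence (sketch only). The LSTM's gating lets it keep long-range
    context and process frames one by one, so the sequence length may
    vary with how long the user keeps pressing."""
    def __init__(self, n_features=3, hidden=64, n_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, seq):                 # seq: (batch, time, features)
        out, _ = self.lstm(seq)
        return self.head(out[:, -1, :])     # classify from the last step

# Usage with a dummy 25-step sequence of 3 features per step:
# logits = GestureLSTM()(torch.randn(1, 25, 3))
```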
Therefore, the target interface can be divided into a directional sliding area, a click area and a short sliding area through the gesture information of touch operations. As shown in fig. 10, the area corresponding to the bold solid line frame in fig. 10 is a click area, the area corresponding to the bold dashed line frame is a directional sliding area, and the area corresponding to the non-bold dashed line frame is a short sliding area. The above division of the game area is only an abstraction and does not limit the touch behavior of the user; for example, the user may still click or slide short within the sliding area.
It should be noted that the second operation area corresponding to the touch operation may be the area corresponding to the finger ID currently acting on the touch screen. If the second touch type corresponding to the touch operation is the click type, the position of the second operation area corresponding to the touch operation is determined as follows: any piece of touch position information acquired within the touch duration of the touch operation is used as a reference point, and a target operation area, which is the second operation area, is determined based on the reference point. Specifically, an area within a preset range of the reference point is used as the target operation area, for example a circular area with the reference point as its center and a specified number of pixels as its radius. In this embodiment of the present application, the reference point may be the start touch point of the touch operation. If the second touch type corresponding to the touch operation is the directional slide type or the short slide type, all the touch areas within the touch duration of the touch operation are integrated into a new area as the second operation area.
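The derivation of the second operation area just described might be sketched as follows; the 40-pixel radius and the bounding-rectangle representation of the integrated slide area are illustrative assumptions.

```python
def second_operation_area(points, touch_type, radius_px=40):
    """Sketch: a click yields a circle around the reference point (here
    the start touch point); a slide yields the bounding box of all
    touch positions within the touch duration."""
    if touch_type == "click":
        cx, cy = points[0]                  # start touch point
        return ("circle", (cx, cy), radius_px)
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return ("rect", (min(xs), min(ys), max(xs), max(ys)))
```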
S206: and determining the position and type of the operable area corresponding to the target interface based on the first operation area position, the first touch type, the second operation area position and the second touch type.
It can be understood that the operable area is the finally determined touch operation area of the target interface, and the determined position and type of the operable area can be used for the later touch optimization of the target interface. In this embodiment, the operable area may also be named an application hot zone.
Based on the foregoing steps, the first operation area position and the first touch type corresponding to the first operation area of the target interface may be determined by recognizing the controls in the target image of the target interface. Then, when a touch operation acting on the target interface is detected, the second operation area corresponding to the touch operation, together with its second operation area position and second touch type, can be determined based on the gesture information of the touch operation. Therefore, if the touch operation acts on a first operation area, the first operation area and the second operation area can verify each other to obtain the position and type of the operable area corresponding to the target interface, making the operable area more accurate; and if the touch operation does not act on any first operation area, it can supplement the operation areas determined by image recognition, making the operable areas of the target interface more comprehensive.
Therefore, the position and the type of the operable area corresponding to the target interface can be determined by combining the image recognition and the gesture recognition, so that the recognition of the operable area of the target interface is more reasonable and accurate.
Referring to fig. 11, a touch area classification method provided in an embodiment of the present application is applied to an electronic device having a touch screen, and includes: s1110 to S1170.
S1110: and under the condition that the electronic equipment displays the target application program through the touch screen, acquiring a target image corresponding to a target interface of the target application program.
Referring to fig. 12, taking a game as an example, fig. 12 shows an overall framework suitable for the method provided in the embodiment of the present application. First, the game is entered, that is, it is detected that a game interface of a game application is displayed on the touch screen. A screenshot of the game interface is then captured to obtain the target image, and a pre-recognition hot zone, namely the first operation area, is obtained by recognizing the game controls in the target image. The pre-recognition hot zone is then verified, and the operable area of the target interface is obtained after verification. Touch optimization is then performed on the operable area, and the pre-recognition hot zone verification continues throughout the game. This is because, as the user keeps operating the target interface, the positions and touch types of the touch operations input by the user keep changing; by continuously verifying the pre-recognition hot zone during the game, the reasonableness of the pre-recognition hot zone, that is, the first operation area, can be continuously checked against newly detected touch operations.
S1120: and determining a first operation area of the target interface based on the identification operation of the target image, wherein the first operation area corresponds to the first operation area position.
S1130: and determining a first touch type corresponding to the first operation area.
S1140: and acquiring gesture information corresponding to the touch operation of the target interface.
S1150: and determining a second operation area corresponding to the touch operation and a second operation area position and a second touch type corresponding to the second operation area based on the gesture information.
It should be noted that, the embodiments for obtaining the first operation area and the second operation area may refer to the foregoing embodiments, and are not described herein again.
After entering the game, whether the user presses a finger is detected. If so, the touch feature information of the current frame, including the finger ID, the touch coordinates and the game frame setting, is extracted, and the time-series signal feature is constructed and input into the LSTM gesture recognition algorithm module. Whether the current finger is lifted is then judged; if so, the LSTM algorithm module outputs the recognition result of the gesture of the current finger ID, that is, the second operation area corresponding to the finger ID is determined through the foregoing step S1150. The manner of identifying the second operation area corresponding to the finger ID may refer to the foregoing embodiments and is not described again here.
Then, overlap and type-consistency checks are performed between the hot zone information of the current gesture and the result of the pre-recognition module. First, it is determined whether the first operation area position and the second operation area position overlap, that is, whether they at least partially overlap. If they at least partially overlap, the touch operation may be acting on a certain first operation area, for example the user is operating a certain control. Whether the touch types determined by the two recognition approaches are correct can then be checked by comparing them against each other.
S1160: and if the first operation area position and the second operation area position at least partially overlap and the first touch type and the second touch type are matched, determining that the first operation area passes the verification.
S1170: and obtaining the position and the type of the operable area corresponding to the target interface based on the fusion operation of the first operation area and the second operation area which pass the verification.
If the first operation area position and the second operation area position at least partially overlap, the first operation area whose position partially overlaps the second operation area position is named the operation area to be determined. This means that the operation area determined through image recognition contains content that can be operated by the user, for example a control, and thus that the operation area determined through image recognition is indeed related to an operation control provided by the target application program. On this basis, if the first touch type and the second touch type match, the touch types determined by the two different recognition approaches, image recognition and gesture recognition, are consistent, which indicates, as verified by the user's actual touch operation, that the first operation area corresponding to the touch operation is accurately recognized. The position and type of the operable area corresponding to the target interface can then be obtained by fusing the verified first operation area and the second operation area.
As an implementation manner, the fusion operation may refer to merging the position of the second operation area corresponding to the current touch operation with the position of the verified first operation area to obtain a new area as the position of the operable area corresponding to the target interface, and taking either of the first touch type and the second touch type as the type of the operable area. For example, a hot zone set corresponding to the target interface may be preset; the hot zone set includes a plurality of operable areas, that is, each operable area obtained by fusing a verified first operation area and second operation area may be stored in the hot zone set, and each operation area in the hot zone set may be used for touch optimization of its respective area.
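As an illustrative sketch of this fusion operation, assuming both verified areas are represented as (x0, y0, x1, y1) rectangles:

```python
def fuse_areas(first_rect, second_rect, first_type, second_type):
    # Merge the two verified rectangles into one bounding rectangle as
    # the operable-area position; either matched touch type may serve
    # as the operable-area type.
    x0 = min(first_rect[0], second_rect[0])
    y0 = min(first_rect[1], second_rect[1])
    x1 = max(first_rect[2], second_rect[2])
    y1 = max(first_rect[3], second_rect[3])
    return (x0, y0, x1, y1), second_type or first_type
```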
It should be noted that, because the first operation area is determined through control recognition on the target image, the first touch type corresponding to the first operation area is often determined based on the control type, using the mapping relationship between control types and touch types mentioned above, which is not described again here. The second operation area, in contrast, is determined based on the position of the user's touch operation on the target interface and the gesture information of that touch operation, and thus reflects the gesture characteristics; as described above, the second touch type may include the click type, the short slide type and the directional slide type.
As an implementation manner, considering that the first touch type corresponding to some first operation areas may include multiple touch modes (for example, the first touch type of the first operation area corresponding to a skill key may include both clicking and sliding), it is determined that the first touch type and the second touch type match if they share at least one touch mode. It should be noted that the preset touch types included in the mapping relationship used to determine the first touch type are at least partially the same as all the second touch types supported by the target interface. For example, if all the second touch types supported by the target interface are the click type, the short slide type and the directional slide type, the preset touch types in the mapping relationship may also be the click type, the short slide type and the directional slide type, and the mapping relationship includes the preset touch types corresponding to different types of operation controls.
Of course, the first touch type and the second touch type may also differ while still having a matching relationship. For example, if the preset touch types in the mapping relationship include only a click type and a slide type, the first touch type includes only the click type and the slide type; that is, the specific sliding manner is not distinguished. The second touch type, however, includes the click type, the short slide type and the directional slide type. It may then be set that both the short slide type and the directional slide type match the slide type, so that whether the first touch type and the second touch type match can be determined through a preset matching relationship.
As an implementation manner, since touch optimization for the target interface is essentially intended to improve the user's touch experience on the target interface, when the first operation area passes verification, the second touch type corresponding to the current touch operation is used as the type of the operable area. For example, the user can slide short on the mini-map. Through verification, it can be determined that the operation position of the user's sliding operation (i.e., the second operation area position) matches the first operation area position corresponding to the mini-map area obtained in advance by image recognition, and that the two types match in the manner described above. If the first operation area position is larger than the second operation area position, that is, the user always slides within the first operation area, it can be determined that the operation area corresponding to the mini-map belongs to the operable areas of the target interface; it can therefore be added to the hot zone set of the target interface with the short slide type as its type, that is, with the second touch type of the touch operation as the type of the operable area.
As an embodiment, there may be a plurality of first operation areas. In the embodiment of S1160, a first operation area at least partially overlapping the second operation area position is searched for among the plurality of first operation area positions as a target first operation area, and if the first touch type corresponding to the target first operation area matches the second touch type, the target first operation area is determined to pass verification. If a plurality of target first operation areas are determined, that is, at least two first operation areas at least partially overlap the second operation area position, all first operation areas at least partially overlapping the second operation area are taken as target first operation areas, and the operation areas whose types match that of the second operation area are then searched for among them as the verified first operation areas. Later, these multiple operation areas can be fused into an operable area of the target interface by the fusion manner. In this way, a plurality of first operation areas obtained by image recognition can be integrated into one operable area corresponding to the same touch type, so that they are optimized uniformly.
As an implementation manner, when the first operation areas are obtained through a detection model, each first operation area corresponds to a confidence level in addition to its first operation area position and first touch type, and the confidence level is used to characterize the recognition accuracy of the first operation area. The confidence level may be, for example, a default value, in which case the confidence values of all first operation areas are the same. Of course, the confidence of each first operation area may also differ; for example, it may be the confidence of the candidate frame output by the detection model. In the embodiment of the present application, the initial confidence values of all first operation areas may be set to be the same. Through the confidence level, not only can gesture recognition be used to verify the image recognition, but it can also be verified whether a first operation area obtained by image recognition is a high-frequency operation area of the user.
By way of example, an implementation of S1160 may be: if the first operation area position and the second operation area position at least partially overlap, taking the first operation area at least partially overlapping the second operation area as a target pre-recognition hot zone; judging whether the first touch type and the second touch type of the target pre-recognition hot zone match; if they match, increasing the confidence of the target pre-recognition hot zone, and, if the increased confidence is greater than a first threshold, determining that the target pre-recognition hot zone passes verification; if they do not match, reducing the confidence of the target pre-recognition hot zone and returning to the operation of acquiring gesture information corresponding to a touch operation on the target interface and the subsequent operations.
That is, when the first operation area position and the second operation area position at least partially overlap, the first operation area at least partially overlapping the second operation area position is marked as the target pre-recognition hot zone. It is then determined whether the first touch type of the target pre-recognition hot zone matches the second touch type corresponding to the touch operation; if so, the confidence of the target pre-recognition hot zone is increased, for example by adding a specific value such as 1 (confidence + 1). It is then judged whether the increased confidence is greater than the first threshold; if so, the target pre-recognition hot zone is determined to pass verification. If the types do not match, the confidence of the target pre-recognition hot zone is reduced, for example by subtracting a specific value such as 1. After the confidence of the target pre-recognition hot zone is reduced, the process returns to the operation of acquiring gesture information corresponding to a touch operation on the target interface and the subsequent operations.
The first threshold may be set based on actual requirements, for example based on the required number of verifications of the first operation area. That is, the first threshold may be the sum of the initial confidence value of the first operation area and (N - 1) times the specified value, where N is the desired number of verifications. For example, if N is 3, the initial value is 1 and the specified value is 1, the first threshold is 3; after the conditions that the first operation area position and the second operation area position at least partially overlap and that the first touch type and the second touch type match have been satisfied three times, the confidence becomes 4, thereby meeting the requirement of three verifications.
In addition, the confidence has a minimum value. That is, if the first touch type of the target pre-recognition hot zone does not match the second touch type corresponding to the touch operation, the confidence of the target pre-recognition hot zone is reduced; if the confidence is then smaller than a second threshold, the target pre-recognition hot zone is discarded; and if the confidence is greater than or equal to the second threshold, the process returns to the operation of acquiring gesture information corresponding to a touch operation on the target interface and the subsequent operations. Discarding the target pre-recognition hot zone means that it is no longer used for subsequent verification operations based on the second operation area. For example, the target interface corresponds to a pre-recognition hot zone set in which the first operation areas determined by the recognition operation on the target image are stored; when performing the operation of checking whether the first operation area position and the second operation area position at least partially overlap, the first operation area at least partially overlapping the second operation area position is searched for in this pre-recognition hot zone set. If the target pre-recognition hot zone is discarded, it is removed from the pre-recognition hot zone set.
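The confidence bookkeeping described above might be sketched as follows; the initial value, step and first threshold follow the example values in the description (initial 1, step 1, first threshold 3), while the second threshold of 0 is an assumed value.

```python
class PreRecognizedHotZone:
    """Sketch of per-hot-zone confidence maintenance (illustrative)."""
    def __init__(self, rect, touch_type, initial=1.0, step=1.0):
        self.rect, self.touch_type = rect, touch_type
        self.confidence = initial
        self.step = step

    def update(self, types_match, first_threshold=3.0, second_threshold=0.0):
        # First threshold = initial + (N - 1) * step for N desired
        # verifications; below the second threshold the zone is dropped.
        if types_match:
            self.confidence += self.step
            if self.confidence > first_threshold:
                return "verified"
        else:
            self.confidence -= self.step
            if self.confidence < second_threshold:
                return "discard"    # remove from the pre-recognition set
        return "keep_checking"      # wait for the next touch operation
```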
S1180: if the first operation area position and the second operation area position do not have an overlapping area and the operation frequency of the second operation area is larger than the preset frequency, determining the position and the type of the operable area corresponding to the target interface based on the second operation area.
If the first operation area position and the second operation area position have no overlapping area, that is, the area operated by the user lies outside the first operation areas, it is determined whether the operation frequency of the second operation area is greater than a preset frequency. If so, the second operation area is a high-frequency operation area, and since the position and type of the second operation area have already been determined through the recognition of the gesture information, the position and type of the operable area corresponding to the target interface can be determined based on the second operation area.
As an embodiment, referring to fig. 13, the embodiment of S1180 may include: s1181 to S1183.
S1181: and if the first operation area position and the second operation area position do not have the overlapping area, acquiring a candidate hot area set corresponding to the target interface.
As one embodiment, the candidate hot zone set includes the history operation areas corresponding to a plurality of history touch operations on the target interface detected before the current touch operation, and none of the history operation areas in the candidate hot zone set overlaps the first operation areas. That is, after the target interface of the current target application program is displayed on the touch screen, each touch operation input by the user on the target interface is checked against the first operation areas, and if its operation area is determined to have no overlapping area with the first operation areas, that operation area is stored in the candidate hot zone set.
S1182: and searching a historical operation region matched with the second operation region in the candidate hot region set.
S1183: and if the historical operation area matched with the second operation area is found, adding the second operation area into the candidate hot area set, and obtaining the position and the type of the operable area corresponding to the target interface based on the fusion operation of the target historical operation area and the second operation area.
It can be understood that each history operation area in the candidate hot zone set is an operation area outside the first operation areas obtained by image recognition, and the second operation area corresponding to the current touch operation likewise does not overlap the first operation areas, meaning that it, too, lies outside the areas obtained by image recognition. Therefore, by combining the candidate hot zone set, it can be determined whether the second operation area corresponding to the current touch operation is a high-frequency operation area. An area that the user operates frequently indicates that there is operable content in it, and thus that an operation control may exist there whose corresponding operation area was not detected in the foregoing image recognition operation.
Therefore, if history operation areas matching the second operation area are found and their number is greater than a first specified threshold, the position and type of the operable area corresponding to the target interface are obtained by fusing the target history operation areas and the second operation area, where the specific fusion operation may refer to the fusion operation described above; the principle is similar and is not repeated here. In addition, the second operation area is added to the candidate hot zone set, so that it can serve as a history operation area for subsequent touch operations.
As an embodiment, searching for a history operation area matching the second operation area may be implemented by searching for a third touch type matching the second touch type; if such a third touch type is found and the position of the history operation area corresponding to it at least partially overlaps the second operation area position, the at least partially overlapping history operation area is taken as the history operation area matching the second operation area. It will be appreciated that each history operation area corresponds to a third touch type. If a third touch type matching the second touch type is found in the candidate hot zone set, the user has previously input a touch gesture of the same touch type outside the first operation areas; if, in addition, the position of the corresponding history operation area at least partially overlaps the second operation area position, the user has input touch operations of the same touch type at the same location, and in some embodiments this area may be used to determine an operable area of the target interface.
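A sketch of this matching step, assuming each history operation area carries its rectangle and its third touch type:

```python
from collections import namedtuple

HistoryArea = namedtuple("HistoryArea", "rect touch_type")

def rects_overlap(a, b):
    # (x0, y0, x1, y1) rectangles at least partially overlap.
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def find_matching_history(candidates, second_rect, second_type):
    # A history area matches when its third touch type equals the
    # second touch type and its position at least partially overlaps
    # the second operation area position.
    return [h for h in candidates
            if h.touch_type == second_type
            and rects_overlap(h.rect, second_rect)]
```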
If the third touch type matched with the second touch type is not found, searching a to-be-selected historical operation area in the candidate hot area set, wherein the to-be-selected historical operation area is determined based on a specified number of historical operation areas which are overlapped in the candidate hot area set and have the same touch type, and the specified number is larger than a second specified threshold; acquiring operation control layout information of the target interface; and determining the position and the type of the operable area corresponding to the target interface based on the historical operation area to be selected and the operation control layout information.
In one embodiment, the second specified threshold is smaller than the first specified threshold. If no third touch type matching the second touch type is found in the candidate hot zone set, the user has input a new touch gesture, and no history touch operation of the same type overlapping the second operation area has been performed. At this time, the areas within the candidate hot zone set are themselves checked for overlap and type consistency. It can be understood that, although no history operation area matching the current touch operation is found in the candidate hot zone set, a high-frequency operation area may still exist in it; that is, a plurality of history operation areas that overlap and have the same touch type are searched for in the candidate hot zone set. If a specified number of such overlapping history operation areas of the same touch type can be found and the specified number is greater than the second specified threshold, the to-be-selected history operation area may be determined based on them, for example by fusing the specified number of overlapping history operation areas of the same touch type, where overlapping means at least partial overlap.
It will be appreciated that the distribution of the controls of the target interface typically satisfies certain layout information; for example, the directional wheel sliding area is typically on the left side of the screen, while the skill keys, tool keys and the like are placed on the right side. Therefore, determining the position and type of the operable area corresponding to the target interface based on the to-be-selected history operation area and the operation control layout information allows the to-be-selected history operation area to be verified through the layout information. Specifically, the operation control layout information includes the distribution areas of the various types of operation controls in the target interface. The distribution area corresponding to the operation controls matching the third touch type of the to-be-selected history operation area is determined in the operation control layout information as the to-be-selected distribution area. If the area position of the to-be-selected history operation area lies within the to-be-selected distribution area, the position and type of the operable area corresponding to the target interface are determined based on the to-be-selected history operation area; for example, the to-be-selected history operation area is taken as the operable area corresponding to the target interface, that is, its position is taken as the position of the operable area and its touch type as the type of the operable area.
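The layout check could be sketched as follows, assuming the layout information maps each touch type to the (x0, y0, x1, y1) screen region where controls of that type are expected:

```python
def validate_with_layout(candidate, layout):
    """Sketch: the to-be-selected history area becomes an operable area
    only if it lies inside the distribution area whose controls match
    its touch type (e.g. a directional wheel region on the left,
    skill-key regions on the right)."""
    zone = layout.get(candidate.touch_type)
    if zone is None:
        return None
    r = candidate.rect
    inside = (zone[0] <= r[0] and zone[1] <= r[1]
              and r[2] <= zone[2] and r[3] <= zone[3])
    return (candidate.rect, candidate.touch_type) if inside else None
```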
Referring to fig. 14, a touch area classification method provided in an embodiment of the present application is applied to an electronic device having a touch screen, and includes: s1401 to S1407.
S1401: and under the condition that the electronic equipment displays the target application program through the touch screen, acquiring a target image corresponding to a target interface of the target application program.
S1402: and determining a first operation area of the target interface based on the identification operation of the target image, wherein the first operation area corresponds to the first operation area position.
S1403: and determining a first touch type corresponding to the first operation area.
S1404: and acquiring gesture information corresponding to the touch operation of the target interface.
As one implementation manner, it is determined whether a specified type exists among the first touch types of the first operation areas; and, if the specified type exists, the gesture information corresponding to the touch operation on the target interface is acquired. The specified type may be used to verify the recognition result obtained in S1402: if the specified type is absent, the recognition result is inaccurate and cannot be used. In general, the specified type may be the touch type corresponding to a control common to the target interfaces of different application programs, or the touch type corresponding to a control typical of the target interface; for a game application, for example, the specified type may be the touch type corresponding to the direction keys.
S1405: and determining a second operation area corresponding to the touch operation and a second operation area position and a second touch type corresponding to the second operation area based on the gesture information.
S1406: and determining the position and type of the operable area corresponding to the target interface based on the first operation area position, the first touch type, the second operation area position and the second touch type.
S1407: touch optimization is performed for the operable area.
As an implementation manner, a touch optimization policy for the operable area may be set. In this embodiment, the available touch optimization policies may include adjusting touch response parameters, filtering and debouncing, and improving hand-following responsiveness.
As an embodiment, the touch response parameters may include all parameters that can affect the recognition of touch gestures, such as touch sensitivity and input response time, which are not limited here. The touch sensitivity of a touch screen refers to the degree to which the touch screen can sense and respond to the user's touch input, that is, the likelihood that a touch signal input by the user is recognized as a useful signal, which is related to the touch threshold corresponding to the useful signal: the lower the threshold, the higher the sensitivity, but the more likely false touches become. Common touch screens include resistive and capacitive touch screens. A resistive touch screen uses contact between two layers of conductive film: when the user touches the screen with a finger or another object, a current forms between the two conductive films, and the screen determines the touch position from the change in the current. A capacitive touch screen detects the touch position by sensing changes in the charge of the human body. Therefore, the touch sensitivity of a touch area can be adjusted by adjusting the touch threshold of that area.
The input response time of the touch screen refers to the time within which, after the user touches the screen, the touch input can be perceived and the corresponding action performed. A lower response time provides faster, more immediate touch feedback, thereby improving the user experience. It should be noted that the aforementioned pointing rate (touch report rate) and touch sensitivity can both affect the input response time; treated independently as a tunable parameter, the input response time may be adjusted in the following ways: (1) reducing background running applications, that is, closing unnecessary background application programs and processes to release system resources; (2) updating the driver of the touch screen; (3) reducing animations and special effects, since the animations and special effects of certain operating systems or application programs can slow down the response of the touch screen.
Filtering and debouncing refers to removing interference points generated during the target touch operation; that is, filtering and debouncing a sliding gesture reduces or eliminates noise and instability in the gesture input, making gesture recognition more accurate and smooth. Typically, the filtering and debouncing effect can be achieved by a filtering algorithm such as an exponentially weighted moving average (Exponential Moving Average) or a Kalman filter (Kalman Filter). In this embodiment of the present application, an IIR (Infinite Impulse Response) filter may be used to achieve the filtering and debouncing effect: the filtered current coordinate = previous coordinate + coefficient * (current coordinate - previous coordinate), where the coefficient determines the filtering strength. With this formulation, the closer the coefficient is to 0, the stronger the smoothing, and the closer it is to 1, the weaker the smoothing.
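A minimal sketch of such first-order IIR smoothing of touch coordinates, following the formula above; the default coefficient of 0.5 is an assumed value.

```python
def iir_filter(points, coeff=0.5):
    # filtered = previous + coeff * (current - previous), applied to
    # each coordinate; smaller coefficients smooth more strongly.
    if not points:
        return []
    fx, fy = points[0]
    smoothed = [(fx, fy)]
    for x, y in points[1:]:
        fx += coeff * (x - fx)
        fy += coeff * (y - fy)
        smoothed.append((fx, fy))
    return smoothed
```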
Hand-following responsiveness means that when the user slides on the screen, the screen reacts according to the user's sliding angle; the better the hand-following responsiveness, the smoother the picture feels and the more timely the screen responds. The emphasis is that the controlled character moves exactly as the user's hand moves. As described above, hand-following responsiveness can be improved by adjusting the pointing rate (touch report rate) of the touch screen. It can also be tuned by adjusting the filter applied to sliding gestures: as described above, the filter smooths the sliding operation so that the sliding track input by the user is smoother, but some hand-following responsiveness may be lost, because the corrected track points may deviate from the actual track points. Hand-following responsiveness can therefore also be enhanced by reducing the filtering strength of the filter, while giving appropriate consideration to performance.
Referring to fig. 15, fig. 15 illustrates an embodiment of touch optimization for the operable areas (i.e., hot zones) of the target interface; if the target application is a game application, the hot zones are the game key areas. After the hot zone pre-recognition and the hot zone verification and compensation are completed, an attribute classification operation is first performed on the game key area: the area information is received, and the touch type to which the area information belongs is determined according to preset information; the specific determination of the touch type of each hot zone may refer to the foregoing embodiments and is not repeated here. When it is detected that a finger ID of the user is pressed down and falls within a game key area, the touch type of the game key area corresponding to the finger ID, namely the game area type, is judged, and the finger ID is bound to the game area type at press-down. Different touch optimization strategies are then executed for the game area types of different finger IDs; for example, touch coordinate filtering is strengthened for short-slide areas, and the sensitivity of the touch coordinates is improved for click areas.
When the finger ID is lifted, the operation track information, namely the gesture information, of the finger ID within the game key area is recorded. If the gesture information matches the game area type (for example, the game area type is the click type and the gesture corresponding to the finger ID is also a click operation), the touch optimization strategy can be executed for the game key area corresponding to the finger ID. In addition to judging the hot zone type of the touch operation, a hot zone state maintenance algorithm also needs to be maintained: when it is judged that a hot zone needs to be exited, the game key area information is cleared; otherwise, the hot zone state information is continuously updated. The hot zone state is used to indicate whether the game key area corresponding to the finger ID is an operation hot zone; for example, the user may switch to another interface while in the game, in which case the current touch position no longer belongs to a hot zone.
Referring to fig. 16, a block diagram illustrating a touch area classifying device 1600 according to an embodiment of the present application is shown, where the device may include: a first acquisition unit 1601, a first determination unit 1602, a second determination unit 1603, a second acquisition unit 1604, a third determination unit 1605, and an identification unit 1606.
A first obtaining unit 1601, configured to obtain, when the electronic device displays a target application through the touch screen, a target image corresponding to a target interface of the target application.
A first determining unit 1602, configured to determine a first operation area of the target interface based on an identification operation of the target image, where the first operation area corresponds to a first operation area position.
A second determining unit 1603, configured to determine a first touch type corresponding to the first operation area.
The second obtaining unit 1604 is configured to obtain gesture information corresponding to a touch operation on the target interface.
Further, the second obtaining unit 1604 is further configured to determine whether a specified type exists in the first touch type of the first operation area; and if the specified type exists, acquiring gesture information corresponding to the touch operation of the target interface.
Further, the gesture information includes a touch duration and a plurality of touch position information within the touch duration.
The third determining unit 1605 is configured to determine, based on the gesture information, a second operation area corresponding to the touch operation, a second operation area position corresponding to the second operation area, and a second touch type.
The recognition unit 1606 is configured to determine a position and a type of the operable area corresponding to the target interface based on the first operation area position, the first touch type, the second operation area position, and the second touch type.
Further, the recognition unit 1606 is further configured to determine that the first operation area passes the verification if the first operation area position and the second operation area position at least partially overlap, and the first touch type and the second touch type are matched; and obtaining the position and the type of the operable area corresponding to the target interface based on the fusion operation of the first operation area and the second operation area which pass the verification.
Further, the number of the first operation areas is plural, and the recognition unit 1606 is further configured to search, among the plurality of the first operation area positions, for a first operation area that at least partially overlaps with the second operation area position, as a target first operation area; and if the first touch type and the second touch type corresponding to the target first operation area are matched, determining that the verification of the target first operation area is passed.
Further, the identifying unit 1606 is further configured to, if the first operation area position and the second operation area position at least partially overlap, take the first operation area at least partially overlapping the second operation area as the target pre-recognition hot zone; judge whether the first touch type and the second touch type of the target pre-recognition hot zone match; if they match, increase the confidence of the target pre-recognition hot zone, and, if the increased confidence is greater than a first threshold, determine that the target pre-recognition hot zone passes verification; if they do not match, reduce the confidence of the target pre-recognition hot zone and return to the operation of acquiring gesture information corresponding to a touch operation on the target interface and the subsequent operations.
Further, the identifying unit 1606 is further configured to reduce the confidence of the target pre-recognition hot zone if the touch types do not match; discard the target pre-recognition hot zone if the confidence is smaller than a second threshold; and, if the confidence is greater than or equal to the second threshold, return to the operation of acquiring gesture information corresponding to a touch operation on the target interface and the subsequent operations.
Further, the identifying unit 1606 is further configured to determine, if there is no overlapping area between the first operation area position and the second operation area position and the operation frequency of the second operation area is greater than the preset frequency, the position and the type of the operable area corresponding to the target interface based on the second operation area.
Further, the identifying unit 1606 is further configured to obtain a candidate hot area set corresponding to the target interface if the first operation area position and the second operation area position do not have an overlapping area, where the candidate hot area set includes a history operation area corresponding to a plurality of history touch operations detected before the touch operation, and each history operation area in the candidate hot area set does not have an overlapping area with the first operation area; searching a historical operation area matched with the second operation area in the candidate hot area set; and if the number of the history operation areas matched with the second operation area is larger than a first specified threshold, adding the second operation area into the candidate hot area set, and obtaining the position and the type of the operable area corresponding to the target interface based on the fusion operation of the target history operation area and the second operation area, wherein the target history operation area is the history operation area matched with the second operation area.
Further, the recognition unit 1606 is further configured to find a third touch type that matches the second touch type; and if the third touch type matched with the second touch type is found, and the position of the history operation area corresponding to the matched third touch type is at least partially overlapped with the position of the second operation area, taking the history operation area which is at least partially overlapped as the history operation area matched with the second operation area.
Further, the recognition unit 1606 is further configured to search, if a third touch type that matches the second touch type is not found, a history operation area to be selected in the candidate hot area set, where the history operation area to be selected is determined based on a specified number of history operation areas that overlap the candidate hot area set and have the same touch type, where the specified number is greater than a second specified threshold; acquiring operation control layout information of the target interface; and determining the position and the type of the operable area corresponding to the target interface based on the historical operation area to be selected and the operation control layout information.
Further, the recognition unit 1606 is further configured to determine, in the operation control layout information, a distribution area corresponding to an operation control that matches the third touch type in the history operation area to be selected, as a distribution area to be selected; and if the area position of the to-be-selected historical operation area is positioned in the to-be-selected distribution area, determining that the to-be-selected historical operation area is the position and the type of the operable area corresponding to the target interface.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus and modules described above may refer to the corresponding process in the foregoing method embodiment, which is not repeated herein.
In several embodiments provided herein, the coupling of the modules to each other may be electrical, mechanical, or other.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules.
Referring to fig. 17, a block diagram of an electronic device according to an embodiment of the present application is shown. The electronic device 100 may be a smart phone, a tablet computer, an electronic book, or the like capable of running an application program. The electronic device 100 in this application may include one or more of the following components: a processor 110, a memory 120, and one or more application programs, wherein the one or more application programs may be stored in the memory 120 and configured to be executed by the one or more processors 110, the one or more program(s) configured to perform the method as described in the foregoing method embodiments.
Processor 110 may include one or more processing cores. The processor 110 utilizes various interfaces and lines to connect various portions of the overall electronic device 100, perform various functions of the electronic device 100, and process data by executing or executing instructions, programs, code sets, or instruction sets stored in the memory 120, and invoking data stored in the memory 120. Alternatively, the processor 110 may be implemented in hardware in at least one of digital signal processing (Digital Signal Processing, DSP), field programmable gate array (Field-Programmable Gate Array, FPGA), programmable logic array (Programmable Logic Array, PLA). The processor 110 may integrate one or a combination of several of a central processing unit (Central Processing Unit, CPU), an image processor (Graphics Processing Unit, GPU), and a modem, etc. The CPU mainly processes an operating system, a user interface, an application program and the like; the GPU is used for being responsible for rendering and drawing of display content; the modem is used to handle wireless communications. It will be appreciated that the modem may not be integrated into the processor 110 and may be implemented solely by a single communication chip.
The Memory 120 may include a random access Memory (Random Access Memory, RAM) or a Read-Only Memory (Read-Only Memory). Memory 120 may be used to store instructions, programs, code, sets of codes, or sets of instructions. The memory 120 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described below, etc. The storage data area may also store data created by the electronic device 100 in use (e.g., phonebook, audiovisual data, chat log data), and the like.
Referring to fig. 18, a block diagram of a computer readable medium according to an embodiment of the present application is shown. The computer readable medium 1800 stores program code that can be invoked by a processor to perform the methods described in the foregoing method embodiments.
The computer readable medium 1800 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. Optionally, the computer readable medium 1800 comprises a non-transitory computer-readable storage medium. The computer readable medium 1800 has storage space for program code 1810 that performs any of the method steps described above. The program code can be read from or written into one or more computer program products. The program code 1810 may, for example, be compressed in a suitable form.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will appreciate that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (15)

1. A touch area classification method, applied to an electronic device having a touch screen, the method comprising:
acquiring a target image corresponding to a target interface of a target application program when the electronic device displays the target application program through the touch screen;
determining a first operation area of the target interface based on a recognition operation on the target image, wherein the first operation area corresponds to a first operation area position;
determining a first touch type corresponding to the first operation area;
acquiring gesture information corresponding to a touch operation on the target interface;
determining, based on the gesture information, a second operation area corresponding to the touch operation, and a second operation area position and a second touch type corresponding to the second operation area;
and determining the position and type of the operable area corresponding to the target interface based on the first operation area position, the first touch type, the second operation area position, and the second touch type.
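To make the claimed flow easier to follow, the following is a minimal Python sketch of the data involved; the class name OperationArea, its field names, and the example values are hypothetical illustrations, not part of the claimed implementation:

    from dataclasses import dataclass
    from typing import Tuple

    Rect = Tuple[int, int, int, int]  # (left, top, right, bottom) in screen pixels

    @dataclass
    class OperationArea:
        position: Rect            # operation area position on the target interface
        touch_type: str           # e.g. "tap", "long_press", "swipe"
        confidence: float = 0.5   # adjusted during verification (see claims 4-5)

    # A first operation area is produced by recognizing the target image;
    # a second operation area is derived from observed gesture information.
    first = OperationArea(position=(40, 1800, 200, 1900), touch_type="tap")
    second = OperationArea(position=(50, 1820, 190, 1880), touch_type="tap")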
2. The method of claim 1, wherein determining the position and type of the operable area corresponding to the target interface based on the first operation area position, the first touch type, the second operation area position, and the second touch type comprises:
if the first operation area position and the second operation area position at least partially overlap and the first touch type matches the second touch type, determining that the first operation area passes verification;
and obtaining the position and type of the operable area corresponding to the target interface based on a fusion operation of the verified first operation area and the second operation area.
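Continuing the sketch above, one plausible reading of the overlap check and fusion operation of claim 2 follows; taking the union of the two rectangles is an assumption, since the claim does not fix a particular fusion rule:

    def overlaps(a: Rect, b: Rect) -> bool:
        # True if the two rectangles share at least a partial overlap.
        return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

    def verify_and_fuse(first: OperationArea, second: OperationArea):
        # Claim 2: verify by positional overlap plus touch type match,
        # then fuse the verified first area with the second area.
        if overlaps(first.position, second.position) and first.touch_type == second.touch_type:
            fused = (min(first.position[0], second.position[0]),
                     min(first.position[1], second.position[1]),
                     max(first.position[2], second.position[2]),
                     max(first.position[3], second.position[3]))
            return OperationArea(position=fused, touch_type=first.touch_type)
        return None  # verification failed; claims 4-6 cover the other branches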
3. The method of claim 2, wherein there are a plurality of first operation areas, and determining that the first operation area passes verification if the first operation area position and the second operation area position at least partially overlap and the first touch type matches the second touch type comprises:
searching, among the plurality of first operation area positions, for a first operation area that at least partially overlaps the second operation area position, as a target first operation area;
and if the first touch type corresponding to the target first operation area matches the second touch type, determining that the target first operation area passes verification.
4. The method of claim 2, wherein determining that the first operation area passes verification if the first operation area position and the second operation area position at least partially overlap and the first touch type matches the second touch type comprises:
if the first operation area position and the second operation area position at least partially overlap, taking the first operation area that at least partially overlaps the second operation area as a target pre-recognition hot zone;
determining whether the first touch type and the second touch type of the target pre-recognition hot zone match;
if they match, increasing the confidence of the target pre-recognition hot zone;
if the increased confidence is greater than a first threshold, determining that the target pre-recognition hot zone passes verification;
and if they do not match, reducing the confidence of the target pre-recognition hot zone, and returning to perform the operation of acquiring gesture information corresponding to the touch operation on the target interface and the subsequent operations.
5. The method of claim 4, wherein reducing the confidence of the target pre-recognition hot zone if the touch types do not match, and returning to perform the operation of acquiring gesture information corresponding to the touch operation on the target interface and the subsequent operations, comprises:
if the touch types do not match, reducing the confidence of the target pre-recognition hot zone;
if the reduced confidence is smaller than a second threshold, discarding the target pre-recognition hot zone;
and if the reduced confidence is greater than or equal to the second threshold, returning to perform the operation of acquiring gesture information corresponding to the touch operation on the target interface and the subsequent operations.
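Claims 4 and 5 together describe a confidence loop over the pre-recognition hot zone. A sketch of that loop, continuing the code above, is given below; the step size and the two thresholds are assumed values, as the claims leave them unspecified:

    CONFIDENCE_STEP = 0.1    # assumed adjustment step
    FIRST_THRESHOLD = 0.8    # claim 4: verification passes above this
    SECOND_THRESHOLD = 0.2   # claim 5: hot zone is discarded below this

    def update_hot_zone(zone: OperationArea, second: OperationArea) -> str:
        # Raise or lower the hot zone's confidence depending on whether
        # its first touch type matches the observed second touch type.
        if zone.touch_type == second.touch_type:
            zone.confidence += CONFIDENCE_STEP
            if zone.confidence > FIRST_THRESHOLD:
                return "verified"              # claim 4
        else:
            zone.confidence -= CONFIDENCE_STEP
            if zone.confidence < SECOND_THRESHOLD:
                return "discarded"             # claim 5
        return "collect_more_gestures"         # re-acquire gesture information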
6. The method of claim 1, wherein determining the position and type of the operable area corresponding to the target interface based on the first operation area position, the first touch type, the second operation area position, and the second touch type comprises:
if the first operation area position and the second operation area position have no overlapping area and the operation frequency of the second operation area is greater than a preset frequency, determining the position and type of the operable area corresponding to the target interface based on the second operation area.
7. The method of claim 6, wherein if the first operation area position and the second operation area position have no overlapping area and the operation frequency of the second operation area is greater than the preset frequency, determining the position and type of the operable area corresponding to the target interface based on the second operation area comprises:
if the first operation area position and the second operation area position have no overlapping area, acquiring a candidate hot zone set corresponding to the target interface, wherein the candidate hot zone set comprises historical operation areas corresponding to a plurality of historical touch operations on the target interface detected before the touch operation, and none of the historical operation areas in the candidate hot zone set overlaps the first operation area;
searching the candidate hot zone set for a historical operation area matching the second operation area;
and if the number of historical operation areas matching the second operation area is greater than a first specified threshold, adding the second operation area to the candidate hot zone set, and obtaining the position and type of the operable area corresponding to the target interface based on a fusion operation of the target historical operation area and the second operation area, wherein the target historical operation area is a historical operation area matching the second operation area.
8. The method of claim 7, wherein the historical operation area corresponds to a third touch type, and searching for the historical operation area matching the second operation area comprises:
searching for a third touch type matching the second touch type;
and if a third touch type matching the second touch type is found, and the position of the historical operation area corresponding to the matching third touch type at least partially overlaps the position of the second operation area, taking the at least partially overlapping historical operation area as the historical operation area matching the second operation area.
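Claims 6 to 8 cover the branch where no first operation area overlaps the second one, falling back on historical touch behaviour. A sketch under the same assumptions as above, with the threshold value chosen arbitrarily:

    def match_against_history(second: OperationArea,
                              candidate_set: list,
                              first_specified_threshold: int = 3):
        # Claim 8: match by third touch type first, then by partial overlap.
        matches = [h for h in candidate_set
                   if h.touch_type == second.touch_type
                   and overlaps(h.position, second.position)]
        # Claim 7: enough matching history admits the second area and
        # fuses it with each matching historical operation area.
        if len(matches) > first_specified_threshold:
            candidate_set.append(second)
            return [verify_and_fuse(h, second) for h in matches]
        return []  # claim 9 then applies the layout-based fallback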
9. The method of claim 8, further comprising:
if no third touch type matching the second touch type is found, searching the candidate hot zone set for a to-be-selected historical operation area, wherein the to-be-selected historical operation area is determined based on a specified number of overlapping historical operation areas of the same touch type in the candidate hot zone set, and the specified number is greater than a second specified threshold;
acquiring operation control layout information of the target interface;
and determining the position and type of the operable area corresponding to the target interface based on the to-be-selected historical operation area and the operation control layout information.
10. The method of claim 9, wherein the operation control layout information comprises distribution areas of various types of operation controls in the target interface, and determining the position and type of the operable area corresponding to the target interface based on the to-be-selected historical operation area and the operation control layout information comprises:
determining, in the operation control layout information, a distribution area corresponding to an operation control matching the third touch type of the to-be-selected historical operation area, as a to-be-selected distribution area;
and if the area position of the to-be-selected historical operation area is located within the to-be-selected distribution area, determining the position and type of the operable area corresponding to the target interface based on the to-be-selected historical operation area.
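Claims 9 and 10 add a fallback that cross-checks clustered historical areas against the interface's control layout. In the sketch below, representing the layout information as a dict from touch type to a distribution rectangle is an assumption, as is the cluster threshold:

    def contains(outer: Rect, inner: Rect) -> bool:
        # True if inner lies entirely within outer.
        return (outer[0] <= inner[0] and outer[1] <= inner[1]
                and outer[2] >= inner[2] and outer[3] >= inner[3])

    def layout_fallback(candidate_set: list, layout_info: dict,
                        second_specified_threshold: int = 5):
        results = []
        for zone in candidate_set:
            # Claim 9: a to-be-selected area needs enough overlapping
            # same-type historical operation areas behind it.
            cluster = [h for h in candidate_set
                       if h.touch_type == zone.touch_type
                       and overlaps(h.position, zone.position)]
            if len(cluster) <= second_specified_threshold:
                continue
            # Claim 10: the area must fall inside the distribution area of
            # controls whose type matches its touch type.
            distribution = layout_info.get(zone.touch_type)
            if distribution and contains(distribution, zone.position):
                results.append(zone)
        return results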
11. The method of any one of claims 1-10, wherein acquiring gesture information corresponding to the touch operation on the target interface comprises:
determining whether a specified type exists among the first touch types of the first operation areas;
and if the specified type exists, acquiring gesture information corresponding to the touch operation on the target interface.
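Claim 11 gates gesture collection on whether any recognized area carries a specified touch type; which types count as "specified" is left open by the claim, so the set below is purely illustrative:

    SPECIFIED_TYPES = {"swipe", "long_press"}  # assumed; the claim does not fix these

    def should_collect_gestures(first_areas: list) -> bool:
        # Only acquire gesture information when at least one first
        # operation area has a specified touch type.
        return any(a.touch_type in SPECIFIED_TYPES for a in first_areas)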
12. The method of any one of claims 1-10, wherein the gesture information comprises a touch duration and a plurality of pieces of touch position information within the touch duration.
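Claim 12 pins down the shape of the gesture information. A sketch of that structure (reusing the dataclass import from the first sketch), together with one plausible, assumed way of deriving the second touch type from it:

    @dataclass
    class GestureInfo:
        touch_duration_ms: int    # claim 12: the touch duration
        touch_positions: list     # claim 12: [(x, y), ...] sampled within it

    def touch_type_from(gesture: GestureInfo) -> str:
        # Assumed heuristic: large movement means a swipe; otherwise the
        # duration separates a long press from a tap.
        (x0, y0) = gesture.touch_positions[0]
        (x1, y1) = gesture.touch_positions[-1]
        if abs(x1 - x0) + abs(y1 - y0) > 50:        # pixels, assumed
            return "swipe"
        return "long_press" if gesture.touch_duration_ms > 500 else "tap"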
13. A touch area classification apparatus, applied to an electronic device having a touch screen, the apparatus comprising:
a first acquiring unit, configured to acquire a target image corresponding to a target interface of a target application program when the electronic device displays the target application program through the touch screen;
a first determining unit, configured to determine a first operation area of the target interface based on a recognition operation on the target image, the first operation area corresponding to a first operation area position;
a second determining unit, configured to determine a first touch type corresponding to the first operation area;
a second acquiring unit, configured to acquire gesture information corresponding to a touch operation on the target interface;
a third determining unit, configured to determine, based on the gesture information, a second operation area corresponding to the touch operation, and a second operation area position and a second touch type corresponding to the second operation area;
and an identification unit, configured to determine the position and type of the operable area corresponding to the target interface based on the first operation area position, the first touch type, the second operation area position, and the second touch type.
14. An electronic device, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications being configured to perform the method of any one of claims 1-12.
15. A computer readable medium, wherein the computer readable medium stores program code executable by a processor, and the program code, when executed by the processor, causes the processor to perform the method of any one of claims 1-12.
CN202311433331.XA 2023-10-31 2023-10-31 Touch area classification method and device, electronic equipment and computer readable medium Pending CN117406905A (en)

Priority Applications (1)

Application Number: CN202311433331.XA
Priority Date: 2023-10-31
Filing Date: 2023-10-31
Title: Touch area classification method and device, electronic equipment and computer readable medium

Applications Claiming Priority (1)

Application Number: CN202311433331.XA
Priority Date: 2023-10-31
Filing Date: 2023-10-31
Title: Touch area classification method and device, electronic equipment and computer readable medium

Publications (1)

Publication Number: CN117406905A
Publication Date: 2024-01-16

Family ID: 89492281

Family Applications (1)

Application Number: CN202311433331.XA
Title: Touch area classification method and device, electronic equipment and computer readable medium
Priority Date: 2023-10-31
Filing Date: 2023-10-31

Country Status (1)

Country: CN
Publication: CN117406905A (en)

Similar Documents

Publication Title
US10126826B2 (en) System and method for interaction with digital devices
JP6765545B2 (en) Dynamic gesture recognition method and device, gesture dialogue control method and device
US9046929B2 (en) System and method for inputting user commands to a processor
KR101632963B1 (en) System and method for object recognition and tracking in a video stream
CN108920202B (en) Application preloading management method and device, storage medium and intelligent terminal
CN105556438A (en) Systems and methods for providing response to user input using information about state changes predicting future user input
CN107871001B (en) Audio playing method and device, storage medium and electronic equipment
KR20130112061A (en) Natural gesture based user interface methods and systems
JP6334767B1 (en) Information processing apparatus, program, and information processing method
EP4307096A1 (en) Key function execution method, apparatus and device, and storage medium
CN112540696A (en) Screen touch control management method, intelligent terminal, device and readable storage medium
CN111831204B (en) Device control method, device, storage medium and electronic device
CN111367457A (en) Content sharing method and device and electronic equipment
CN107797748B (en) Virtual keyboard input method and device and robot
CN108845756B (en) Touch operation method and device, storage medium and electronic equipment
CN106601217A (en) Interactive-type musical instrument performing method and device
CN107544740B (en) Application processing method and device, storage medium and electronic equipment
CN111265881B (en) Model training method, content generation method and related device
CN115421591B (en) Gesture control device and image pickup apparatus
CN117406905A (en) Touch area classification method and device, electronic equipment and computer readable medium
CN110750193B (en) Scene topology determination method and device based on artificial intelligence
CN103905629B (en) Display processing method and display processing device
CN110568989A (en) service processing method, service processing device, terminal and medium
CN113360071B (en) Touch screen control method and device, storage medium and electronic device
WO2022252872A1 (en) Device control method and apparatus, electronic device, and storage medium

Legal Events

Code: PB01 · Publication
Code: SE01 · Entry into force of request for substantive examination