CN114270298A - Touch event processing method and device, mobile terminal and storage medium - Google Patents


Info

Publication number
CN114270298A
Authority
CN
China
Prior art keywords
touch
event
event information
navigation gesture
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980099360.2A
Other languages
Chinese (zh)
Inventor
戴聪 (Dai Cong)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Shenzhen Huantai Technology Co Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Shenzhen Huantai Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd and Shenzhen Huantai Technology Co Ltd
Publication of CN114270298A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]

Abstract

The application discloses a touch event processing method and apparatus, a mobile terminal, and a storage medium. The processing method is applied to an operating system of the mobile terminal, the mobile terminal including a touch screen, and comprises the following steps: when the navigation gesture function of the mobile terminal is turned on, the operating system monitors for touch events on the touch screen; when a touch event on the touch screen is detected, multiple kinds of event information corresponding to the touch event are acquired, the multiple kinds of event information including coordinate event information; touch coordinates of the touch event are determined according to the coordinate event information; and when the touch coordinates fall within a preset range, the event information corresponding to the navigation gesture function, among the multiple kinds of event information, is reported to an application layer, the preset range comprising the touch coordinate range of the navigation gesture corresponding to the navigation gesture function. The method can effectively avoid the problem of navigation gesture failure.

Description

Touch event processing method and device, mobile terminal and storage medium Technical Field
The present application relates to the field of mobile terminal technologies, and in particular, to a method and an apparatus for processing a touch event, a mobile terminal, and a storage medium.
Background
Mobile terminals, such as tablet computers and mobile phones, have become some of the most common consumer electronic products in daily life. As the screens of mobile terminals have grown ever larger, physical keys on mobile terminals have gradually been eliminated. Since keys for returning to the home page, going back, viewing recent tasks, and so on are no longer provided on the mobile terminal, the navigation gesture function emerged.
Disclosure of Invention
In view of the foregoing problems, the present application provides a method and an apparatus for processing a touch event, a mobile terminal, and a storage medium.
In a first aspect, an embodiment of the present application provides a method for processing a touch event, which is applied to an operating system of a mobile terminal, where the mobile terminal includes a touch screen, and the method includes: when the navigation gesture function of the mobile terminal is started, the operating system monitors a touch event of the touch screen; when a touch event of the touch screen is monitored, acquiring multiple event information corresponding to the touch event, wherein the multiple event information comprises coordinate event information; determining touch coordinates of the touch event according to the coordinate event information; and when the touch coordinate is located in a preset range, reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer, wherein the preset range comprises the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used for indicating the application layer to perform processing operation corresponding to the navigation gesture function.
In a second aspect, an embodiment of the present application provides an apparatus for processing a touch event, which is applied to an operating system of a mobile terminal, where the mobile terminal includes a touch screen, and the apparatus includes: the mobile terminal comprises a touch monitoring module, an event acquisition module, a coordinate acquisition module and an event reporting module, wherein the touch monitoring module is used for monitoring a touch event of the touch screen by the operating system when a navigation gesture function of the mobile terminal is started; the event acquisition module is used for acquiring various event information corresponding to a touch event when the touch event of the touch screen is monitored, wherein the various event information comprises coordinate event information; the coordinate acquisition module is used for determining the touch coordinate of the touch event according to the coordinate event information; the event reporting module is configured to report, to an application layer, event information corresponding to the navigation gesture function in the multiple types of event information when the touch coordinate is located in a preset range, where the preset range includes a touch coordinate range of a navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform a processing operation corresponding to the navigation gesture function.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; a memory; one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the method of processing touch events provided in the first aspect above.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a program code is stored in the computer-readable storage medium, and the program code may be called by a processor to execute the processing method for a touch event provided in the first aspect.
According to the above scheme, when the navigation gesture function of the mobile terminal is turned on, the operating system monitors for touch events on the touch screen. When a touch event on the touch screen is detected, multiple kinds of event information corresponding to the touch event are acquired, including coordinate event information. The touch coordinates of the touch event are then determined according to the coordinate event information. When the touch coordinates fall within a preset range, the event information corresponding to the navigation gesture function, among the multiple kinds of event information, is reported to the application layer, where the preset range comprises the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function. This avoids navigation gesture failure caused by reporting redundant event information to the application layer when, with the navigation gesture function turned on, a touch event occurs within the touch coordinate range of the navigation gesture.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 shows an interface schematic diagram provided in an embodiment of the present application.
FIG. 2 illustrates a flow diagram of a method for processing touch events according to one embodiment of the present application.
FIG. 3 shows a flow diagram of a method for processing touch events according to another embodiment of the present application.
FIG. 4 illustrates a flow diagram of a method for processing touch events according to yet another embodiment of the present application.
FIG. 5 illustrates a flow diagram of a method for processing touch events according to yet another embodiment of the present application.
FIG. 6 is a flow chart illustrating a method for processing a touch event according to yet another embodiment of the present application.
Fig. 7 shows a schematic view of an interface provided in yet another embodiment of the present application.
Fig. 8 shows another interface schematic provided in yet another embodiment of the present application.
FIG. 9 shows a block diagram of a processing device for touch events according to one embodiment of the present application.
Fig. 10 is a block diagram illustrating an event reporting module in a device for processing a touch event according to an embodiment of the application.
Fig. 11 is a block diagram of a mobile terminal for performing a method of processing a touch event according to an embodiment of the present application.
Fig. 12 is a storage unit for storing or carrying program codes for implementing a processing method of a touch event according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
In electronic devices such as mobile phones and tablet computers, the display screen generally serves to display content such as text, pictures, icons, or video. With the development of touch technologies, more and more of the display screens provided on electronic devices are touch display screens. With a touch display screen provided, when a user is detected performing touch operations such as dragging, clicking, double-clicking, or sliding on the screen, the device can respond to those touch operations.
As the user demands higher definition and higher fineness of the displayed content, more electronic devices employ a touch display screen with a larger size. However, in the process of setting a touch display screen with a large size, it is found that functional devices such as a front camera, a proximity optical sensor, and a receiver, which are arranged at the front end of the electronic device, affect an area that the touch display screen can extend to.
Generally, an electronic device includes a front panel, a rear cover, and a bezel. The front panel includes a forehead area, a middle screen area and a lower key area. Generally, the forehead area is provided with a sound outlet of a receiver and functional devices such as a front camera, the middle screen area is provided with a touch display screen, and the lower key area is provided with one to three physical keys.
As the screen of the mobile terminal is continuously enlarged, the lower key area on the mobile terminal is gradually eliminated. Since keys for returning to the home page, going back, viewing recent tasks, and so on are not provided on the mobile terminal, the navigation gesture function emerged. The navigation gesture function uses a navigation gesture on a system interface or application interface to complete functions otherwise performed with physical or virtual keys, such as returning to the desktop, returning to the previous level, and checking recent tasks. For example, as shown in fig. 1, the user may return to the desktop with a slide-up gesture over the bottom area of the touch screen; of course, the specific navigation gesture is not limited here.
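As an illustrative sketch only, the slide-up "return to desktop" gesture of fig. 1 could be classified from a touch track roughly as follows. The edge-region height and minimum slide distance are assumed values, not taken from the patent:

```python
# Illustrative sketch: classifying a slide-up gesture that starts in the
# bottom area of the touch screen. Thresholds are assumptions for the example.

def is_swipe_up_from_bottom(track, screen_height, edge_height=100, min_dist=300):
    """track: chronological list of (x, y) touch coordinates for one touch event."""
    (_, y_start), (_, y_end) = track[0], track[-1]
    starts_at_bottom = y_start >= screen_height - edge_height  # begins in bottom area
    slides_up_enough = (y_start - y_end) >= min_dist           # moves toward screen center
    return starts_at_bottom and slides_up_enough

# A slide from the bottom edge toward the screen center:
print(is_swipe_up_from_bottom([(540, 2390), (540, 2100), (540, 1800)], 2400))  # True
```

Real implementations would also consider velocity and multi-finger cases; this only shows the coordinate-based shape of the decision.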
The navigation gesture function is realized by the operating system of the mobile terminal reporting event information of a touch event to an upper layer (the application layer); the upper layer determines, according to the event information of the touch event, whether the touch event is a navigation gesture, and thereby whether the navigation gesture function is triggered.
The inventor found that, with the navigation gesture function of the mobile terminal turned on, when the system of the mobile terminal reports event information of a touch event to the application layer, redundant event information may be reported, causing the navigation gesture to fail.
In view of the above problems, the inventor provides the touch event processing method and apparatus, mobile terminal, and storage medium of the embodiments of the present application, which avoid navigation gesture failure caused by reporting redundant event information to the application layer when a touch event occurs within the touch coordinate range of the navigation gesture while the navigation gesture function is turned on. The specific touch event processing method is described in detail in the following embodiments.
Referring to fig. 2, fig. 2 is a flowchart illustrating a method for processing a touch event according to an embodiment of the present application. The method is used to avoid navigation gesture failure caused by reporting redundant event information to the application layer when a touch event occurs within the touch coordinate range of the navigation gesture while the navigation gesture function is turned on. In a specific embodiment, the method is applied to the touch event processing apparatus 400 shown in fig. 9 and to the mobile terminal 100 (fig. 11) configured with the touch event processing apparatus 400. The following describes the specific flow of this embodiment by taking a mobile terminal as an example; it is understood that the mobile terminal of this embodiment may be a smartphone, a tablet computer, a smart watch, or the like, which is not limited here. Specifically, the method is applied to the operating system of the mobile terminal, where the mobile terminal includes a touch screen. As detailed with respect to the flow shown in fig. 2, the method for processing the touch event may specifically include the following steps:
step S110: and when the navigation gesture function of the mobile terminal is started, the operating system monitors a touch event of the touch screen.
In the embodiment of the application, the mobile terminal can monitor the navigation gesture function, and when the navigation gesture function is monitored to be started, a corresponding processing mode is adopted for a touch event of the touch screen. The navigation gesture function is used for realizing functions finished by using the entity keys or the virtual keys, such as returning to a desktop, returning to the upper level, checking a recent task and the like according to the detected navigation gesture in the system interface or the application interface. The navigation gesture is a preset touch gesture for triggering to return to the desktop, return to the upper level and check the latest task, and the navigation gesture may be a slide-up gesture, a slide-left gesture, a slide-right gesture, and the like, which is not limited herein.
In some embodiments, a switch for controlling the on and off of the navigation gesture function may be provided in the mobile terminal, for example, a switch in a setting interface of the navigation function, and the on and off of the navigation gesture function may be implemented by controlling the switch. Correspondingly, the mobile terminal can detect the state of the switch and determine whether the navigation gesture function is on according to the state of the switch, specifically, when the switch is in the on state, the navigation gesture function is in the on state, and when the switch is in the off state, the navigation gesture function is in the off state.
In other embodiments, the mobile terminal may include a navigation gesture module, where the navigation gesture module is configured to implement a navigation gesture function, and the navigation gesture module may be a software module, and is configured to determine whether a touch event is a navigation gesture according to event information of the touch event reported by an operating system, and perform a corresponding navigation gesture function according to the navigation gesture. Correspondingly, the operating system may monitor whether the navigation gesture module is in an open state, so as to determine whether the navigation gesture function is open, specifically, when the navigation gesture module is in the open state, the navigation gesture function is in the open state, and when the navigation gesture module is in the closed state, the navigation gesture function is in the closed state.
For example, taking a mobile terminal whose operating system is the Android system, the principle of the navigation gesture function is introduced. The Android system framework comprises, from bottom to top, a kernel layer, a core class library layer, a framework layer, and an application layer. The kernel layer provides core system services, including security, memory management, process management, the network protocol stack, hardware drivers, and the like. The hardware drivers in the kernel layer are referred to as the driver layer, which includes the touch display screen driver, the camera driver, and so on. The core class library layer comprises the Android runtime environment (Android Runtime) and class libraries (Libraries). The Android runtime environment provides most of the functionality available in the core class libraries of the Java programming language, including the Core Libraries and the Dalvik virtual machine (Dalvik VM). Each Android application is an instance in a Dalvik virtual machine, running in its own process. The class libraries are used by the various components of the Android system and include the media library (Media Framework), the interface manager (Surface Manager), SQLite (a relational database engine), FreeType (bitmap and vector font rendering), and so on; each of these is exposed to developers for use through the framework layer of the Android system. The framework layer provides the series of class libraries required for Android application development, enabling developers to develop applications rapidly, reuse components conveniently, and achieve personalized extension through inheritance. The services it provides include component management services, window management services, system data source components, a widget framework, resource management services, installation package management services, and the like.
The application layer comprises various application programs which are directly interacted with a user, or service programs which are written by Java language and run in a background, wherein the service programs comprise desktop applications, contact person applications, conversation applications, camera applications, picture browsers, games, maps, web browsers and other application programs developed by developers.
Further, the navigation gesture module may be located in the application layer, after the bottom layer of the android system (the touch screen system in the kernel layer) monitors the touch event, the event information of the touch event is reported to the application layer, and the navigation gesture module of the application layer may determine whether the touch event is a navigation gesture for triggering the navigation gesture function according to the event information of the touch event, and perform display control corresponding to the navigation gesture function when the touch event is the navigation gesture, such as returning to a desktop, returning to a previous interface, checking a recent task, and the like.
In some embodiments, when the operating system monitors that the navigation gesture function is turned on, a touch event to the touch screen may be monitored. In particular, touch events to the touch screen may be monitored by the underlying layers of the operating system (i.e., the touch screen system in the kernel layer). For example, when a press-down event is monitored, it is determined that a touch event to the touch screen is monitored.
Step S120: when a touch event of the touch screen is monitored, acquiring a plurality of event information corresponding to the touch event, wherein the plurality of event information comprises coordinate event information.
In the embodiment of the present application, input events (e.g., events of peripheral devices such as a touch screen) in the mobile terminal may be uniformly mounted in an input system (input system) of the kernel layer, and the input system may include a touch screen system and the like. The input system is divided into different event types (types), event codes (codes), and event attributes (values). The processing principle of the operating system for the input event is as follows: the application layer is located at the upper layer of the system and is mainly used for monitoring, receiving and processing event information of input events reported by the system bottom layer.
For example, in the case of a touch screen system, the monitored event information of a touch screen event may include a press event, a lift event, coordinate event information, pressure event information for the press of the hand, approximate event information for the diameter of the finger's contact with the touch screen, and approximate event information for the diameter of the finger itself; the coordinate event information may be divided into abscissa event information and ordinate event information. When the touch screen system reports the event information of a touch event, the event information that must be reported may include the press event, the lift event, and the coordinate event information, while the optional event information may include the pressure event information for the press of the hand, the approximate event information for the diameter of the finger's contact with the touch screen, and the approximate event information for the diameter of the finger.
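The mandatory/optional split described above maps naturally onto the Linux multi-touch input protocol that Android's kernel-layer touch screen system uses. The following sketch uses the constant values from `linux/input-event-codes.h` as stand-ins for the patent's event-information kinds; the `must_report` helper name is an assumption for illustration:

```python
# Event codes from linux/input-event-codes.h, matched to the patent's kinds
# of event information (the grouping into MANDATORY/OPTIONAL follows the text
# above, not the kernel itself).

BTN_TOUCH = 0x14a            # press/lift (value 1 = press, 0 = lift)
ABS_MT_POSITION_X = 0x35     # abscissa event information
ABS_MT_POSITION_Y = 0x36     # ordinate event information
ABS_MT_PRESSURE = 0x3a       # pressure of the hand press (optional)
ABS_MT_TOUCH_MAJOR = 0x30    # approx. diameter of the finger's contact (optional)
ABS_MT_WIDTH_MAJOR = 0x32    # approx. diameter of the finger itself (optional)

MANDATORY = {BTN_TOUCH, ABS_MT_POSITION_X, ABS_MT_POSITION_Y}
OPTIONAL = {ABS_MT_PRESSURE, ABS_MT_TOUCH_MAJOR, ABS_MT_WIDTH_MAJOR}

def must_report(code: int) -> bool:
    """True for event information the touch screen system must always report."""
    return code in MANDATORY

print(must_report(ABS_MT_POSITION_X))  # True
print(must_report(ABS_MT_PRESSURE))    # False
```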
In some embodiments, when the operating system detects a touch event on the touch screen, it may acquire multiple kinds of event information corresponding to the touch event, which may include a press event, a lift event, coordinate event information, pressure event information for the press of the hand, approximate event information for the diameter of the finger's contact with the touch screen, and approximate event information for the diameter of the finger. Of course, the specific event information corresponding to a touch event is not limited here.
Step S130: and determining the touch coordinate of the touch event according to the coordinate event information.
In the embodiment of the application, after the operating system acquires the information of the multiple events of the touch event, that is, the underlying touch screen system acquires the information of the multiple events, the touch coordinate of the touch event can be determined according to coordinate event information in the information of the multiple events.
In some embodiments, the coordinate information acquired by the touch screen system may include an abscissa and an ordinate of one or more touch points corresponding to the touch event. It can be understood that, when the touch event is an event corresponding to a click operation, the coordinate information may include an abscissa and an ordinate corresponding to a click position, and when the touch event is an event corresponding to a slide operation, the coordinate information may include abscissas and ordinates corresponding to a plurality of touch points on the slide track.
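To make the click/slide distinction concrete, a click contributes a single coordinate pair while a slide contributes the pairs of several touch points along the track. This is a hedged sketch with assumed names and values, not the patent's actual data format:

```python
# Illustrative sketch: extracting (abscissa, ordinate) pairs from the
# coordinate event information of one touch event.

def touch_coordinates(coordinate_events):
    """coordinate_events: chronological list of {"x": ..., "y": ...} records
    reported by the touch screen system for a single touch event."""
    return [(e["x"], e["y"]) for e in coordinate_events]

# A click yields one pair; a slide yields one pair per sampled touch point.
click = touch_coordinates([{"x": 300, "y": 500}])
slide = touch_coordinates([{"x": 540, "y": 2390}, {"x": 540, "y": 2100},
                           {"x": 540, "y": 1800}])
print(click)  # [(300, 500)]
```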
Step S140: and when the touch coordinate is located in a preset range, reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer, wherein the preset range comprises the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used for indicating the application layer to perform processing operation corresponding to the navigation gesture function.
In the embodiment of the application, after the touch screen system at the bottom layer of the operating system acquires the touch coordinates of the touch event, it can judge whether the touch coordinates are within a preset range, so as to determine whether the touch position of the touch event is within the touch range of the navigation gesture. If the touch coordinates are within the preset range, the touch position of the touch event is within the touch range of the navigation gesture, and the touch event may be an input event of the navigation gesture. Therefore, when the operating system (i.e., the underlying touch screen system) reports the event information of the touch event to the application layer, it may report only the event information corresponding to the navigation gesture function. It can be understood that when the navigation gesture function is turned on and the user triggers it, the touch screen system only needs to report the necessary input event information, i.e., the press event, the lift event, and the coordinate information; it should not report the redundant pressure event information, the event information for the diameter of the finger's contact with the touch screen, or the event information for the diameter of the finger, since reporting such redundant event information may cause the navigation gesture algorithm to misjudge and the navigation gesture to fail. The preset range may be the screen coordinate range corresponding to the navigation gesture, which may also be understood as the screen coordinate range for triggering the navigation gesture; the specific preset range is not limited here.
After the event information corresponding to the navigation gesture function is reported to the application layer, the navigation gesture module in the application layer may determine, according to the reported event information, whether the current touch event is an input event of the navigation gesture. If it is, control corresponding to the navigation gesture function may be performed, such as returning to the desktop, returning to the previous-level interface, or checking recent tasks; if it is not, no control corresponding to the navigation gesture function is performed.
In some embodiments, if the touch coordinate is not within the preset range, it indicates that the touch position of the touch event is not within the touch range of the navigation gesture, and the touch event cannot be an input event of the navigation gesture, so that when the operating system (i.e., the underlying touch screen system) reports the event information of the touch event to the application layer, all the event information of the touch event can be reported to the application layer, and it is ensured that the touch event input by the user can be recognized by the application layer, thereby achieving the required input purpose.
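The two reporting branches of step S140 can be sketched as a simple filter. This is a minimal illustration with assumed names and an assumed bottom-edge preset range, not the patent's implementation:

```python
# Minimal sketch of the step S140 reporting decision: inside the preset range,
# only press/lift and coordinate information reaches the application layer;
# outside it, all event information is reported.

NAV_KEYS = {"press", "lift", "x", "y"}   # event info the gesture algorithm needs
NAV_REGION_HEIGHT = 100                  # assumed preset range: bottom 100 px

def events_to_report(event_info: dict, y: int, screen_height: int) -> dict:
    in_preset_range = y >= screen_height - NAV_REGION_HEIGHT
    if in_preset_range:
        # Drop redundant pressure/diameter info that could cause the
        # application layer's gesture algorithm to misjudge.
        return {k: v for k, v in event_info.items() if k in NAV_KEYS}
    return dict(event_info)  # outside the range: report everything

info = {"press": 1, "x": 540, "y": 2330, "pressure": 42, "touch_major": 9}
print(events_to_report(info, y=2330, screen_height=2400))
# → {'press': 1, 'x': 540, 'y': 2330}
```

The design point is that the filtering happens in the kernel-layer touch screen system, so the application layer's navigation gesture module never sees the optional event information for touches in the gesture region.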
In the method for processing a touch event provided by this embodiment of the application, when the navigation gesture function of the mobile terminal is turned on, the operating system monitors for touch events on the touch screen. When a touch event on the touch screen is detected, multiple kinds of event information corresponding to the touch event are acquired, including coordinate event information; the touch coordinates of the touch event are then determined according to the coordinate event information. When the touch coordinates fall within a preset range, the event information corresponding to the navigation gesture function, among the multiple kinds of event information, is reported to the application layer, where the preset range comprises the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function. In this way, navigation gesture failure caused by reporting redundant event information to the application layer when a touch event occurs within the touch coordinate range of the navigation gesture with the navigation gesture function turned on can be avoided.
Referring to fig. 3, fig. 3 is a flowchart illustrating a method for processing a touch event according to another embodiment of the present application. The method is applied to the operating system of the mobile terminal, the mobile terminal includes a touch screen, and the following process shown in fig. 3 is described in detail, where the method for processing the touch event may specifically include the following steps:
step S210: and when the navigation gesture function of the mobile terminal is started, the operating system monitors a touch event of the touch screen.
Step S220: when a touch event of the touch screen is monitored, acquiring a plurality of event information corresponding to the touch event, wherein the plurality of event information comprises coordinate event information.
Step S230: and determining the touch coordinate of the touch event according to the coordinate event information.
In the embodiment of the present application, steps S210 to S230 may refer to the contents of the foregoing embodiments, and are not described herein again.
Step S240: and determining the starting point coordinates of the touch starting point of the touch event according to the touch coordinates.
In the embodiment of the application, after the operating system acquires the touch coordinates of the touch event, the operating system may determine the starting point coordinates of the touch starting point of the touch event according to the touch coordinates. The touch coordinates of the touch event may include the touch coordinates of the touch points throughout the entire touch process of the touch event. The touch screen system at the bottom layer of the operating system can therefore acquire the starting point coordinates of the touch starting point of the touch event from the touch coordinates corresponding to the touch event. It can be understood that the navigation gesture in the navigation gesture function is a sliding gesture sliding from an edge region of the touch screen toward the center of the screen, so the touch starting point of the navigation gesture is usually located in the edge region; the coordinates of the touch starting point can therefore be determined, and whether the touch event is likely to be an input event of the navigation gesture can be judged according to the starting point coordinates.
Step S250: when the starting point coordinate is located in the preset range, reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer, wherein the preset range comprises a touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used for indicating the application layer to perform processing operation corresponding to the navigation gesture function.
In the embodiment of the application, after the operating system acquires the starting point coordinate of the touch starting point of the touch event, it can judge whether the starting point coordinate is within the preset range. If the starting point coordinate is within the preset range, the touch event is likely to be an input event of the navigation gesture; if the starting point coordinate is not within the preset range, the touch event is not likely to be an input event of the navigation gesture.
In some embodiments, the preset range may be the touch coordinate range corresponding to the navigation gesture of the navigation gesture function. The touch coordinate range refers to the coordinate range corresponding to the touch points of the navigation gesture.
As a specific embodiment, since whether the touch event is likely to be an input event of the navigation gesture is determined according to the coordinates of the starting point of the touch event, and the navigation gesture is generally a sliding gesture sliding from the edge of the touch screen to the middle, the preset range may be a coordinate range corresponding to an edge region of the touch screen, and the edge region is a region having a distance from the edge of the touch screen smaller than a set distance. Thus, whether the touch event is likely to be an input event of the navigation gesture can be determined by determining whether the start point coordinate of the touch event is in the edge area corresponding coordinate range.
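The edge-region membership test described above can be sketched as follows. The screen dimensions and margin are hypothetical values for illustration; `edge_margin` plays the role of the "set distance" from the text.

```python
def start_in_edge_region(start, screen_w, screen_h, edge_margin):
    """True when the touch start point lies within `edge_margin` pixels (the
    "set distance") of any edge of a screen_w x screen_h touch screen."""
    x, y = start
    return (x < edge_margin or y < edge_margin
            or screen_w - x < edge_margin or screen_h - y < edge_margin)
```

With an assumed 1080x2340 screen and a 30-pixel margin, a start point at (5, 1200) or (540, 2330) counts as an edge start, while a start in the middle of the screen does not.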
In some embodiments, in addition to the starting point being located at the edge of the touch screen, the navigation gesture generally requires that the sliding length corresponding to the navigation gesture be greater than a certain length. The distance between the touch starting point and the touch end point can therefore be determined, and whether the touch event is likely to be an input event of the navigation gesture can be further judged according to the distance, thereby more accurately confirming whether the touch event is likely to be an input event of the navigation gesture. Therefore, the method for processing the touch event may further include: when the starting point coordinate is located in the preset range, determining an end point coordinate of the touch event according to the touch coordinates; acquiring the distance between the starting point coordinate and the end point coordinate; and if the distance is larger than a set threshold value, reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer.
It can be understood that, by using the coordinate information, the end point coordinate of the touch end point of the touch event is obtained. If the distance between the start point coordinate and the end point coordinate is greater than the set threshold, the sliding length of the touch event is long enough and the touch event may be an input event of the navigation gesture; if the distance is less than or equal to the set threshold, the touch event is not likely to be an input event of the navigation gesture. The specific value of the threshold is not limited here. Therefore, when the distance is greater than the set threshold, indicating that the touch event is possibly an input event of the navigation gesture, the event information corresponding to the navigation gesture function among the multiple kinds of event information is reported to the application layer, and the failure of the navigation gesture caused by reporting redundant event information to the application layer is avoided.
Further, the processing method of the touch event may further include: if the distance is smaller than or equal to the set threshold, reporting the multiple kinds of event information to the application layer. It can be understood that when the distance is less than or equal to the set threshold, it indicates that the touch event is not likely to be an input event of the navigation gesture, so all the acquired kinds of event information can be reported to the application layer, ensuring that the touch event input by the user can be recognized by the application layer, thereby achieving the intended input purpose.
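The two-part check described above — start point inside the preset range, then slide length above the set threshold — can be sketched as follows. The predicate, the coordinates, and the threshold value are all hypothetical; the text does not limit the specific value of the threshold.

```python
import math

def should_report_gesture_subset(start, end, start_in_range, min_slide_distance):
    """Decide whether to report only the gesture-relevant event information:
    the start point must satisfy `start_in_range` (the preset-range check)
    AND the straight-line slide distance must exceed `min_slide_distance`
    (the "set threshold"). Otherwise all event information is reported."""
    if not start_in_range(start):
        return False  # not a candidate navigation gesture
    distance = math.hypot(end[0] - start[0], end[1] - start[1])
    return distance > min_slide_distance
```

For instance, assuming a bottom-strip preset range on a 2340-pixel-tall screen and a 100-pixel threshold, an 800-pixel upward slide from the bottom passes the check, while a 40-pixel slide or a slide starting mid-screen does not.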
Step S260: and if the touch coordinate is located in other ranges except the preset range, reporting the various event information to the application layer.
In the embodiment of the application, if the touch coordinate is not within the preset range, it indicates that the touch position of the touch event is not within the touch range of the navigation gesture, and the touch event cannot be an input event of the navigation gesture, so that when an operating system (i.e., a bottom-layer touch screen system) reports event information of the touch event to an application layer, all event information of the touch event can be reported to the application layer, and it is ensured that the touch event input by a user can be recognized by the application layer, thereby achieving a required input purpose.
According to the method for processing the touch event, whether the touch event is possibly an input event of the navigation gesture is determined according to whether the coordinate of the touch starting point of the touch event is in the preset range, and when the coordinate of the touch starting point is in the preset range, the event information corresponding to the navigation gesture function in the multiple kinds of event information of the touch event is reported to the application layer, so that the navigation gesture is prevented from being invalid due to the fact that redundant event information is reported to the application layer when the touch event in the touch coordinate range of the navigation gesture occurs under the condition that the function of the navigation gesture is started.
Referring to fig. 4, fig. 4 is a flowchart illustrating a method for processing a touch event according to another embodiment of the present application. The method is applied to the operating system of the mobile terminal, the mobile terminal includes a touch screen, and the following process shown in fig. 4 will be described in detail, where the method for processing the touch event may specifically include the following steps:
step S310: when the navigation gesture function of the mobile terminal is started, the operating system monitors the touch event of the touch screen.
Step S320: when a touch event of the touch screen is monitored, acquiring a plurality of event information corresponding to the touch event, wherein the plurality of event information comprises coordinate event information.
Step S330: and determining the touch coordinate of the touch event according to the coordinate event information.
In the embodiment of the present application, steps S310 to S330 may refer to the contents of the foregoing embodiments, and are not described herein again.
Step S340: when all the coordinates in the touch coordinates are located in a preset range, reporting event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer, wherein the preset range comprises the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used for indicating the application layer to perform processing operation corresponding to the navigation gesture function.
Step S350: and when the touch coordinates which are not in the preset range exist in the touch coordinates, reporting the various event information to the application layer.
In the embodiment of the application, after the operating system acquires the touch coordinates of the touch event, whether the touch coordinates of all touch points are within the preset range can be determined according to the touch coordinates. It can be understood that the preset range is the touch coordinate range corresponding to the navigation gesture: if the touch coordinates of all the touch points are within the preset range, it indicates that the touch event is most likely to be an input event of the navigation gesture, and if any of the touch coordinates is not within the preset range, it indicates that the touch event is not likely to be an input event of the navigation gesture. Therefore, when the touch coordinates of all the touch points are located in the preset range, the event information corresponding to the navigation gesture function in the multiple kinds of event information is reported to the application layer, and the navigation gesture failure caused by reporting redundant event information to the application layer is avoided. And when touch coordinates which are not in the preset range exist in the touch coordinates of all the touch points, all the various event information is reported to the application layer, so that the normal reporting of the input events on the touch screen is ensured.
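The condition of step S340 — every sampled touch point of the event inside the preset range — can be sketched as follows, again using a hypothetical axis-aligned rectangle as the preset range.

```python
def all_points_in_range(touch_coords, preset):
    """True only when every sampled touch point of the touch event lies inside
    the preset (navigation gesture) coordinate range; a single sample outside
    the range means the full set of event information should be reported.
    preset: assumed rectangle (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = preset
    return all(x_min <= x <= x_max and y_min <= y <= y_max
               for x, y in touch_coords)
```
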
According to the method for processing the touch event, whether the touch event is likely to be an input event of the navigation gesture is determined according to whether the touch coordinates of all touch points of the touch event are all in the preset range. When the touch coordinates of all touch points of the touch event are all in the preset range, the event information corresponding to the navigation gesture function in the multiple kinds of event information of the touch event is reported to the application layer, and when there are touch coordinates of the touch event that are not in the preset range, all the multiple kinds of event information of the touch event are reported to the application layer, so that the navigation gesture is prevented from being invalid due to the fact that redundant event information is reported to the application layer when a touch event in the touch coordinate range of the navigation gesture occurs under the condition that the navigation gesture function is started.
Referring to fig. 5, fig. 5 is a flowchart illustrating a method for processing a touch event according to still another embodiment of the present application. The method is applied to the operating system of the mobile terminal, the mobile terminal includes a touch screen, and the following process shown in fig. 5 is described in detail, where the method for processing the touch event may specifically include the following steps:
step S410: and requesting an application layer to acquire event information corresponding to the navigation gesture function, and storing the event information corresponding to the navigation gesture function, wherein the event information corresponding to the navigation gesture function is determined by the application layer according to a navigation gesture algorithm corresponding to the navigation gesture function.
In the embodiment of the application, the event information that the bottom layer of the operating system needs to report to the application layer when the navigation gesture function is started, that is, the event information corresponding to the navigation gesture function, can be acquired by the kernel layer of the operating system from the navigation gesture module in the application layer. Specifically, the kernel layer may initiate a request for the event information corresponding to the navigation gesture function to the navigation gesture module in the application layer; after receiving the request, the navigation gesture module in the application layer may return the event information corresponding to the navigation gesture function to the kernel layer. Having acquired the event information corresponding to the navigation gesture function, the kernel layer may store it in the framework layer, and when reporting the event information of the touch event, the kernel layer may acquire the event types to be reported from the framework layer.
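The request-and-store handshake described above can be sketched as two cooperating objects. The class names, the returned event kinds, and the idea of filtering by a stored set are all illustrative assumptions, not an actual kernel/framework interface.

```python
class NavGestureModule:
    """Stands in for the application-layer navigation gesture module."""
    def required_event_kinds(self):
        # Determined by the navigation gesture algorithm (hypothetical kinds).
        return {"PRESS", "LIFT", "COORD"}

class KernelLayer:
    """Stands in for the bottom layer of the operating system."""
    def __init__(self, nav_module):
        self._nav = nav_module
        self._stored_kinds = set()

    def fetch_and_store(self):
        # Request the event-info list from the application layer and store it
        # (the text places the stored copy in the framework layer).
        self._stored_kinds = set(self._nav.required_event_kinds())

    def report(self, event_info):
        # Consult the stored kinds when deciding what to report.
        return {k: v for k, v in event_info.items() if k in self._stored_kinds}
```
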
Step S420: and when the navigation gesture function of the mobile terminal is started, the operating system monitors a touch event of the touch screen.
Step S430: when a touch event of the touch screen is monitored, acquiring a plurality of event information corresponding to the touch event, wherein the plurality of event information comprises coordinate event information.
Step S440: and determining the touch coordinate of the touch event according to the coordinate event information.
Step S450: and when the touch coordinate is located in a preset range, reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer, wherein the preset range comprises the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used for indicating the application layer to perform processing operation corresponding to the navigation gesture function.
In the embodiment of the present application, steps S430 to S450 may refer to the contents of the foregoing embodiments, and are not described herein again.
Step S460: if the touch event of the touch screen is monitored for multiple times within the preset duration and the touch coordinates of the touch events monitored for multiple times are within the preset range, requesting the application layer to acquire the event information corresponding to the navigation gesture function again, and storing the event information corresponding to the navigation gesture function.
In the embodiment of the application, in the case where the touch event is determined to be an input event of the navigation gesture, the bottom layer of the operating system can monitor subsequent events after reporting the event information corresponding to the navigation gesture function to the application layer. If the touch event of the touch screen is monitored for multiple times within the preset duration, and the touch coordinates of the touch events monitored for multiple times are within the preset range, it indicates that the user may have made the navigation gesture multiple times within the preset duration without the navigation gesture being recognized. Therefore, the bottom layer of the operating system may obtain the event information corresponding to the navigation gesture function from the application layer again to determine whether the event information corresponding to the navigation gesture function needs to be updated, so as to avoid that the navigation gesture cannot be recognized because the event information required by the navigation gesture function has been updated. For example, the event information previously required by the navigation gesture function includes a press event, a lift event and coordinate information, and after an update the navigation gesture algorithm additionally requires an event approximating the diameter of the finger; if the bottom layer of the operating system still reports only the press event, the lift event and the coordinate information, the navigation gesture module in the application layer cannot recognize the navigation gesture and the navigation gesture function cannot be realized. Therefore, the bottom layer of the operating system can acquire the event information corresponding to the navigation gesture function again, so that subsequent navigation gestures can be recognized.
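The refresh trigger described above — several in-range touch events within the preset duration with no recognized gesture — can be sketched as follows. Timestamps, the window length, and the minimum count are hypothetical parameters.

```python
def needs_refresh(event_timestamps, in_range_flags, window, min_count):
    """True when at least `min_count` monitored touch events fell inside the
    preset range within the last `window` seconds (measured back from the most
    recent event) -- the condition for re-requesting the event-info list and,
    subsequently, the touch coordinate range from the application layer.

    event_timestamps: ascending list of event times (seconds);
    in_range_flags: parallel list of booleans (touch coordinates in range?).
    """
    if not event_timestamps:
        return False
    now = event_timestamps[-1]
    recent_in_range = [t for t, ok in zip(event_timestamps, in_range_flags)
                       if ok and now - t <= window]
    return len(recent_in_range) >= min_count
```
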
Step S470: and requesting the application layer to acquire a touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and updating the preset range according to the acquired touch coordinate range.
In the embodiment of the application, if the touch event on the touch screen is monitored for multiple times within the preset duration, and the touch coordinates of the touch events monitored for multiple times are within the preset range, it indicates that the navigation gesture algorithm may have been updated, so that multiple navigation gestures went unrecognized. Therefore, after the bottom layer of the operating system updates the event information corresponding to the navigation gesture function, the touch coordinate range corresponding to the navigation gesture can also be updated, that is, the preset range is updated. The bottom layer of the operating system can acquire the touch coordinate range of the navigation gesture corresponding to the navigation gesture function from the navigation gesture module of the application layer and update the preset range, so as to avoid that a navigation gesture subsequently made by the user cannot be recognized due to a change of the touch coordinate range of the navigation gesture.
According to the touch event processing method provided by the embodiment of the application, the bottom layer of the operating system acquires the event information corresponding to the navigation gesture function from the application layer, stores the event information in the driving layer, and reports the event information according to the event information corresponding to the navigation gesture function stored in the driving layer every time the touch event of the navigation gesture is reported. And when the touch event is monitored for multiple times and the touch coordinate of the touch event is in the preset range, updating the event information corresponding to the navigation gesture function and updating the preset range, so that the influence on user experience caused by the fact that the navigation gesture subsequently input by the user cannot be identified is avoided.
Referring to fig. 6, fig. 6 is a flowchart illustrating a method for processing a touch event according to yet another embodiment of the present application. The method is applied to the operating system of the mobile terminal, the mobile terminal includes a touch screen, and the following process shown in fig. 6 is described in detail, where the method for processing the touch event may specifically include the following steps:
step S501: and displaying a navigation setting interface, wherein the navigation setting interface comprises function options of the navigation gesture function.
In some embodiments, the mobile terminal may provide for the setting of navigation functions. The operating system of the mobile terminal can control the display screen to display a navigation setting interface, and the navigation setting interface comprises function options of a navigation gesture function. For example, as shown in FIG. 7, the function option may be in the form of a switch control A1, and the user can control the navigation gesture function to be turned on and off by operating the switch control A1.
Step S502: and when the opening operation of the function option is detected, the navigation gesture function is started.
In some embodiments, when the operating system detects an opening operation on the function option, it indicates that the user needs to open the function of the navigation gesture, so that the navigation gesture function can be controlled to be opened, for example, the opening operation on the switch control is detected, that is, the navigation gesture function can be controlled to be opened. Specifically, the bottom layer of the operating system may send an instruction to the application layer, where the instruction is used to control the navigation gesture module corresponding to the navigation gesture function to be started.
Step S503: and displaying a selection interface of the navigation gesture, wherein the selection interface comprises a plurality of selection options corresponding to the navigation modes.
In some embodiments, the navigation gesture function may have a plurality of navigation modes, and when the navigation gesture function is set, a selection interface of the navigation gesture may be displayed, where the selection interface includes selection options corresponding to the plurality of navigation modes, and the selection options are used for the user to select a navigation mode. For example, as shown in fig. 8, the navigation modes include two-side return, simple gesture, right-side return and left-side return, and the user can select the selection option corresponding to a navigation mode to select that navigation mode. In the two-side return mode, an upward sliding operation from either of the two side areas at the bottom of the touch screen realizes returning to the upper level, an upward sliding operation from the middle area at the bottom of the touch screen realizes returning to the desktop, and an upward sliding and stopping operation from the middle area at the bottom of the touch screen realizes viewing the latest tasks. In the simple gesture mode, an upward sliding operation from the bottom area of the touch screen realizes returning to the desktop, and an upward sliding and stopping operation from the bottom area of the touch screen realizes viewing the latest tasks. In the right-side return mode, an upward sliding operation from the right-side area of the bottom area of the touch screen realizes returning to the upper level, an upward sliding operation from the middle area of the bottom area realizes returning to the desktop, and an upward sliding operation from the left-side area of the bottom area realizes viewing the latest tasks. In the left-side return mode, an upward sliding operation from the right-side area of the bottom area of the touch screen realizes viewing the latest tasks, an upward sliding operation from the middle area of the bottom area realizes returning to the desktop, and an upward sliding operation from the left-side area of the bottom area realizes returning to the upper level. Of course, the above navigation modes are only examples, and do not represent limitations on the navigation modes of the navigation gesture function in the embodiment of the present application.
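The four navigation modes described above can be sketched as a region-to-action mapping. The mode and region names, and the use of a `long_pause` flag to distinguish "slide up and stop" from a plain slide up, are illustrative assumptions.

```python
def navigate(mode, region, long_pause=False):
    """Map a bottom-area slide-up gesture to an action for each navigation
    mode described in the text. region: "left", "middle", or "right";
    actions: "back" (return to upper level), "home" (return to desktop),
    "recents" (view the latest tasks)."""
    if mode == "two_side_return":
        if region in ("left", "right"):
            return "back"
        return "recents" if long_pause else "home"
    if mode == "simple_gesture":
        # The whole bottom area behaves uniformly in this mode.
        return "recents" if long_pause else "home"
    if mode == "right_side_return":
        return {"right": "back", "middle": "home", "left": "recents"}[region]
    if mode == "left_side_return":
        return {"right": "recents", "middle": "home", "left": "back"}[region]
    raise ValueError(f"unknown navigation mode: {mode}")
```
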
Step S504: and determining a selected navigation mode according to the operation of the selection options, wherein the navigation mode corresponds to different navigation gestures.
Step S505: and determining a navigation gesture corresponding to the navigation mode according to the navigation mode.
Step S506: and acquiring a touch coordinate range of the navigation gesture corresponding to the navigation mode, and taking the touch coordinate range as the preset range.
It can be understood that the navigation gestures corresponding to different navigation manners may be different, and therefore, the navigation gesture corresponding to the navigation manner may be determined according to the selected navigation manner, and the touch coordinate range of the navigation gesture is determined as the preset range according to the navigation gesture.
Step S507: and when the navigation gesture function of the mobile terminal is started, the operating system monitors a touch event of the touch screen.
Step S508: when a touch event of the touch screen is monitored, acquiring a plurality of event information corresponding to the touch event, wherein the plurality of event information comprises coordinate event information.
Step S509: and determining the touch coordinate of the touch event according to the coordinate event information.
Step S510: and when the touch coordinate is located in a preset range, reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer, wherein the preset range comprises the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used for indicating the application layer to perform processing operation corresponding to the navigation gesture function.
In the embodiment of the present application, steps S507 to S510 may refer to the contents of the foregoing embodiments, and are not described herein again.
Step S511: when the navigation gesture function of the mobile terminal is closed, if the operating system monitors a touch event to the touch screen, acquiring multiple event information corresponding to the touch event, and reporting the multiple event information to the application layer.
It can be understood that when the navigation gesture function of the mobile terminal is turned off, all event information of a touch event can be reported to the application layer. Therefore, if the operating system monitors a touch event on the touch screen, all the event information of the touch event can be reported to the application layer.
The method for processing the touch event, provided by the embodiment of the application, provides a method for setting the navigation gesture function, and is convenient for a user to manage the navigation gesture function. And determining whether the touch event is likely to be an input event of the navigation gesture according to whether the touch coordinate of the touch event is in a preset range, and reporting event information corresponding to the navigation gesture function in the multiple event information of the touch event to an application layer when the touch coordinate is in the preset range, so that the navigation gesture is prevented from being invalid due to the fact that redundant event information is reported to the application layer when the touch event in the touch coordinate range of the navigation gesture occurs under the condition that the function of the navigation gesture is started.
Referring to fig. 9, a block diagram of a device 400 for processing a touch event according to an embodiment of the present application is shown. The processing apparatus 400 for the touch event is applied to the operating system of the mobile terminal, and the processing apparatus 400 for the touch event includes: the touch monitoring module 410, the event acquiring module 420, the coordinate acquiring module 430, and the event reporting module 440. The touch monitoring module 410 is configured to monitor a touch event on the touch screen by the operating system when a navigation gesture function of the mobile terminal is turned on; the event obtaining module 420 is configured to, when a touch event of the touch screen is monitored, obtain a plurality of event information corresponding to the touch event, where the plurality of event information includes coordinate event information; the coordinate obtaining module 430 is configured to determine a touch coordinate of the touch event according to the coordinate event information; the event reporting module 440 is configured to report, when the touch coordinate is located in a preset range, event information corresponding to the navigation gesture function in the multiple types of event information to an application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform processing operation corresponding to the navigation gesture function.
In some embodiments, the event reporting module 440 is further configured to report the multiple types of event information to the application layer if the touch coordinate is located in a range other than the preset range.
In some embodiments, referring to fig. 10, the event reporting module 440 includes: a starting point obtaining unit 441 and a report executing unit 442. The starting point obtaining unit 441 is configured to determine, according to the touch coordinates, starting point coordinates of a touch starting point of the touch event; the reporting execution unit 442 is configured to report, to an application layer, event information corresponding to the navigation gesture function in the multiple types of event information when the starting point coordinate is located in the preset range.
In this embodiment, the reporting execution unit 442 may specifically be configured to: when the starting point coordinate is located in the preset range, determining a terminal point coordinate of the touch event according to the touch coordinate; acquiring the distance between the starting point coordinate and the end point coordinate; and if the distance is larger than a set threshold value, reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer.
Further, the reporting execution unit 442 may be further configured to: and if the distance is smaller than or equal to the set threshold, reporting the various event information to an application layer.
In some embodiments, the reporting execution unit 442 may also be specifically configured to: and when all the coordinates in the touch coordinates are in a preset range, reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer.
In some embodiments, the processing apparatus 400 of the touch event may further include an event information acquisition module. The event information acquisition module may be configured to request the application layer to acquire the event information corresponding to the navigation gesture function and to store it, where the event information corresponding to the navigation gesture function is determined by the application layer according to a navigation gesture algorithm corresponding to the navigation gesture function.
In this embodiment, the processing apparatus 400 for a touch event may further include an event information updating module. The event information updating module may be configured to, if touch events on the touch screen are monitored multiple times within a preset duration and the touch coordinates of these touch events are all within the preset range, request the application layer to re-acquire the event information corresponding to the navigation gesture function and store it.
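The caching and refresh behaviour of the two modules above can be sketched as a small Python class (a hypothetical illustration; class name, window, and refresh count are invented, and `fetch_fn` stands in for the request to the application layer):

```python
class NavEventInfoCache:
    """Sketch: cache the gesture-relevant event types obtained from the
    application layer, and re-request them after repeated in-range touches
    within a time window, as the event information updating module does."""

    def __init__(self, fetch_fn, window_s=5.0, refresh_count=3):
        self._fetch = fetch_fn          # requests the application layer
        self.window_s = window_s        # preset duration
        self.refresh_count = refresh_count
        self._hits = []                 # timestamps of in-range touch events
        self.event_types = fetch_fn()   # initial request + local storage

    def on_in_range_touch(self, now):
        """Record one monitored in-range touch at monotonic time `now`."""
        # Keep only hits inside the sliding preset-duration window.
        self._hits = [t for t in self._hits if now - t <= self.window_s]
        self._hits.append(now)
        if len(self._hits) >= self.refresh_count:
            # Repeated in-range touches: re-request and re-store the
            # event information corresponding to the navigation gesture.
            self.event_types = self._fetch()
            self._hits.clear()
```

The design keeps the hot path in the operating system (no round trip to the application layer per touch) while still picking up changes to the navigation gesture algorithm's requirements.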
Further, the processing apparatus 400 for touch events may further include a range updating module. The range updating module may be configured to request the application layer to acquire the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and to update the preset range according to the acquired touch coordinate range.
In some embodiments, the processing apparatus 400 of the touch event may further include a setting interface display module and a navigation gesture enabling module. The setting interface display module is configured to display a navigation setting interface before the operating system monitors for touch events on the touch screen while the navigation gesture function of the mobile terminal is turned on, where the navigation setting interface includes a function option for the navigation gesture function; the navigation gesture enabling module is configured to enable the navigation gesture function when an opening operation on the function option is detected.
In this embodiment, the processing apparatus 400 for a touch event may further include a selection interface display module and a navigation mode acquisition module. The selection interface display module is configured to display a selection interface for navigation gestures after the navigation gesture function is enabled upon detection of the opening operation on the function option, where the selection interface includes selection options corresponding to a plurality of navigation modes; the navigation mode acquisition module is configured to determine the selected navigation mode according to an operation on the selection options, where different navigation modes correspond to different navigation gestures.
Further, the processing apparatus 400 for touch events may further include a navigation gesture acquisition module and a preset range determination module. The navigation gesture acquisition module is configured to determine the navigation gesture corresponding to the selected navigation mode; the preset range determination module is configured to acquire the touch coordinate range of the navigation gesture corresponding to the navigation mode and use this touch coordinate range as the preset range.
In some embodiments, the event reporting module 440 may be further configured to: when the navigation gesture function of the mobile terminal is turned off, if the operating system monitors a touch event on the touch screen, acquire the multiple types of event information corresponding to the touch event and report all of them to the application layer.
In some embodiments, the multiple types of event information include coordinate event information, a press event, a lift event, event information of a press pressure, an approximate diameter of the area where the finger contacts the touch screen, and an approximate diameter of the finger; the event information corresponding to the navigation gesture function includes the coordinate event information, the press event, and the lift event.
In some embodiments, the preset range includes a coordinate range corresponding to an edge area of the touch screen, where the edge area is an area whose distance from the edge of the touch screen is smaller than a set distance.
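Membership in such an edge area can be tested directly from the coordinate and the screen dimensions, without storing an explicit rectangle per edge. A minimal Python sketch (function name, parameter names, and the example screen size are illustrative):

```python
def in_edge_area(x, y, screen_w, screen_h, edge_dist):
    """True when (x, y) lies closer than edge_dist to any edge of a
    screen of size screen_w x screen_h, i.e. inside the 'edge area'
    that serves as the preset range in this embodiment."""
    return (x < edge_dist or y < edge_dist or
            screen_w - x < edge_dist or screen_h - y < edge_dist)
```

For a 1080 x 2340 screen with a 20-pixel edge distance, a touch at the left or bottom border is inside the edge area, while a touch in the screen center is not.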
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the coupling between the modules may be electrical, mechanical, or of another type.
In addition, the functional modules in the embodiments of the present application may be integrated into one processing module, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
In summary, when the navigation gesture function of the mobile terminal is turned on, the operating system monitors for touch events on the touch screen. When a touch event on the touch screen is monitored, multiple types of event information corresponding to the touch event are acquired, where the multiple types of event information include coordinate event information. The touch coordinates of the touch event are then determined according to the coordinate event information, and when the touch coordinates are within a preset range, the event information corresponding to the navigation gesture function among the multiple types of event information is reported to the application layer. The preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function. In this way, when the navigation gesture function is turned on and a touch event occurs within the touch coordinate range of the navigation gesture, navigation gesture failures caused by reporting redundant event information to the application layer can be avoided.
Referring to fig. 11, a block diagram of an electronic device according to an embodiment of the present application is shown. The mobile terminal 100 may be a smart phone, a tablet computer, an electronic book reader, or another electronic device capable of running application programs. The mobile terminal 100 in the present application may include one or more of the following components: a processor 110, a memory 120, a touch screen 130, and one or more applications, where the one or more applications may be stored in the memory 120 and configured to be executed by the one or more processors 110, and the one or more applications are configured to perform the methods described in the foregoing method embodiments.
Processor 110 may include one or more processing cores. The processor 110 connects the various parts of the electronic device 100 using various interfaces and lines, and performs the various functions of the electronic device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and calling data stored in the memory 120. Optionally, the processor 110 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 110 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is used for rendering and drawing display content; the modem is used to handle wireless communications. It is understood that the modem may not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
The memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 120 may be used to store instructions, programs, code sets, or instruction sets. The memory 120 may include a stored program area and a stored data area. The stored program area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the foregoing method embodiments, and the like. The stored data area may store data created by the terminal 100 during use, such as a phone book, audio and video data, and chat log data.
The touch screen 130 may collect touch operations of a user on or near it (e.g., operations performed by the user on or near the touch screen 130 using a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch screen 130 may include a touch detection device and a touch controller. The touch detection device detects the orientation of the user's touch, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch detection function of the touch screen 130 can be implemented using various types of sensing, such as resistive, capacitive, infrared, and surface acoustic wave.
Referring to fig. 12, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable medium 800 has stored therein a program code that can be called by a processor to execute the method described in the above-described method embodiments.
The computer-readable storage medium 800 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. Optionally, the computer-readable storage medium 800 includes a non-volatile computer-readable storage medium. The computer-readable storage medium 800 has storage space for program code 810 that performs any of the method steps described above. The program code 810 may be read from or written to one or more computer program products, and may be compressed, for example, in a suitable form.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced, and such modifications and substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (20)

  1. A method for processing a touch event, applied to an operating system of a mobile terminal, wherein the mobile terminal comprises a touch screen, and the method comprises the following steps:
    when the navigation gesture function of the mobile terminal is started, the operating system monitors a touch event of the touch screen;
    when a touch event of the touch screen is monitored, acquiring multiple event information corresponding to the touch event, wherein the multiple event information comprises coordinate event information;
    determining touch coordinates of the touch event according to the coordinate event information;
    and when the touch coordinate is located in a preset range, reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer, wherein the preset range comprises the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used for indicating the application layer to perform processing operation corresponding to the navigation gesture function.
  2. The method of claim 1, further comprising:
    and if the touch coordinate is located in other ranges except the preset range, reporting the various event information to the application layer.
  3. The method according to claim 1, wherein reporting event information corresponding to the navigation gesture function in the plurality of types of event information to an application layer when the touch coordinate is within a preset range includes:
    determining a starting point coordinate of a touch starting point of the touch event according to the touch coordinate;
    and when the starting point coordinate is located in the preset range, reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer.
  4. The method according to claim 3, wherein reporting event information corresponding to the navigation gesture function in the plurality of types of event information to an application layer when the start point coordinate is located in the preset range comprises:
    when the starting point coordinate is located in the preset range, determining an end point coordinate of the touch event according to the touch coordinate;
    acquiring the distance between the starting point coordinate and the end point coordinate;
    and if the distance is larger than a set threshold value, reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer.
  5. The method of claim 4, further comprising:
    and if the distance is smaller than or equal to the set threshold, reporting the various event information to an application layer.
  6. The method according to claim 1, wherein reporting event information corresponding to the navigation gesture function in the plurality of types of event information to an application layer when the touch coordinate is within a preset range includes:
    and when all the coordinates in the touch coordinates are in a preset range, reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer.
  7. The method according to any one of claims 1-6, further comprising:
    and requesting an application layer to acquire event information corresponding to the navigation gesture function, and storing the event information corresponding to the navigation gesture function, wherein the event information corresponding to the navigation gesture function is determined by the application layer according to a navigation gesture algorithm corresponding to the navigation gesture function.
  8. The method according to claim 7, wherein after reporting the event information corresponding to the navigation gesture function in the plurality of types of event information to an application layer when the touch coordinate is within a preset range, the method further comprises:
    and if the touch event of the touch screen is monitored for multiple times within the preset time length and the touch coordinate of the touch event monitored for multiple times is within the preset range, requesting the application layer to acquire event information corresponding to the navigation gesture function again, and storing the event information corresponding to the navigation gesture function.
  9. The method of claim 8, further comprising:
    and requesting the application layer to acquire a touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and updating the preset range according to the acquired touch coordinate range.
  10. The method according to any one of claims 1-9, wherein before the operating system monitors the touch event on the touch screen while the navigation gesture function of the mobile terminal is turned on, the method further comprises:
    displaying a navigation setting interface, wherein the navigation setting interface comprises function options of the navigation gesture function;
    and when the opening operation of the function option is detected, the navigation gesture function is started.
  11. The method of claim 10, wherein after the navigation gesture function is enabled when the open operation on the function option is detected, the method further comprises:
    displaying a selection interface of a navigation gesture, wherein the selection interface comprises a plurality of selection options corresponding to navigation modes;
    and determining a selected navigation mode according to the operation of the selection options, wherein the navigation mode corresponds to different navigation gestures.
  12. The method of claim 11, further comprising:
    determining a navigation gesture corresponding to the navigation mode according to the navigation mode;
    and acquiring a touch coordinate range of the navigation gesture corresponding to the navigation mode, and taking the touch coordinate range as the preset range.
  13. The method according to any one of claims 1-12, further comprising:
    when the navigation gesture function of the mobile terminal is closed, if the operating system monitors a touch event to the touch screen, acquiring multiple event information corresponding to the touch event, and reporting the multiple event information to the application layer.
  14. The method according to any one of claims 1 to 13, wherein the plurality of types of event information includes coordinate event information, a press event, a lift event, event information of a press pressure, an approximate diameter of the area where the finger contacts the touch screen, and an approximate diameter of the finger;
    the event information corresponding to the navigation gesture function comprises the coordinate event information, the press event, and the lift event.
  15. The method according to any one of claims 1 to 14, wherein the preset range comprises a coordinate range corresponding to an edge area of the touch screen, and the edge area is an area having a distance from an edge of the touch screen smaller than a set distance.
  16. An apparatus for processing a touch event, applied to an operating system of a mobile terminal, the mobile terminal including a touch screen, the apparatus comprising: a touch monitoring module, an event acquisition module, a coordinate acquisition module and an event reporting module, wherein,
    the touch monitoring module is used for monitoring a touch event of the touch screen by the operating system when a navigation gesture function of the mobile terminal is started;
    the event acquisition module is used for acquiring various event information corresponding to a touch event when the touch event of the touch screen is monitored, wherein the various event information comprises coordinate event information;
    the coordinate acquisition module is used for determining the touch coordinate of the touch event according to the coordinate event information;
    the event reporting module is configured to report, to an application layer, event information corresponding to the navigation gesture function in the multiple types of event information when the touch coordinate is located in a preset range, where the preset range includes a touch coordinate range of a navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform a processing operation corresponding to the navigation gesture function.
  17. The apparatus of claim 16, wherein the event reporting module is further configured to report the plurality of event information to the application layer if the touch coordinate is located in a range other than the preset range.
  18. The apparatus of claim 16, wherein the event reporting module comprises: a starting point obtaining unit and a reporting executing unit, wherein,
    the starting point acquisition unit is used for determining the starting point coordinates of the touch starting point of the touch event according to the touch coordinates;
    and the reporting execution unit is used for reporting the event information corresponding to the navigation gesture function in the multiple kinds of event information to an application layer when the starting point coordinate is located in the preset range.
  19. A mobile terminal, comprising:
    one or more processors;
    a memory;
    one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors to perform the method of any one of claims 1 to 15.
  20. A computer-readable storage medium, having stored thereon program code that can be invoked by a processor to perform the method according to any one of claims 1 to 15.
CN201980099360.2A 2019-10-08 2019-10-08 Touch event processing method and device, mobile terminal and storage medium Pending CN114270298A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/109993 WO2021068112A1 (en) 2019-10-08 2019-10-08 Method and apparatus for processing touch event, mobile terminal and storage medium

Publications (1)

Publication Number Publication Date
CN114270298A true CN114270298A (en) 2022-04-01

Family

ID=75437785

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980099360.2A Pending CN114270298A (en) 2019-10-08 2019-10-08 Touch event processing method and device, mobile terminal and storage medium

Country Status (2)

Country Link
CN (1) CN114270298A (en)
WO (1) WO2021068112A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116991302A (en) * 2023-09-22 2023-11-03 荣耀终端有限公司 Application and gesture navigation bar compatible operation method, graphical interface and related device

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
CN102023735B (en) * 2009-09-21 2016-03-30 联想(北京)有限公司 A kind of touch input device, electronic equipment and mobile phone
US20120206399A1 (en) * 2011-02-10 2012-08-16 Alcor Micro, Corp. Method and System for Processing Signals of Touch Panel
CN102819331B (en) * 2011-06-07 2016-03-02 联想(北京)有限公司 Mobile terminal and touch inputting method thereof
CN103257820A (en) * 2012-02-20 2013-08-21 联想(北京)有限公司 Control method and electronic device
CN105487705B (en) * 2015-11-20 2019-08-30 努比亚技术有限公司 Mobile terminal, input processing method and user equipment
CN109766043A (en) * 2018-12-29 2019-05-17 华为技术有限公司 The operating method and electronic equipment of electronic equipment

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN116991302A (en) * 2023-09-22 2023-11-03 荣耀终端有限公司 Application and gesture navigation bar compatible operation method, graphical interface and related device
CN116991302B (en) * 2023-09-22 2024-03-19 荣耀终端有限公司 Application and gesture navigation bar compatible operation method, graphical interface and related device

Also Published As

Publication number Publication date
WO2021068112A1 (en) 2021-04-15


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination