WO2021068112A1 - Touch event processing method and apparatus, mobile terminal, and storage medium - Google Patents
Touch event processing method and apparatus, mobile terminal, and storage medium
- Publication number
- WO2021068112A1 (PCT/CN2019/109993)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- event
- event information
- navigation gesture
- navigation
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
Definitions
- This application relates to the technical field of mobile terminals, and more specifically, to a method, device, mobile terminal, and storage medium for processing touch events.
- Mobile terminals, such as tablet computers and mobile phones, have become some of the most commonly used consumer electronic products in people's daily lives. As the screens of mobile terminals grow ever larger, the physical buttons on them are gradually being removed. Since there are no longer buttons for returning to the home screen, going back to the previous level, viewing tasks, etc., a navigation gesture function has been created.
- This application proposes a touch event processing method, apparatus, mobile terminal, and storage medium.
- In a first aspect, an embodiment of the present application provides a touch event processing method, which is applied to an operating system of a mobile terminal, where the mobile terminal includes a touch screen. The method includes: when the navigation gesture function of the mobile terminal is turned on, monitoring, by the operating system, touch events on the touch screen; when a touch event on the touch screen is detected, acquiring multiple pieces of event information corresponding to the touch event, where the multiple pieces of event information include coordinate event information; determining the touch coordinates of the touch event according to the coordinate event information; and when the touch coordinates are within a preset range, reporting the event information corresponding to the navigation gesture function among the multiple pieces of event information to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
- In a second aspect, an embodiment of the present application provides a touch event processing apparatus, which is applied to an operating system of a mobile terminal, where the mobile terminal includes a touch screen. The apparatus includes a touch monitoring module, an event acquisition module, a coordinate acquisition module, and an event reporting module. The touch monitoring module is used to monitor touch events on the touch screen when the navigation gesture function of the mobile terminal is turned on; the event acquisition module is used to acquire multiple pieces of event information corresponding to a touch event when the touch event on the touch screen is detected, where the multiple pieces of event information include coordinate event information; the coordinate acquisition module is used to determine the touch coordinates of the touch event according to the coordinate event information; and the event reporting module is used to report the event information corresponding to the navigation gesture function among the multiple pieces of event information to the application layer when the touch coordinates are within a preset range, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
- In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; a memory; and one or more application programs, where the one or more application programs are stored in the memory and configured to be executed by the one or more processors, and the one or more application programs are configured to execute the touch event processing method provided in the first aspect described above.
- In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing program code, where the program code can be invoked by a processor to execute the touch event processing method provided in the first aspect.
- In the solution provided by this application, when the navigation gesture function of the mobile terminal is turned on, the operating system monitors touch events on the touch screen; when a touch event on the touch screen is detected, multiple pieces of event information corresponding to the touch event are acquired, the multiple pieces of event information including coordinate event information; the touch coordinates of the touch event are then determined according to the coordinate event information; and when the touch coordinates are within a preset range, the event information corresponding to the navigation gesture function among the multiple pieces of event information is reported to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function. This avoids the situation in which, when the navigation gesture function is turned on and a touch event occurs within the touch coordinate range of the navigation gesture, redundant event information is reported to the application layer and causes the navigation gesture to fail.
- Fig. 1 shows a schematic diagram of an interface provided by an embodiment of the present application.
- Fig. 2 shows a flowchart of a method for processing a touch event according to an embodiment of the present application.
- Fig. 3 shows a flowchart of a method for processing a touch event according to another embodiment of the present application.
- Fig. 4 shows a flowchart of a method for processing a touch event according to another embodiment of the present application.
- Fig. 5 shows a flowchart of a method for processing a touch event according to another embodiment of the present application.
- Fig. 6 shows a flowchart of a method for processing a touch event according to still another embodiment of the present application.
- Fig. 7 shows a schematic diagram of an interface provided in yet another embodiment of the present application.
- Fig. 8 shows a schematic diagram of another interface provided in yet another embodiment of the present application.
- Fig. 9 shows a block diagram of a touch event processing device according to an embodiment of the present application.
- Fig. 10 shows a block diagram of an event reporting module in a touch event processing apparatus according to an embodiment of the present application.
- FIG. 11 is a block diagram of a mobile terminal for executing the touch event processing method according to an embodiment of the present application.
- FIG. 12 is a storage unit for storing or carrying program code that implements the touch event processing method according to an embodiment of the present application.
- In electronic devices such as mobile phones and tablet computers, the display screen is usually used to display content such as text, pictures, icons, or videos.
- With the development of touch technology, more and more electronic devices are equipped with touch screens.
- When a touch screen is provided, the device can respond to the user's touch operations, such as dragging, single-clicking, double-clicking, and sliding, when they are detected on the touch screen.
- Generally, electronic equipment includes a front panel, a rear cover, and a frame.
- The front panel includes an upper forehead area, a middle screen area, and a lower button area.
- The upper forehead area is provided with functional devices such as the earpiece sound hole and the front camera.
- The middle screen area is provided with a touch display screen.
- The lower button area is provided with one to three physical buttons.
- The navigation gesture function realizes, in place of physical or virtual keys, functions such as returning to the desktop, returning to the previous level, and viewing recent tasks through navigation gestures on the system interface or application interface. For example, as shown in FIG. 1, the user can return to the desktop through an upward sliding gesture on the bottom area of the touch screen.
- The specific navigation gestures are not limited here.
- The navigation function is realized by the operating system of the mobile terminal reporting the event information of a touch event to the upper system (application layer); the upper system determines, according to the event information of the touch event, whether the touch event is a navigation gesture, and then determines whether to trigger the navigation gesture function.
- the inventor found that when the navigation gesture function of the mobile terminal is enabled, when the system of the mobile terminal reports event information of a touch event to the application layer, there is a situation in which redundant event information is reported, which causes the navigation gesture to fail.
- Therefore, the inventors proposed the touch event processing method, apparatus, mobile terminal, and storage medium provided by the embodiments of the present application, which can avoid the situation in which, when the navigation gesture function is turned on and a touch event occurs within the touch coordinate range of the navigation gesture, redundant event information is reported to the application layer and causes the navigation gesture to fail.
- the specific touch event processing method will be described in detail in the subsequent embodiments.
- FIG. 2 shows a schematic flowchart of a touch event processing method provided by an embodiment of the present application.
- The touch event processing method is used to avoid the situation in which, when the navigation gesture function is turned on and a touch event occurs within the touch coordinate range of the navigation gesture, redundant event information is reported to the application layer and causes the navigation gesture to fail.
- the touch event processing method is applied to the touch event processing device 400 as shown in FIG. 9 and the mobile terminal 100 configured with the touch event processing device 400 (FIG. 11 ).
- FIG. 11 shows a mobile terminal as an example to describe the specific process of this embodiment.
- the mobile terminal applied in this embodiment may be a smart phone, a tablet computer, a smart watch, etc., which is not limited here.
- the touch event processing method is applied to the operating system of the mobile terminal.
- the mobile terminal includes a touch screen.
- the process shown in FIG. 2 will be described in detail below, and the method for processing the touch event may specifically include the following steps:
- Step S110 When the navigation gesture function of the mobile terminal is turned on, the operating system monitors a touch event to the touch screen.
- the mobile terminal can monitor the navigation gesture function, and when it is monitored that the navigation gesture function is turned on, the touch event of the touch screen is processed in a corresponding manner.
- The navigation gesture function is used to implement, in place of physical or virtual keys, functions such as returning to the desktop, returning to the previous level, and viewing recent tasks based on the navigation gestures detected on the system interface or application interface.
- the navigation gestures are preset touch gestures used to trigger returning to the desktop, returning to the previous level, and viewing recent tasks.
- the navigation gestures can be up-swiping gestures, left-swiping gestures, right-swiping gestures, etc., which are not limited here.
- In some embodiments, the mobile terminal may be provided with a switch for controlling the on/off of the navigation gesture function, such as a switch in the navigation function setting interface, through which the navigation gesture function can be turned on and off.
- The mobile terminal can detect the state of the switch and determine whether the navigation gesture function is turned on according to the state of the switch. Specifically, when the switch is in the on state, the navigation gesture function is in the on state; when the switch is in the off state, the navigation gesture function is in the off state.
- In other embodiments, the mobile terminal may include a navigation gesture module, which is used to implement the navigation gesture function.
- The navigation gesture module may be a software module for determining, according to the event information of the touch event reported by the operating system, whether the touch event is a navigation gesture, and performing the corresponding navigation gesture function according to the navigation gesture.
- The operating system can monitor whether the navigation gesture module is in the on state, so as to determine whether the navigation gesture function is on. Specifically, when the navigation gesture module is on, the navigation gesture function is on; when the navigation gesture module is off, the navigation gesture function is off.
- the Android system framework includes the kernel layer, the core class library layer, the framework layer and the application layer from bottom to top.
- the kernel layer provides core system services, including security, memory management, process management, network protocol stacks, and hardware drivers.
- The hardware driver in the kernel layer is referred to as the driver layer, and the driver layer includes the touch display driver, camera driver, etc.
- The core class library layer includes the Android Runtime and Libraries. The Android runtime environment provides most of the functions available in the core class library of the Java programming language, including the core library (Core Libraries) and the Dalvik virtual machine (Dalvik VM).
- Each Android application is an instance in the Dalvik virtual machine, running in its own process.
- The class library is used by the various components of the Android system and includes functions such as the Media Framework, Surface Manager, SQLite (relational database engine), and FreeType (bitmap and vector font rendering); these functions are exposed to developers through the framework layer of the Android system.
- the framework layer provides a series of class libraries needed to develop Android applications, so that developers can carry out rapid application development, facilitate the reuse of components, and can also achieve personalized extensions through inheritance.
- The services provided include component management services, window management services, system data source components, a control framework, resource management services, installation package management services, etc.
- The application layer includes all kinds of applications that directly interact with users, or service programs written in Java that run in the background, including desktop applications, contact applications, call applications, camera applications, image browsers, games, maps, and web browsers, as well as other applications developed by developers.
- The navigation gesture module can be located in the application layer. After the bottom layer of the Android system (the touch screen system in the kernel layer) detects a touch event, it reports the event information of the touch event to the application layer. The navigation gesture module of the application layer can determine, based on the event information of the touch event, whether the touch event is a navigation gesture used to trigger the navigation gesture function, and when it is, perform the display control corresponding to the navigation gesture function, such as returning to the desktop, returning to the previous interface, or viewing recent tasks.
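- As an illustrative sketch (not the patent's actual implementation), the decision made by the application-layer navigation gesture module can be modelled as follows. The screen geometry, threshold values, action name, and function name are assumptions introduced only for the sketch:

```python
def classify_navigation_gesture(track, screen_height=2400, edge=40, min_len=300):
    """Return the gesture action, or None if the track is not a navigation gesture.

    `track` is the list of (x, y) touch coordinates reported for one touch event.
    All numeric parameters are illustrative assumptions.
    """
    if len(track) < 2:
        return None  # a tap cannot be a sliding navigation gesture
    (_, y0), (_, y1) = track[0], track[-1]
    starts_at_bottom = y0 >= screen_height - edge   # start point in the bottom edge area
    slides_up_enough = (y0 - y1) > min_len          # upward slide is long enough
    if starts_at_bottom and slides_up_enough:
        return "return_to_desktop"                  # the up-swipe gesture of FIG. 1
    return None

# An up-swipe from the bottom edge is recognized; a swipe from mid-screen is not.
classify_navigation_gesture([(540, 2390), (540, 1800)])
classify_navigation_gesture([(540, 1200), (540, 800)])
```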
- In this embodiment, the bottom layer of the operating system (that is, the touch screen system in the kernel layer) can monitor touch events on the touch screen.
- Step S120 When a touch event to the touch screen is monitored, multiple types of event information corresponding to the touch event are acquired, and the multiple types of event information include coordinate event information.
- Input events in the mobile terminal can be uniformly mounted in the input system of the kernel layer, and the input system can include a touch screen system and the like.
- Input events in the input system are divided into different event types (type), event codes (code), and event values (value).
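- For concreteness, the type/code/value scheme described above can be sketched with names borrowed from the Linux input subsystem, on which Android's kernel layer is built; the constants shown are real Linux input-event codes but form an illustrative, non-exhaustive selection:

```python
from dataclasses import dataclass

# Event types and codes from the Linux input subsystem (illustrative subset).
EV_SYN, EV_KEY, EV_ABS = 0x00, 0x01, 0x03   # event types
BTN_TOUCH = 0x14a                           # press/lift: value 1 = press, 0 = lift
ABS_MT_POSITION_X = 0x35                    # abscissa of a touch contact
ABS_MT_POSITION_Y = 0x36                    # ordinate of a touch contact
ABS_MT_PRESSURE = 0x3a                      # pressure of the hand pressing
ABS_MT_TOUCH_MAJOR = 0x30                   # approx. diameter of the contact area
ABS_MT_WIDTH_MAJOR = 0x32                   # approx. diameter of the finger itself

@dataclass
class InputEvent:
    type: int   # event type, e.g. EV_KEY or EV_ABS
    code: int   # event code, e.g. ABS_MT_POSITION_X
    value: int  # event value/attribute, e.g. the coordinate itself

down = InputEvent(EV_KEY, BTN_TOUCH, 1)          # a press event
x_ev = InputEvent(EV_ABS, ABS_MT_POSITION_X, 540)  # a coordinate event
```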
- the operating system's processing principle for input events is: the application layer is at the upper layer of the system, and the application layer is mainly used to monitor, receive, and process event information from input events reported by the bottom layer of the system.
- The event information of the monitored touch screen events may include press events, lift events, coordinate event information, pressure event information of the hand pressing, approximate event information of the diameter of the finger contacting the touch screen, and approximate event information of the diameter of the finger. The coordinate event information can be divided into abscissa event information and ordinate event information.
- Among them, the event information that must be reported may include press events, lift events, and coordinate event information; optional event information may include pressure event information of the hand pressing, approximate event information of the diameter of the finger contacting the touch screen, and approximate event information of the diameter of the finger.
- When the operating system detects a touch event on the touch screen, it can acquire the various event information corresponding to the touch event.
- The various event information may include press events, lift events, coordinate event information, pressure event information of the hand pressing, approximate event information of the diameter of the finger contacting the touch screen, and approximate event information of the diameter of the finger.
- the event information corresponding to a specific touch event may not be limited.
- Step S130 Determine the touch coordinates of the touch event according to the coordinate event information.
- After the operating system (that is, the underlying touch screen system) obtains the various event information of the touch event, it can determine the touch coordinates of the touch event according to the coordinate event information among the multiple pieces of event information.
- The coordinate information acquired by the touch screen system may include the abscissa and ordinate of one or more touch points corresponding to the touch event. It is understandable that when the touch event corresponds to a click operation, the coordinate information may include the abscissa and ordinate of the click position; when the touch event corresponds to a sliding operation, the coordinate information may include the abscissas and ordinates of the multiple touch points along the sliding track.
- Step S140: When the touch coordinates are within a preset range, report the event information corresponding to the navigation gesture function among the multiple pieces of event information to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
- After acquiring the touch coordinates of the touch event, the touch screen system at the bottom of the operating system can determine whether the touch coordinates are within a preset range, so as to confirm whether the touch position of the touch event is within the touch range of the navigation gesture. If the touch coordinates are within the preset range, it means that the touch position of the touch event is within the touch range of the navigation gesture.
- In this case, the touch event may be an input event of the navigation gesture. Therefore, when reporting the event information of the touch event to the application layer, the operating system (i.e., the underlying touch screen system) can report only the event information corresponding to the navigation gesture function.
- When the navigation gesture function is turned on and the user triggers it, the touch screen system only needs to report the necessary input event information, that is, only the press event, the lift event, and the coordinate information; the redundant information need not be reported.
- Redundant information such as the pressure event information, the event information of the diameter of the finger contacting the touch screen, and the event information of the diameter of the finger need not be reported, because reporting redundant event information will cause the navigation gesture algorithm to misjudge and the navigation gesture to fail. Therefore, when the touch event may be a touch event of a navigation gesture, only the press event, lift event, and coordinate information are reported to the application layer, instead of more event information, so as to avoid navigation gesture failure.
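- The reporting rule described above can be sketched as follows; the dictionary key names and the function name are hypothetical, and the sketch only illustrates the filtering logic of steps S110 to S140:

```python
REQUIRED_KEYS = {"press", "lift", "x", "y"}                  # must-report information
REDUNDANT_KEYS = {"pressure", "touch_major", "width_major"}  # redundant for gestures

def report_to_app_layer(event_info: dict, in_gesture_range: bool) -> dict:
    """Return the subset of event information the driver layer reports upward."""
    if in_gesture_range:
        # Inside the navigation-gesture coordinate range: drop redundant
        # information that could make the gesture algorithm misjudge.
        return {k: v for k, v in event_info.items() if k in REQUIRED_KEYS}
    return dict(event_info)  # outside the range: report everything as usual

info = {"press": 1, "x": 540, "y": 2390, "pressure": 37, "touch_major": 9}
filtered = report_to_app_layer(info, in_gesture_range=True)  # keeps press/x/y only
```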
- the preset range may be the screen coordinate range corresponding to the navigation gesture, or it may be understood as the screen coordinate range that triggers the navigation gesture, and the specific preset range may not be limited.
- After the event information is reported, the navigation gesture module in the application layer can determine, according to the reported event information, whether the touch event is an input event of the navigation gesture. If it is, the control corresponding to the navigation gesture function can be performed, such as returning to the desktop, returning to the previous interface, or viewing recent tasks; if it is not, the control corresponding to the navigation gesture function is not performed.
- If the touch coordinates are not within the preset range, the touch screen system can report all event information of the touch event to the application layer, to ensure that the touch event input by the user can be recognized by the application layer and achieve the required input purpose.
- In the touch event processing method, when the navigation gesture function of the mobile terminal is turned on, the operating system monitors touch events on the touch screen; when a touch event on the touch screen is detected, multiple pieces of event information corresponding to the touch event are acquired, the multiple pieces of event information including coordinate event information; the touch coordinates of the touch event are then determined according to the coordinate event information.
- When the touch coordinates are within a preset range, the event information corresponding to the navigation gesture function among the multiple pieces of event information is reported to the application layer.
- The preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function. This avoids the situation in which, when the navigation gesture function is turned on and a touch event occurs within the touch coordinate range of the navigation gesture, redundant event information is reported to the application layer and causes the navigation gesture to fail.
- FIG. 3 shows a schematic flowchart of a touch event processing method provided by another embodiment of the present application. This method is applied to the operating system of the above-mentioned mobile terminal.
- the mobile terminal includes a touch screen.
- the process shown in FIG. 3 will be described in detail below.
- the touch event processing method may specifically include the following steps:
- Step S210 When the navigation gesture function of the mobile terminal is turned on, the operating system monitors the touch event on the touch screen.
- Step S220 When a touch event to the touch screen is monitored, multiple types of event information corresponding to the touch event are acquired, and the multiple types of event information include coordinate event information.
- Step S230 Determine the touch coordinates of the touch event according to the coordinate event information.
- steps S210 to S230 may refer to the content of the foregoing embodiment, and details are not described herein again.
- Step S240 Determine the starting point coordinates of the touch starting point of the touch event according to the touch coordinates.
- In this embodiment, the operating system may determine the start point coordinates of the touch start point of the touch event according to the touch coordinates.
- the touch coordinates of the touch event may include the touch coordinates of the touch point during the entire touch process of the touch event.
- the touch screen system at the bottom of the operating system can obtain the start point coordinates of the touch start point of the touch event according to the touch coordinates corresponding to the touch event.
- the navigation gesture in the navigation gesture function is usually a sliding gesture that slides from the edge area of the touch screen to the center of the screen.
- The touch start point of the navigation gesture is usually located in the edge area, so the start point coordinates of the touch start point can be determined according to the touch coordinates, and whether the touch event may be an input event of a navigation gesture can be determined according to the start point coordinates.
- Step S250: When the start point coordinates are within the preset range, report the event information corresponding to the navigation gesture function among the multiple pieces of event information to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
- After the operating system obtains the start point coordinates of the touch start point of the touch event, it can determine whether the start point coordinates are within the preset range. If the start point coordinates are within the preset range, it means that the touch event may be an input event of a navigation gesture; if not, it means that the touch event cannot be an input event of a navigation gesture.
- the preset range may be the touch coordinate range corresponding to the navigation gesture of the navigation gesture function.
- the touch coordinate range refers to the coordinate range corresponding to the touch point corresponding to the navigation gesture.
- The preset range may be the coordinate range corresponding to the edge area of the touch screen, where the edge area is the area whose distance from the edge of the touch screen is less than a set distance. In this way, whether the touch event may be an input event of a navigation gesture can be determined by determining whether the start point coordinates of the touch event are within the coordinate range corresponding to the edge area.
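- A minimal sketch of this edge-area test, assuming example screen dimensions and an edge margin (both values are illustrative, not taken from the patent):

```python
def in_edge_area(x: int, y: int, width: int, height: int, margin: int) -> bool:
    """True if (x, y) lies within `margin` pixels of any edge of the screen."""
    return (x < margin or y < margin or
            x >= width - margin or y >= height - margin)

# Example: a 1080x2400 screen with an assumed 40-pixel gesture edge area.
in_edge_area(540, 2390, 1080, 2400, 40)   # start point near the bottom edge
in_edge_area(540, 1200, 1080, 2400, 40)   # start point at screen centre
```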
- In addition, the sliding length corresponding to the navigation gesture usually needs to be greater than a certain length. Therefore, the distance between the touch start point and the touch end point can be determined, and whether the touch event may be an input event of a navigation gesture can be further determined according to the distance, so as to accurately confirm whether the touch event may be an input event of a navigation gesture.
- the touch event processing method may further include: when the starting point coordinates are in the preset range, determining the end point coordinates of the touch event according to the touch coordinates; acquiring the starting point coordinates and the The distance between the endpoint coordinates; if the distance is greater than a set threshold, the event information corresponding to the navigation gesture function in the multiple event information is reported to the application layer.
- The end point coordinates of the touch end point of the touch event are obtained through the coordinate information. If the distance between the start point coordinates and the end point coordinates is greater than the set threshold, the sliding length of the touch event is greater than a certain length, and the touch event may be an input event of a navigation gesture; if the distance is less than or equal to the set threshold, the touch event cannot be an input event of a navigation gesture. The specific numerical value of the set threshold is not limited here.
- In this case, the event information corresponding to the navigation gesture function among the multiple pieces of event information is reported to the application layer, so as to avoid reporting redundant event information to the application layer and causing the navigation gesture to fail.
- The touch event processing method may further include: if the distance is less than or equal to the set threshold, reporting the various event information to the application layer. It is understandable that when the distance is less than or equal to the set threshold, the touch event cannot be an input event of a navigation gesture; therefore, all acquired event information can be reported to the application layer to ensure that the touch event input by the user can be recognized by the application layer and achieve the required input purpose.
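- The start-point/end-point distance test described above can be sketched as follows; the threshold value and function name are assumptions introduced for illustration:

```python
import math

def slide_long_enough(track, threshold=300.0) -> bool:
    """`track` is the list of (x, y) touch coordinates of one touch event.

    Returns True when the straight-line distance between the touch start
    point and the touch end point exceeds the (assumed) set threshold.
    """
    if len(track) < 2:
        return False  # no slide at all
    (x0, y0), (x1, y1) = track[0], track[-1]
    return math.hypot(x1 - x0, y1 - y0) > threshold

swipe = [(540, 2390), (540, 2000), (540, 1600)]  # a bottom-to-centre slide
slide_long_enough(swipe)   # distance 790 px, greater than the assumed threshold
```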
- Step S260 If the touch coordinates are located outside the preset range, report the multiple kinds of event information to the application layer.
- when reporting the event information of a touch event to the application layer, the underlying touch screen system of the operating system can report all event information of the touch event to the application layer to ensure that the touch event input by the user can be recognized by the application layer, realizing the intended input.
- the touch event processing method determines whether the touch event may be an input event of a navigation gesture according to whether the coordinates of the touch start point of the touch event are within the preset range. When the touch start point coordinates are within the preset range, the event information corresponding to the navigation gesture function among the various event information of the touch event is reported to the application layer, so as to avoid the case where, with the navigation gesture function turned on, a touch event within the touch coordinate range of the navigation gesture triggers reporting of redundant event information to the application layer and causes the navigation gesture to fail.
- FIG. 4 shows a schematic flowchart of a touch event processing method provided by another embodiment of the present application. This method is applied to the operating system of the above-mentioned mobile terminal.
- the mobile terminal includes a touch screen.
- the process shown in FIG. 4 will be described in detail below.
- the touch event processing method may specifically include the following steps:
- Step S310 When the navigation gesture function of the mobile terminal is turned on, the operating system monitors touch events on the touch screen.
- Step S320 When a touch event to the touch screen is monitored, multiple event information corresponding to the touch event is acquired, and the multiple event information includes coordinate event information.
- Step S330 Determine the touch coordinates of the touch event according to the coordinate event information.
- for steps S310 to S330, refer to the content of the foregoing embodiment, which will not be repeated here.
- Step S340 When all the coordinates among the touch coordinates are within a preset range, report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer. The preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
- Step S350 When any of the touch coordinates are not within the preset range, report the multiple kinds of event information to the application layer.
- after the operating system obtains the touch coordinates of the touch event, it can determine whether the touch coordinates of all touch points are within the preset range. It is understandable that the preset range is the touch coordinate range corresponding to the navigation gesture. If the touch coordinates of all touch points are within the preset range, the touch event is most likely an input event of the navigation gesture; if the coordinates of any touch point are not within the preset range, the touch event cannot be an input event of a navigation gesture.
- the event information corresponding to the navigation gesture function in the multiple event information is reported to the application layer to avoid reporting redundant event information to the application layer and causing the navigation gesture to become invalid.
- all kinds of event information are reported to the application layer to ensure the normal reporting of input events on the touch screen.
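The "all touch points within the preset range" decision above can be sketched as a simple containment check. The rectangular model of the preset range is an assumption for illustration only.

```python
# Illustrative check that every touch point of a (possibly multi-touch)
# touch event lies inside the preset range, modeled as a rectangle.
def all_points_in_preset_range(touch_points, preset_range):
    """touch_points: list of (x, y) coordinates.
    preset_range: rectangle (x0, y0, x1, y1) with x0 <= x1, y0 <= y1.
    Returns True only if every point is inside the rectangle.
    """
    x0, y0, x1, y1 = preset_range
    return all(x0 <= x <= x1 and y0 <= y <= y1 for x, y in touch_points)
```

If this returns True, only the gesture-related event information would be reported; otherwise all event information is reported.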
- the touch event processing method determines whether the touch event may be an input event of a navigation gesture according to whether the touch coordinates of all touch points of the touch event are within the preset range. When they are, the event information corresponding to the navigation gesture function among the multiple kinds of event information of the touch event is reported to the application layer. When the coordinates of any touch point are not within the preset range, all event information of the touch event is reported to the application layer. This avoids the case where, with the navigation gesture function turned on, a touch event within the touch coordinate range of the navigation gesture triggers reporting of redundant event information to the application layer and causes the navigation gesture to fail.
- FIG. 5 shows a schematic flowchart of a touch event processing method according to another embodiment of the present application. This method is applied to the operating system of the above-mentioned mobile terminal.
- the mobile terminal includes a touch screen.
- the process shown in FIG. 5 will be described in detail below.
- the touch event processing method may specifically include the following steps:
- Step S410 Request the event information corresponding to the navigation gesture function from the application layer, and store the event information corresponding to the navigation gesture function.
- the event information corresponding to the navigation gesture function is determined by the application layer according to the navigation gesture algorithm corresponding to the navigation gesture function.
- the event information corresponding to the navigation gesture function, that is, the event information of a touch event that the bottom layer of the operating system needs to report to the application layer when the navigation gesture function is turned on, can be obtained by the kernel layer of the operating system from the navigation gesture module in the application layer.
- the kernel layer can initiate a request for the event information corresponding to the navigation gesture function to the navigation gesture module in the application layer. After receiving the request, the navigation gesture module in the application layer can return the event information corresponding to the navigation gesture function to the kernel layer.
- after the kernel layer obtains the event information corresponding to the navigation gesture function, it can store that event information in the framework layer, and when reporting the event information of a touch event, the kernel layer can obtain the event types that need to be reported from the framework layer.
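The request-and-store flow between layers can be sketched as below. All class and method names here are hypothetical stand-ins for the kernel layer, framework layer, and application-layer navigation gesture module; they are not from the patent.

```python
# Minimal sketch of the flow: the kernel layer requests the required event
# types from the application-layer gesture module and caches them in the
# framework layer for later touch-event reporting.
class NavigationGestureModule:
    """Stand-in for the application-layer navigation gesture module."""

    def required_event_types(self):
        # Determined by the navigation gesture algorithm.
        return {"coordinate", "press", "lift"}


class FrameworkLayerCache:
    """Stand-in for the framework-layer store of required event types."""

    def __init__(self):
        self._types = None

    def store(self, types):
        self._types = set(types)

    def load(self):
        return self._types


def fetch_and_store(module, cache):
    # Kernel layer asks the gesture module which event types it needs,
    # stores them in the framework layer, and can read them back when
    # reporting a touch event.
    cache.store(module.required_event_types())
    return cache.load()
```

When a touch event later arrives, the reporting path would consult `cache.load()` instead of querying the application layer again.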
- Step S420 When the navigation gesture function of the mobile terminal is turned on, the operating system monitors the touch event on the touch screen.
- Step S430 When a touch event to the touch screen is monitored, obtain various event information corresponding to the touch event, and the various event information includes coordinate event information.
- Step S440 Determine the touch coordinates of the touch event according to the coordinate event information.
- Step S450 When the touch coordinates are within a preset range, report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer, the preset range including the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function being used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
- for steps S430 to S450, refer to the content of the foregoing embodiment, which will not be repeated here.
- Step S460 If multiple touch events on the touch screen are monitored within a preset duration, and the touch coordinates of the multiple monitored touch events are all within the preset range, request the event information corresponding to the navigation gesture function from the application layer again, and store the event information corresponding to the navigation gesture.
- the bottom layer of the operating system can monitor subsequent events after reporting the event information corresponding to the navigation gesture function to the application layer when it is determined that the touch event may be an input event of the navigation gesture. If multiple touch events on the touch screen are detected within the preset duration, and the touch coordinates of the multiple monitored touch events are all within the preset range, the user may have made navigation gestures multiple times within the preset duration without them being recognized. Therefore, the bottom layer of the operating system can obtain the event information corresponding to the navigation gesture function from the application layer again, to determine whether that event information needs to be updated and to avoid the case where the event information required by the navigation gesture function has changed and navigation gestures can no longer be recognized.
- for example, the event information previously required by the navigation gesture function includes press events, lift events, and coordinate information, and the navigation gesture algorithm is then updated to also require the approximate event of the finger diameter. If the bottom layer of the operating system still reports only press events, lift events, and coordinate information, the navigation gesture module in the application layer cannot recognize navigation gestures, and the navigation gesture function cannot be realized. Therefore, the bottom layer of the operating system can reacquire the event information corresponding to the navigation gesture function to prevent subsequent navigation gestures from going unrecognized.
- Step S470 Request the touch coordinate range of the navigation gesture corresponding to the navigation gesture function from the application layer, and update the preset range according to the obtained touch coordinate range.
- if multiple touch events on the touch screen are detected within the preset duration, and the touch coordinates of the multiple monitored touch events are all within the preset range, the navigation gesture algorithm may have been updated, so that several navigation gestures were not recognized. Therefore, after the bottom layer of the operating system updates the event information corresponding to the navigation gesture function, it can also update the touch coordinate range corresponding to the navigation gesture, that is, update the preset range. The bottom layer of the operating system can obtain the touch coordinate range of the navigation gesture corresponding to the navigation gesture function from the navigation gesture module of the application layer and update the preset range, to prevent a change in the touch coordinate range of the navigation gesture from causing subsequent user navigation gestures to go unrecognized.
- the bottom layer of the operating system obtains the event information corresponding to the navigation gesture function from the application layer and stores it in the driver layer; each time a touch event of a navigation gesture is reported, the reporting is based on the event information corresponding to the navigation gesture function stored in the driver layer. In addition, when multiple touch events are monitored and their touch coordinates are within the preset range, the event information corresponding to the navigation gesture function and the preset range are updated, so as to prevent subsequent navigation gestures input by the user from going unrecognized and affecting the user experience.
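The refresh condition above can be sketched as a small predicate. The duration and repeat count are invented for the sketch; the patent does not fix concrete values.

```python
# Hedged sketch of the refresh trigger: if several in-range touch events
# arrive within a preset duration (suggesting repeated, unrecognized
# navigation gestures), the required event info and the preset range
# should be re-fetched from the application layer.
PRESET_DURATION = 5.0  # seconds, assumed
MIN_REPEATS = 3        # assumed number of repeated in-range events


def should_refresh(event_times, in_range_flags, now):
    """event_times: timestamps of monitored touch events.
    in_range_flags: whether each event's coordinates were in the preset range.
    Returns True when enough recent in-range events have accumulated.
    """
    recent = [t for t, in_range in zip(event_times, in_range_flags)
              if in_range and now - t <= PRESET_DURATION]
    return len(recent) >= MIN_REPEATS
```

When this returns True, the bottom layer would re-request both the gesture event types and the touch coordinate range.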
- FIG. 6 shows a schematic flowchart of a touch event processing method provided by yet another embodiment of the present application. This method is applied to the operating system of the above-mentioned mobile terminal.
- the mobile terminal includes a touch screen.
- the process shown in FIG. 6 will be described in detail below.
- the touch event processing method may specifically include the following steps:
- Step S501 Display a navigation setting interface, and the navigation setting interface includes functional options of the navigation gesture function.
- the mobile terminal may provide navigation function settings.
- the operating system of the mobile terminal can control the display screen to display the navigation setting interface, and the navigation setting interface includes functional options for the navigation gesture function.
- the function option may be in the form of a switch control A1, and the user can control the opening and closing of the navigation gesture function by operating the switch control A1.
- Step S502 When an opening operation on the function option is detected, enable the navigation gesture function.
- when the operating system detects the opening operation on the function option, it means that the user wants to turn on the navigation gesture function, so the navigation gesture function can be enabled; for example, when the opening operation on the switch control is detected, the navigation gesture function is enabled.
- the bottom layer of the operating system may issue an instruction to the application layer, and the instruction is used to control the activation of the navigation gesture module corresponding to the navigation gesture function.
- Step S503 Display a selection interface of the navigation gesture, where the selection interface includes selection options corresponding to multiple navigation modes.
- there may be multiple navigation modes for the navigation gesture function.
- the navigation gesture selection interface can be displayed.
- the selection interface includes selection options corresponding to multiple navigation methods.
- the selection options are used for the user to select a navigation mode. For example, as shown in FIG. 8, the navigation modes include two-side return, simple gesture, right-side return, and left-side return, and the user can select a navigation mode by selecting the option corresponding to that mode.
- in the two-side return mode, a sliding operation from either of the two side areas at the bottom of the touch screen corresponds to returning to the previous level, a sliding operation from the middle area at the bottom corresponds to returning to the desktop, and sliding up from the middle area at the bottom and staying corresponds to viewing recent tasks; in the simple gesture mode, sliding up from the bottom area of the touch screen returns to the desktop, and sliding up from the bottom area and staying views recent tasks; in the right-side return mode, a slide-up operation from the right area of the bottom of the touch screen returns to the previous level, a slide-up operation from the middle area returns to the desktop, and a slide-up operation from the left area views recent tasks; in the left-side return mode, a slide-up operation from the right area of the bottom of the touch screen views recent tasks, a slide-up operation from the middle area returns to the desktop, and a slide-up operation from the left area returns to the previous level.
- the above navigation mode is only an example, and does not represent a limitation on the navigation mode of the navigation gesture function in the embodiment of the present application.
- Step S504 Determine the selected navigation mode according to the operation on the selected option, and the navigation mode corresponds to different navigation gestures.
- Step S505 Determine a navigation gesture corresponding to the navigation mode according to the navigation mode.
- Step S506 Obtain the touch coordinate range of the navigation gesture corresponding to the navigation mode, and use the touch coordinate range as the preset range.
- the navigation gestures corresponding to different navigation modes may differ. Therefore, the navigation gesture corresponding to the selected navigation mode can be determined, and the touch coordinate range of that navigation gesture can be determined as the preset range.
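The mode-to-range mapping of steps S505 and S506 can be sketched as below. The screen size, strip height, three-way split, and action names are assumptions invented for the sketch, following the example modes of FIG. 8.

```python
# Hedged sketch: map a selected navigation mode to the gesture regions at
# the bottom of the screen, then take their union as the preset range.
def gesture_regions(mode, width=1080, height=2340, strip=80):
    """Return {action: (x0, y0, x1, y1)} rectangles for the chosen mode."""
    y0, y1 = height - strip, height
    third = width // 3
    left = (0, y0, third, y1)
    mid = (third, y0, 2 * third, y1)
    right = (2 * third, y0, width, y1)
    if mode == "simple_gesture":
        # Whole bottom strip: swipe up for desktop, swipe-and-stay for tasks.
        return {"home": (0, y0, width, y1)}
    if mode == "two_side_return":
        return {"back_left": left, "home": mid, "back_right": right}
    if mode == "right_return":
        return {"recents": left, "home": mid, "back": right}
    if mode == "left_return":
        return {"back": left, "home": mid, "recents": right}
    raise ValueError("unknown navigation mode: " + mode)


def preset_range(mode, **kw):
    """Bounding box of all regions: the touch coordinate range used as
    the preset range for this mode."""
    boxes = list(gesture_regions(mode, **kw).values())
    return (min(b[0] for b in boxes), min(b[1] for b in boxes),
            max(b[2] for b in boxes), max(b[3] for b in boxes))
```

With the assumed 1080x2340 screen, every mode yields the bottom 80-pixel strip as its preset range, while the per-action regions differ per mode.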
- Step S507 When the navigation gesture function of the mobile terminal is turned on, the operating system monitors the touch event to the touch screen.
- Step S508 When a touch event to the touch screen is monitored, multiple event information corresponding to the touch event is acquired, and the multiple event information includes coordinate event information.
- Step S509 Determine the touch coordinates of the touch event according to the coordinate event information.
- Step S510 When the touch coordinates are within a preset range, report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer, the preset range including the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function being used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
- steps S507 to S510 can refer to the content of the foregoing embodiment, and details are not described herein again.
- Step S511 When the navigation gesture function of the mobile terminal is turned off, if the operating system detects a touch event on the touch screen, acquire the multiple kinds of event information corresponding to the touch event, and report the multiple kinds of event information to the application layer.
- the touch event processing method provided in the embodiment of the present application provides a way to set the navigation gesture function, which is convenient for the user to manage the navigation gesture function. Whether the touch event may be an input event of a navigation gesture is determined according to whether the touch coordinates of the touch event are within the preset range. When the touch coordinates are within the preset range, the event information corresponding to the navigation gesture function among the multiple kinds of event information of the touch event is reported to the application layer, so as to avoid the case where, with the navigation gesture function turned on, a touch event within the touch coordinate range of the navigation gesture triggers reporting of redundant event information to the application layer and causes the navigation gesture to fail.
- FIG. 9 shows a structural block diagram of a touch event processing apparatus 400 provided by an embodiment of the present application.
- the touch event processing device 400 is applied to the aforementioned operating system of the mobile terminal.
- the touch event processing device 400 includes a touch monitoring module 410, an event acquisition module 420, a coordinate acquisition module 430, and an event reporting module 440.
- the touch monitoring module 410 is used to monitor touch events on the touch screen when the navigation gesture function of the mobile terminal is turned on; the event acquisition module 420 is used to acquire, when a touch event on the touch screen is monitored, multiple kinds of event information corresponding to the touch event, the multiple kinds of event information including coordinate event information; the coordinate acquisition module 430 is configured to determine the touch coordinates of the touch event according to the coordinate event information; the event reporting module 440 is used to report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer when the touch coordinates are within a preset range, the preset range including the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function being used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
- the event reporting module 440 is further configured to report the multiple kinds of event information to the application layer if the touch coordinates are located outside the preset range.
- the event reporting module 440 includes a starting point obtaining unit 441 and a reporting execution unit 442.
- the starting point acquiring unit 441 is configured to determine the starting point coordinates of the touch starting point of the touch event according to the touch coordinates;
- the reporting execution unit 442 is configured to, when the starting point coordinates are within the preset range, report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer.
- the reporting execution unit 442 may be specifically configured to: when the starting point coordinates are within the preset range, determine the end point coordinates of the touch event according to the touch coordinates; acquire the distance between the starting point coordinates and the end point coordinates; and, if the distance is greater than a set threshold, report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer.
- the report execution unit 442 may be further configured to report the various event information to the application layer if the distance is less than or equal to the set threshold.
- the reporting execution unit 442 may also be specifically configured to: when all of the touch coordinates are within a preset range, combine the event information corresponding to the navigation gesture function in the multiple event information Report to the application layer.
- the device 400 for processing a touch event may further include: an event information acquisition module.
- the event information acquisition module may be used to request the application layer to acquire the event information corresponding to the navigation gesture function, and to store the event information corresponding to the navigation gesture function.
- the event information corresponding to the navigation gesture function is determined by the application layer according to the navigation gesture algorithm corresponding to the navigation gesture function.
- the device 400 for processing a touch event may further include an event information update module.
- the event information update module can be used to request the event information corresponding to the navigation gesture function from the application layer again, and store the event information corresponding to the navigation gesture, if multiple touch events on the touch screen are detected within a preset duration and the touch coordinates of the multiple monitored touch events are all within the preset range.
- the device 400 for processing a touch event may further include: a range update module.
- the range update module may be used to request the application layer to obtain the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and update the preset range according to the obtained touch coordinate range.
- the touch event processing device 400 may further include: a setting interface display module and a navigation gesture activation module.
- the setting interface display module is used to display a navigation setting interface before the operating system monitors a touch event on the touch screen when the navigation gesture function of the mobile terminal is turned on.
- the navigation setting interface includes a function option of the navigation gesture function; the navigation gesture enabling module is used to enable the navigation gesture function when an opening operation on the function option is detected.
- the touch event processing device 400 may further include: a selection interface display module and a navigation mode acquisition module.
- the selection interface display module is configured to display a selection interface of the navigation gesture after the navigation gesture function is enabled upon detecting the opening operation on the function option, the selection interface including selection options corresponding to multiple navigation modes.
- the navigation mode acquisition module is used to determine the selected navigation mode according to the operation on the selection options, where different navigation modes correspond to different navigation gestures.
- the touch event processing device 400 may further include: a navigation gesture acquisition module and a preset range determination module.
- the navigation gesture acquisition module is used to determine the navigation gesture corresponding to the navigation mode according to the navigation mode; the preset range determination module is used to acquire the touch coordinate range of the navigation gesture corresponding to the navigation mode, and use the touch coordinate range as the preset range.
- the event reporting module 440 may also be used to: when the navigation gesture function of the mobile terminal is turned off, if the operating system detects a touch event on the touch screen, obtain the corresponding touch event Multiple types of event information, and report the multiple types of event information to the application layer.
- the multiple kinds of event information include coordinate event information, a press event, a lift event, press pressure event information, an approximate event of the diameter of the area where the finger contacts the touch screen, and an approximate event of the finger diameter;
- the event information corresponding to the navigation gesture function includes coordinate event information, pressing event, and lifting event.
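Filtering the full event set down to the gesture-related subset listed above can be sketched as follows. The string labels for event types are illustrative stand-ins, not identifiers from the patent.

```python
# Sketch of selecting, from the multiple kinds of event information, only
# the types the navigation gesture function needs (coordinate, press, lift).
ALL_EVENT_TYPES = {
    "coordinate", "press", "lift",
    "pressure", "contact_diameter", "finger_diameter",
}
GESTURE_EVENT_TYPES = {"coordinate", "press", "lift"}


def filter_for_gesture(events):
    """events: list of (event_type, value) pairs for one touch event.
    Returns only the pairs the navigation gesture function requires."""
    return [(t, v) for t, v in events if t in GESTURE_EVENT_TYPES]
```

Reporting only this subset is what prevents the redundant event information (pressure, diameter approximations) from reaching the application layer during a navigation gesture.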
- the preset range includes a coordinate range corresponding to an edge area of the touch screen, and the edge area is an area whose distance from the edge of the touch screen is less than a set distance.
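The edge-area form of the preset range just described can be sketched as a distance-to-edge test; the set distance value is an assumption.

```python
# Minimal sketch: a point is in the edge area when its distance to the
# nearest edge of the touch screen is less than a set distance.
def in_edge_area(x, y, width, height, set_distance=80):
    """Return True if (x, y) lies within set_distance of any screen edge."""
    return min(x, y, width - x, height - y) < set_distance
```

A swipe starting 10 pixels from the left edge would fall in the edge area, while a tap at the screen center would not.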
- the coupling between the modules may be electrical, mechanical or other forms of coupling.
- each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
- the above-mentioned integrated modules can be implemented in the form of hardware or software function modules.
- when the navigation gesture function of the mobile terminal is turned on, the operating system monitors touch events on the touch screen. When a touch event on the touch screen is detected, the operating system acquires multiple kinds of event information corresponding to the touch event, including coordinate event information, and then determines the touch coordinates of the touch event according to the coordinate event information. When the touch coordinates are within the preset range, the event information corresponding to the navigation gesture function among the multiple kinds of event information is reported to the application layer. The preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function. This avoids the case where, with the navigation gesture function turned on, a touch event within the touch coordinate range of the navigation gesture triggers reporting of redundant event information to the application layer and causes the navigation gesture to fail.
- the mobile terminal 100 may be an electronic device capable of running application programs, such as a smart phone, a tablet computer, or an e-book.
- the mobile terminal 100 in this application may include one or more of the following components: a processor 110, a memory 120, a touch screen 130, and one or more application programs, of which one or more application programs may be stored in the memory 120 and configured To be executed by one or more processors 110, one or more programs are configured to execute the methods described in the foregoing method embodiments.
- the processor 110 may include one or more processing cores.
- the processor 110 uses various interfaces and lines to connect the various parts of the entire electronic device 100, and executes various functions of the electronic device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and by calling data stored in the memory 120. The processor 110 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA), and programmable logic array (PLA).
- the processor 110 may be integrated with one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like.
- the CPU mainly processes the operating system, user interface, and application programs; the GPU is used for rendering and drawing of display content; the modem is used for processing wireless communication. It can be understood that the above-mentioned modem may not be integrated into the processor 110, but may be implemented by a communication chip alone.
- the memory 120 may include random access memory (RAM) or read-only memory (ROM).
- the memory 120 may be used to store instructions, programs, codes, code sets or instruction sets.
- the memory 120 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing the operating system and instructions for implementing at least one function (such as touch function, sound playback function, image playback function, etc.) , Instructions used to implement the following various method embodiments, etc.
- the data storage area can also store data (such as phone book, audio and video data, chat record data) created by the terminal 100 during use.
- the touch screen 130 can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch screen 130 with a finger, a stylus, or any other suitable object or accessory), and drive the corresponding connection device according to a preset program.
- the touch screen 130 may include a touch detection device and a touch controller.
- the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller;
- the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 110; it can also receive and execute commands sent by the processor 110.
- the touch detection function of the touch screen 130 may be implemented using resistive, capacitive, infrared, or surface acoustic wave technology.
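The touch pipeline described above (detection device → touch controller → contact coordinates → processor) can be sketched as follows. This is a minimal illustration, not from the patent: the class names, the sensor-cell model, and the cell pitch are all assumptions.

```python
# Sketch of the touch pipeline: a touch detection device reports a raw
# signal, and a touch controller converts it into contact coordinates
# before handing it to the processor. All names here are illustrative.
from dataclasses import dataclass


@dataclass
class RawTouchSignal:
    # Raw sensor reading from the touch detection device (assumed model:
    # a grid of sensor cells, identified by row and column).
    sensor_row: int
    sensor_col: int


@dataclass
class Contact:
    # Contact coordinates in screen pixels, as sent to the processor.
    x: int
    y: int


class TouchController:
    """Converts raw detection signals into contact coordinates."""

    def __init__(self, cell_size_px: int = 10):
        # Assumed pitch of one sensor cell, in pixels.
        self.cell_size_px = cell_size_px

    def to_contact(self, signal: RawTouchSignal) -> Contact:
        # Map the sensor cell that fired to screen coordinates.
        return Contact(x=signal.sensor_col * self.cell_size_px,
                       y=signal.sensor_row * self.cell_size_px)


controller = TouchController()
contact = controller.to_contact(RawTouchSignal(sensor_row=3, sensor_col=7))
print(contact)  # Contact(x=70, y=30)
```

A real controller would also debounce, track multiple contacts, and report pressure, but the coordinate-conversion step is the part the description above emphasizes.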
- FIG. 12 shows a structural block diagram of a computer-readable storage medium provided by an embodiment of the present application.
- the computer-readable medium 800 stores program code that can be invoked by a processor to execute the methods described in the foregoing method embodiments.
- the computer-readable storage medium 800 may be an electronic memory such as a flash memory, an EEPROM (Electrically Erasable Programmable Read-Only Memory), an EPROM (Erasable Programmable Read-Only Memory), a hard disk, or a ROM.
- the computer-readable storage medium 800 includes a non-transitory computer-readable storage medium.
- the computer-readable storage medium 800 has storage space for program code 810 for executing any of the method steps in the methods described above. The program code can be read from or written into one or more computer program products.
- for example, the program code 810 may be compressed in a suitable form.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Claims (20)
- A touch event processing method, applied to an operating system of a mobile terminal, the mobile terminal comprising a touch screen, the method comprising: when a navigation gesture function of the mobile terminal is enabled, monitoring, by the operating system, touch events on the touch screen; when a touch event on the touch screen is detected, acquiring multiple types of event information corresponding to the touch event, the multiple types of event information including coordinate event information; determining touch coordinates of the touch event according to the coordinate event information; and when the touch coordinates are within a preset range, reporting event information corresponding to the navigation gesture function among the multiple types of event information to an application layer, wherein the preset range includes a touch coordinate range of a navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform a processing operation corresponding to the navigation gesture function.
- The method according to claim 1, further comprising: if the touch coordinates are within a range other than the preset range, reporting the multiple types of event information to the application layer.
- The method according to claim 1, wherein reporting the event information corresponding to the navigation gesture function among the multiple types of event information to the application layer when the touch coordinates are within the preset range comprises: determining, according to the touch coordinates, start-point coordinates of a touch start point of the touch event; and when the start-point coordinates are within the preset range, reporting the event information corresponding to the navigation gesture function among the multiple types of event information to the application layer.
- The method according to claim 3, wherein reporting the event information corresponding to the navigation gesture function among the multiple types of event information to the application layer when the start-point coordinates are within the preset range comprises: when the start-point coordinates are within the preset range, determining end-point coordinates of the touch event according to the touch coordinates; acquiring a distance between the start-point coordinates and the end-point coordinates; and if the distance is greater than a set threshold, reporting the event information corresponding to the navigation gesture function among the multiple types of event information to the application layer.
- The method according to claim 4, further comprising: if the distance is less than or equal to the set threshold, reporting the multiple types of event information to the application layer.
- The method according to claim 1, wherein reporting the event information corresponding to the navigation gesture function among the multiple types of event information to the application layer when the touch coordinates are within the preset range comprises: when all of the touch coordinates are within the preset range, reporting the event information corresponding to the navigation gesture function among the multiple types of event information to the application layer.
- The method according to any one of claims 1-6, further comprising: requesting the event information corresponding to the navigation gesture function from the application layer, and storing the event information corresponding to the navigation gesture function, wherein the event information corresponding to the navigation gesture function is determined by the application layer according to a navigation gesture algorithm corresponding to the navigation gesture function.
- The method according to claim 7, wherein, after reporting the event information corresponding to the navigation gesture function among the multiple types of event information to the application layer when the touch coordinates are within the preset range, the method further comprises: if touch events on the touch screen are detected multiple times within a preset duration and the touch coordinates of the touch events detected multiple times are within the preset range, requesting the event information corresponding to the navigation gesture function from the application layer again, and storing the event information corresponding to the navigation gesture.
- The method according to claim 8, further comprising: requesting the touch coordinate range of the navigation gesture corresponding to the navigation gesture function from the application layer, and updating the preset range according to the acquired touch coordinate range.
- The method according to any one of claims 1-9, wherein, before the operating system monitors touch events on the touch screen when the navigation gesture function of the mobile terminal is enabled, the method further comprises: displaying a navigation settings interface, the navigation settings interface including a function option for the navigation gesture function; and enabling the navigation gesture function when an enabling operation on the function option is detected.
- The method according to claim 10, wherein, after enabling the navigation gesture function when the enabling operation on the function option is detected, the method further comprises: displaying a navigation gesture selection interface, the selection interface including selection options corresponding to multiple navigation modes; and determining a selected navigation mode according to an operation on the selection options, wherein the navigation modes correspond to different navigation gestures.
- The method according to claim 11, further comprising: determining, according to the navigation mode, a navigation gesture corresponding to the navigation mode; and acquiring a touch coordinate range of the navigation gesture corresponding to the navigation mode, and using the touch coordinate range as the preset range.
- The method according to any one of claims 1-12, further comprising: when the navigation gesture function of the mobile terminal is disabled, if the operating system detects a touch event on the touch screen, acquiring the multiple types of event information corresponding to the touch event and reporting the multiple types of event information to the application layer.
- The method according to any one of claims 1-13, wherein the multiple types of event information include coordinate event information, press events, lift events, press pressure event information, approximate events of the diameter of the finger contact with the touch screen, and approximate events of the finger diameter; and the event information corresponding to the navigation gesture function includes coordinate event information, press events, and lift events.
- The method according to any one of claims 1-14, wherein the preset range includes a coordinate range corresponding to an edge region of the touch screen, the edge region being a region whose distance from an edge of the touch screen is less than a set distance.
- A touch event processing apparatus, applied to an operating system of a mobile terminal, the mobile terminal comprising a touch screen, the apparatus comprising a touch monitoring module, an event acquisition module, a coordinate acquisition module, and an event reporting module, wherein the touch monitoring module is configured to monitor, by the operating system, touch events on the touch screen when a navigation gesture function of the mobile terminal is enabled; the event acquisition module is configured to acquire, when a touch event on the touch screen is detected, multiple types of event information corresponding to the touch event, the multiple types of event information including coordinate event information; the coordinate acquisition module is configured to determine touch coordinates of the touch event according to the coordinate event information; and the event reporting module is configured to report, when the touch coordinates are within a preset range, event information corresponding to the navigation gesture function among the multiple types of event information to an application layer, wherein the preset range includes a touch coordinate range of a navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform a processing operation corresponding to the navigation gesture function.
- The apparatus according to claim 16, wherein the event reporting module is further configured to report the multiple types of event information to the application layer if the touch coordinates are within a range other than the preset range.
- The apparatus according to claim 16, wherein the event reporting module comprises a start-point acquisition unit and a reporting execution unit, wherein the start-point acquisition unit is configured to determine, according to the touch coordinates, start-point coordinates of a touch start point of the touch event; and the reporting execution unit is configured to report the event information corresponding to the navigation gesture function among the multiple types of event information to the application layer when the start-point coordinates are within the preset range.
- A mobile terminal, comprising: one or more processors; a memory; and one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are configured to perform the method according to any one of claims 1-15.
- A computer-readable storage medium, wherein program code is stored in the computer-readable storage medium, and the program code can be invoked by a processor to perform the method according to any one of claims 1-15.
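The event filtering described in claims 1-5 (report only gesture-relevant event types when a touch starts in the preset edge range and travels far enough; otherwise report everything) can be sketched as follows. This is an illustrative sketch only: the event-type names, edge width, distance threshold, and screen dimensions are all assumed values, not taken from the patent.

```python
# Illustrative sketch of the claimed filtering logic. All constants and
# names are assumptions for demonstration purposes.
import math

# Subset of event types the navigation gesture algorithm needs (cf. claim 14:
# coordinate event information, press events, lift events).
GESTURE_EVENT_TYPES = {"coordinate", "press", "lift"}
EDGE_WIDTH = 20           # assumed edge-region width in pixels (cf. claim 15)
DISTANCE_THRESHOLD = 50   # assumed minimum swipe length (cf. claim 4)
SCREEN_W, SCREEN_H = 1080, 2340  # assumed screen size in pixels


def in_preset_range(x: int, y: int) -> bool:
    """True if (x, y) lies in the edge region of the touch screen."""
    return (x < EDGE_WIDTH or x > SCREEN_W - EDGE_WIDTH or
            y < EDGE_WIDTH or y > SCREEN_H - EDGE_WIDTH)


def events_to_report(touch_coords, event_info):
    """Decide what to report to the application layer.

    touch_coords: list of (x, y) points from start to end of the touch.
    event_info: dict mapping event type -> payload for this touch event.
    """
    start, end = touch_coords[0], touch_coords[-1]
    distance = math.dist(start, end)
    if in_preset_range(*start) and distance > DISTANCE_THRESHOLD:
        # Touch started in the edge region and moved far enough: report
        # only the event types the navigation gesture algorithm needs.
        return {k: v for k, v in event_info.items()
                if k in GESTURE_EVENT_TYPES}
    # Otherwise (claims 2 and 5): report all event information.
    return dict(event_info)


# A swipe starting at the left edge and moving inward:
info = {"coordinate": [(5, 1200), (200, 1200)], "press": 1,
        "lift": 1, "pressure": 0.4}
reported = events_to_report([(5, 1200), (200, 1200)], info)
print(sorted(reported))  # ['coordinate', 'lift', 'press']
```

In the swipe above, the pressure event is filtered out before reaching the application layer; a short tap at the edge, or any touch starting mid-screen, would instead be reported in full.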
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2019/109993 WO2021068112A1 (zh) | 2019-10-08 | 2019-10-08 | Touch event processing method and apparatus, mobile terminal, and storage medium |
CN201980099360.2A CN114270298A (zh) | 2019-10-08 | 2019-10-08 | Touch event processing method and apparatus, mobile terminal, and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2019/109993 WO2021068112A1 (zh) | 2019-10-08 | 2019-10-08 | Touch event processing method and apparatus, mobile terminal, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021068112A1 true WO2021068112A1 (zh) | 2021-04-15 |
Family
ID=75437785
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/109993 WO2021068112A1 (zh) | 2019-10-08 | 2019-10-08 | 触摸事件的处理方法、装置、移动终端及存储介质 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN114270298A (zh) |
WO (1) | WO2021068112A1 (zh) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116991302B (zh) * | 2023-09-22 | 2024-03-19 | Honor Device Co., Ltd. | Method for running an application compatibly with a gesture navigation bar, graphical interface, and related apparatus |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102023735A (zh) * | 2009-09-21 | 2011-04-20 | Lenovo (Beijing) Co., Ltd. | Touch input device, electronic device, and mobile phone |
US20120206399A1 (en) * | 2011-02-10 | 2012-08-16 | Alcor Micro, Corp. | Method and System for Processing Signals of Touch Panel |
CN102819331A (zh) * | 2011-06-07 | 2012-12-12 | Lenovo (Beijing) Co., Ltd. | Mobile terminal and touch input method thereof |
CN103257820A (zh) * | 2012-02-20 | 2013-08-21 | Lenovo (Beijing) Co., Ltd. | Control method and electronic device |
CN105487705A (zh) * | 2015-11-20 | 2016-04-13 | Nubia Technology Co., Ltd. | Mobile terminal, input processing method, and user equipment |
CN109766043A (zh) * | 2018-12-29 | 2019-05-17 | Huawei Technologies Co., Ltd. | Operation method of electronic device and electronic device |
- 2019
- 2019-10-08 WO PCT/CN2019/109993 patent/WO2021068112A1/zh active Application Filing
- 2019-10-08 CN CN201980099360.2A patent/CN114270298A/zh active Pending
Also Published As
Publication number | Publication date |
---|---|
CN114270298A (zh) | 2022-04-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10908703B2 (en) | User terminal device and method for controlling the user terminal device thereof | |
CN110663018B (zh) | 多显示器设备中的应用启动 | |
WO2021035884A1 (zh) | Screen projection method and apparatus, terminal, and storage medium | |
EP3842905B1 (en) | Icon display method and apparatus, terminal and storage medium | |
KR102021048B1 (ko) | Method for controlling user input and electronic device thereof | |
US9400590B2 (en) | Method and electronic device for displaying a virtual button | |
KR101278346B1 (ko) | Event recognition | |
EP3435209B1 (en) | Method for recognizing a screen-off gesture, and storage medium and terminal thereof | |
WO2021092768A1 (zh) | Touch event processing method and apparatus, mobile terminal, and storage medium | |
TWI512601B (zh) | Electronic device, control method thereof, and computer program product | |
CN104007894A (zh) | 便携式设备及其多应用操作方法 | |
US9360989B2 (en) | Information processing device, and method for changing execution priority | |
US10466894B2 (en) | Method, device, storage medium and mobile terminal for recognizing an off-screen gesture | |
WO2019201140A1 (zh) | Application display method and apparatus, storage medium, and electronic device | |
US10019148B2 (en) | Method and apparatus for controlling virtual screen | |
US11681410B2 (en) | Icon management method and terminal device | |
WO2019047231A1 (zh) | Touch operation response method and apparatus | |
WO2019047226A1 (zh) | Touch operation response method and apparatus | |
WO2019047234A1 (zh) | Touch operation response method and apparatus | |
EP2490115A1 (en) | Electronic device, controlling method thereof and computer program product | |
WO2021068112A1 (zh) | Touch event processing method and apparatus, mobile terminal, and storage medium | |
CN107092433B (zh) | Touch control method and apparatus for a touch all-in-one machine | |
WO2019072169A1 (zh) | False-touch prevention detection method, apparatus, and terminal | |
US20150153925A1 (en) | Method for operating gestures and method for calling cursor | |
US9026691B2 (en) | Semi-autonomous touch I/O device controller operation under control of host |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19948535 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19948535 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 07.10.2022) |
|