WO2021068112A1 - Touch event processing method, apparatus, mobile terminal and storage medium - Google Patents

Touch event processing method, apparatus, mobile terminal and storage medium

Info

Publication number
WO2021068112A1
WO2021068112A1 · PCT/CN2019/109993 · CN2019109993W
Authority
WO
WIPO (PCT)
Prior art keywords
touch
event
event information
navigation gesture
navigation
Prior art date
Application number
PCT/CN2019/109993
Other languages
English (en)
French (fr)
Inventor
戴聪
Original Assignee
深圳市欢太科技有限公司
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市欢太科技有限公司 and Oppo广东移动通信有限公司
Priority to PCT/CN2019/109993 priority Critical patent/WO2021068112A1/zh
Priority to CN201980099360.2A priority patent/CN114270298A/zh
Publication of WO2021068112A1 publication Critical patent/WO2021068112A1/zh

Links

Images

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]

Definitions

  • This application relates to the technical field of mobile terminals, and more specifically, to a method, device, mobile terminal, and storage medium for processing touch events.
  • Mobile terminals, such as tablet computers and mobile phones, have become among the most commonly used consumer electronic products in people's daily lives. As the screens of mobile terminals have grown ever larger, the buttons on mobile terminals have gradually been eliminated. Since a mobile terminal then has no buttons for returning to the home page, going back, viewing tasks, and so on, the navigation gesture function was created.
  • Accordingly, this application proposes a touch event processing method, device, mobile terminal, and storage medium.
  • In a first aspect, an embodiment of the present application provides a touch event processing method applied to an operating system of a mobile terminal, the mobile terminal including a touch screen. The method includes: when the navigation gesture function of the mobile terminal is turned on, the operating system monitors touch events on the touch screen; when a touch event on the touch screen is detected, multiple types of event information corresponding to the touch event are acquired, the multiple types of event information including coordinate event information; the touch coordinates of the touch event are determined according to the coordinate event information; and when the touch coordinates are within a preset range, the event information corresponding to the navigation gesture function among the multiple types of event information is reported to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
  • In a second aspect, an embodiment of the present application provides a touch event processing device applied to an operating system of a mobile terminal, the mobile terminal including a touch screen. The device includes a touch monitoring module, an event acquisition module, a coordinate acquisition module, and an event reporting module. The touch monitoring module is used to monitor touch events on the touch screen when the navigation gesture function of the mobile terminal is turned on; the event acquisition module is used to acquire, when a touch event on the touch screen is detected, multiple types of event information corresponding to the touch event, the multiple types of event information including coordinate event information; the coordinate acquisition module is used to determine the touch coordinates of the touch event according to the coordinate event information; and the event reporting module is used to report, when the touch coordinates are within a preset range, the event information corresponding to the navigation gesture function among the multiple types of event information to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
  • In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; a memory; and one or more application programs, where the one or more application programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are configured to execute the touch event processing method provided in the first aspect above.
  • In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium. The computer-readable storage medium stores program code, and the program code can be invoked by a processor to execute the touch event processing method provided in the first aspect above.
  • In the solution provided by this application, when the navigation gesture function of the mobile terminal is turned on, the operating system monitors touch events on the touch screen, and when a touch event on the touch screen is detected, multiple types of event information corresponding to the touch event are acquired, the multiple types including coordinate event information. The touch coordinates of the touch event are then determined according to the coordinate event information. When the touch coordinates are within the preset range, the event information corresponding to the navigation gesture function among the multiple types of event information is reported to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function. This avoids the situation where, with the navigation gesture function turned on, a touch event within the touch coordinate range of the navigation gesture occurs and redundant event information is reported to the application layer, causing the navigation gesture to fail.
  • Fig. 1 shows a schematic diagram of an interface provided by an embodiment of the present application.
  • Fig. 2 shows a flowchart of a method for processing a touch event according to an embodiment of the present application.
  • Fig. 3 shows a flowchart of a method for processing a touch event according to another embodiment of the present application.
  • Fig. 4 shows a flowchart of a method for processing a touch event according to another embodiment of the present application.
  • Fig. 5 shows a flowchart of a method for processing a touch event according to another embodiment of the present application.
  • Fig. 6 shows a flowchart of a method for processing a touch event according to still another embodiment of the present application.
  • Fig. 7 shows a schematic diagram of an interface provided in yet another embodiment of the present application.
  • Fig. 8 shows a schematic diagram of another interface provided in yet another embodiment of the present application.
  • Fig. 9 shows a block diagram of a touch event processing device according to an embodiment of the present application.
  • Fig. 10 shows a block diagram of an event reporting module in a touch event processing apparatus according to an embodiment of the present application.
  • FIG. 11 is a block diagram of a mobile terminal for executing the touch event processing method according to an embodiment of the present application.
  • FIG. 12 is a storage unit for storing or carrying program code implementing the touch event processing method according to an embodiment of the present application.
  • In electronic devices such as mobile phones and tablet computers, the display screen is usually used to display content such as text, pictures, icons, or videos.
  • With the development of touch technology, more and more electronic devices are equipped with touch screens.
  • When a touch screen is provided, the device can respond to the user's touch operations, such as dragging, single-clicking, double-clicking, or sliding, when such operations are detected on the touch screen.
  • electronic equipment includes a front panel, a rear cover, and a frame.
  • the front panel includes the upper forehead area, the middle screen area and the lower button area.
  • the upper forehead area is provided with functional devices such as the earpiece sound hole and the front camera;
  • the middle screen area is provided with a touch display screen;
  • the lower button area is provided with one to three physical buttons.
  • The navigation gesture function realizes functions such as returning to the desktop, returning to the previous level, and viewing recent tasks through navigation gestures on the system interface or application interface, without relying on physical keys or virtual keys. For example, as shown in FIG. 1, the user can return to the desktop through an upward sliding gesture in the bottom area of the touch screen.
  • the specific navigation gestures may not be limited.
  • The navigation function is realized as follows: the operating system of the mobile terminal reports the event information of a touch event to the upper system (application layer), and the upper system determines, according to the event information of the touch event, whether the touch event is a navigation gesture, and then determines whether to trigger the navigation gesture function.
  • The inventor found that when the navigation gesture function of the mobile terminal is enabled and the system of the mobile terminal reports event information of a touch event to the application layer, redundant event information may be reported, which causes the navigation gesture to fail.
  • Therefore, the inventors proposed the touch event processing method, device, mobile terminal, and storage medium provided by the embodiments of the present application, which can avoid the situation where, when the navigation gesture function is turned on and a touch event occurs within the touch coordinate range of the navigation gesture, redundant event information is reported to the application layer, causing the navigation gesture to fail.
  • the specific touch event processing method will be described in detail in the subsequent embodiments.
  • FIG. 2 shows a schematic flowchart of a touch event processing method provided by an embodiment of the present application.
  • The touch event processing method is used to avoid the situation where, when the navigation gesture function is turned on and a touch event within the touch coordinate range of the navigation gesture occurs, redundant event information is reported to the application layer, causing the navigation gesture to fail.
  • the touch event processing method is applied to the touch event processing device 400 as shown in FIG. 9 and the mobile terminal 100 configured with the touch event processing device 400 (FIG. 11 ).
  • FIG. 11 shows a mobile terminal as an example to describe the specific process of this embodiment.
  • the mobile terminal applied in this embodiment may be a smart phone, a tablet computer, a smart watch, etc., which is not limited here.
  • the touch event processing method is applied to the operating system of the mobile terminal.
  • the mobile terminal includes a touch screen.
  • the process shown in FIG. 2 will be described in detail below, and the method for processing the touch event may specifically include the following steps:
  • Step S110 When the navigation gesture function of the mobile terminal is turned on, the operating system monitors a touch event to the touch screen.
  • the mobile terminal can monitor the navigation gesture function, and when it is monitored that the navigation gesture function is turned on, the touch event of the touch screen is processed in a corresponding manner.
  • The navigation gesture function is used to implement functions such as returning to the desktop, returning to the previous level, and viewing recent tasks based on the navigation gestures detected in the system interface or application interface, without relying on physical keys or virtual keys.
  • the navigation gestures are preset touch gestures used to trigger returning to the desktop, returning to the previous level, and viewing recent tasks.
  • the navigation gestures can be up-swiping gestures, left-swiping gestures, right-swiping gestures, etc., which are not limited here.
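  • For illustration only (the application does not fix a gesture-to-action mapping; the names and the example mapping below are assumptions, not part of the application), classifying a swipe by its dominant displacement might look like this:

```python
# Hypothetical sketch: classify a swipe gesture from its displacement.
# Screen coordinates are assumed to grow downward, so an upward swipe
# has a negative dy. All names here are illustrative.

def classify_swipe(dx, dy):
    """Return the swipe direction with the larger displacement component."""
    if abs(dy) > abs(dx):
        return "swipe_up" if dy < 0 else "swipe_down"
    return "swipe_left" if dx < 0 else "swipe_right"

# Assumed example mapping of gestures to navigation functions; the
# application leaves the specific gestures open.
NAV_ACTIONS = {
    "swipe_up": "return_to_desktop",
    "swipe_left": "return_previous_level",
    "swipe_right": "view_recent_tasks",
}
```

For example, a touch that moves 100 px straight up would be classified as `"swipe_up"` and looked up in `NAV_ACTIONS`.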
  • The mobile terminal may be provided with a switch for controlling the turning on and off of the navigation gesture function, such as a switch in the navigation function setting interface. Through this switch, the navigation gesture function can be turned on and off.
  • The mobile terminal can detect the state of the switch and determine, according to that state, whether the navigation gesture function is turned on. Specifically, when the switch is in the on state, the navigation gesture function is on; when the switch is in the off state, the navigation gesture function is off.
  • The mobile terminal may include a navigation gesture module, which is used to implement the navigation gesture function. The navigation gesture module may be a software module that determines, according to the event information of the touch event reported by the operating system, whether the touch event is a navigation gesture, and performs the corresponding navigation gesture function according to the navigation gesture.
  • The operating system can monitor whether the navigation gesture module is in the on state, so as to determine whether the navigation gesture function is on. Specifically, when the navigation gesture module is on, the navigation gesture function is on; when it is off, the navigation gesture function is off.
  • the Android system framework includes the kernel layer, the core class library layer, the framework layer and the application layer from bottom to top.
  • the kernel layer provides core system services, including security, memory management, process management, network protocol stacks, and hardware drivers.
  • The hardware drivers in the kernel layer are referred to as the driver layer, which includes the touch display driver, the camera driver, and so on.
  • The core class library layer includes the Android Runtime and Libraries. The Android runtime environment provides most of the functionality of the Java programming language core class library, including the core libraries (Core Libraries) and the Dalvik virtual machine (Dalvik VM).
  • Each Android application is an instance in the Dalvik virtual machine, running in its own process.
  • The class library is used by the various components of the Android system and includes the following functions: Media Framework, Surface Manager, SQLite (relational database engine), FreeType (bitmap and vector font rendering), and so on; its various functions are exposed to developers through the framework layer of the Android system.
  • The framework layer provides a series of class libraries needed to develop Android applications, so that developers can carry out rapid application development, components can be reused, and personalized extensions can be achieved through inheritance. The services provided include component management services, window management services, system data source components, the control framework, resource management services, installation package management services, and so on.
  • The application layer includes all kinds of applications that interact directly with users, or service programs written in Java that run in the background, including desktop applications, contact applications, call applications, camera applications, image browsers, games, maps, web browsers, and other applications developed by developers.
  • The navigation gesture module can be located in the application layer. After the bottom layer of the Android system (the touch screen system in the kernel layer) detects a touch event, it reports the event information of the touch event to the application layer. The navigation gesture module of the application layer can determine, based on the event information of the touch event, whether the touch event is a navigation gesture used to trigger the navigation gesture function, and, when it is, perform the display control corresponding to the navigation gesture function, such as returning to the desktop, returning to the previous interface, or viewing recent tasks.
  • In this way, the bottom layer of the operating system (that is, the touch screen system in the kernel layer) can monitor touch events on the touch screen.
  • Step S120 When a touch event to the touch screen is monitored, multiple types of event information corresponding to the touch event are acquired, and the multiple types of event information include coordinate event information.
  • Input events in the mobile terminal can be uniformly mounted in the input system of the kernel layer, and the input system can include the touch screen system and the like.
  • In the input system, input events are divided into different event types (type), event codes (code), and event attributes (value).
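  • This (type, code, value) description matches the Linux evdev record format. As a sketch only (the application gives no code; the struct layout assumed here is the common 64-bit Linux one), a raw record can be decoded like this:

```python
import struct
from collections import namedtuple

# Linux evdev record on a 64-bit system: struct timeval (two 8-byte longs),
# then __u16 type, __u16 code, __s32 value -> 24 bytes in total.
EVENT_FORMAT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)

InputEvent = namedtuple("InputEvent", "sec usec type code value")

# A few constants from linux/input-event-codes.h that plausibly carry the
# event information discussed above.
EV_KEY, EV_ABS = 0x01, 0x03
BTN_TOUCH = 0x14A              # press / lift
ABS_MT_POSITION_X = 0x35       # abscissa event information
ABS_MT_POSITION_Y = 0x36       # ordinate event information
ABS_MT_PRESSURE = 0x3A         # pressure of the hand pressing
ABS_MT_TOUCH_MAJOR = 0x30      # approximate diameter of the contact area
ABS_MT_WIDTH_MAJOR = 0x32      # approximate diameter of the finger

def decode_event(raw: bytes) -> InputEvent:
    """Unpack one raw evdev record into its (type, code, value) fields."""
    return InputEvent(*struct.unpack(EVENT_FORMAT, raw))
```

A record packed with `struct.pack(EVENT_FORMAT, 0, 0, EV_ABS, ABS_MT_POSITION_X, 512)` decodes back to an abscissa event with value 512.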
  • the operating system's processing principle for input events is: the application layer is at the upper layer of the system, and the application layer is mainly used to monitor, receive, and process event information from input events reported by the bottom layer of the system.
  • The event information of a monitored touch screen event may include a press event, a lift event, coordinate event information, pressure event information of the hand pressing, approximate event information of the diameter of the finger's contact with the touch screen, and approximate event information of the diameter of the finger. The coordinate event information can be divided into the event information of the abscissa and the event information of the ordinate.
  • The event information that must be reported can include the press event, the lift event, and the coordinate event information; optional event information can include the pressure event information of the hand pressing, the approximate event information of the diameter of the finger's contact with the touch screen, and the approximate event information of the diameter of the finger.
  • When the operating system detects a touch event on the touch screen, it can acquire the multiple types of event information corresponding to the touch event.
  • The multiple types of event information may include the press event, the lift event, the coordinate event information, the pressure event information of the hand pressing, the approximate event information of the diameter of the finger's contact with the touch screen, and the approximate event information of the diameter of the finger.
  • the event information corresponding to a specific touch event may not be limited.
  • Step S130 Determine the touch coordinates of the touch event according to the coordinate event information.
  • After the operating system obtains the multiple types of event information of the touch event, that is, after the underlying touch screen system obtains them, it can determine the touch coordinates of the touch event according to the coordinate event information among the multiple types of event information.
  • The coordinate information acquired by the touch screen system may include the abscissa and ordinate of one or more touch points corresponding to the touch event. It is understandable that when the touch event corresponds to a click operation, the coordinate information may include the abscissa and ordinate of the click position; when the touch event corresponds to a sliding operation, the coordinate information may include the abscissa and ordinate of the multiple touch points along the sliding track.
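  • As an illustrative sketch only (the grouped-report structure and names here are assumptions, not the application's format), recovering the touch points of a slide from the abscissa/ordinate event information might look like this:

```python
def extract_touch_points(reports):
    """Collect (x, y) touch points from grouped coordinate event information.

    `reports` is a list of dicts, one per sync report, each optionally
    carrying the abscissa ("x") and ordinate ("y") event information of a
    touch point. Illustrative structure only.
    """
    points = []
    x = y = None
    for report in reports:
        # A coordinate that does not change is not re-reported, so the
        # previous value persists until it is updated.
        x = report.get("x", x)
        y = report.get("y", y)
        if x is not None and y is not None:
            points.append((x, y))
    return points
```

For a slide, the resulting list traces the sliding track; for a click, it contains the single click position.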
  • Step S140: When the touch coordinates are within a preset range, report the event information corresponding to the navigation gesture function among the multiple types of event information to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
  • After acquiring the touch coordinates of the touch event, the touch screen system at the bottom of the operating system can determine whether the touch coordinates are within the preset range, so as to confirm whether the touch position of the touch event is within the touch range of the navigation gesture. If the touch coordinates are within the preset range, the touch position of the touch event is within the touch range of the navigation gesture, and the touch event may be an input event of the navigation gesture. Therefore, when reporting the event information of the touch event to the application layer, the operating system (that is, the underlying touch screen system) can report only the event information corresponding to the navigation gesture function.
  • In other words, when the navigation gesture function is turned on and the user triggers a navigation gesture, the touch screen system only needs to report the necessary input event information, that is, only the press event, the lift event, and the coordinate information, and should not report the redundant information, namely the pressure event information, the event information of the diameter of the finger's contact with the touch screen, and the event information of the diameter of the finger. Reporting redundant event information will cause the navigation gesture algorithm to misjudge and the navigation gesture to fail. Therefore, when the touch screen event may be a touch event of the navigation gesture, the press event, the lift event, and the coordinate information are reported to the application layer, instead of reporting more event information, so as to avoid navigation gesture failure.
  • the preset range may be the screen coordinate range corresponding to the navigation gesture, or it may be understood as the screen coordinate range that triggers the navigation gesture, and the specific preset range may not be limited.
  • The navigation gesture module in the application layer can determine, according to the reported event information, whether the touch event is an input event of the navigation gesture. If it is an input event of the navigation gesture, the control corresponding to the navigation gesture function can be performed, such as returning to the desktop, returning to the previous interface, or viewing recent tasks; if it is not, the control corresponding to the navigation gesture function is not performed.
  • If the touch coordinates are not within the preset range, the touch screen system can report all the event information of the touch event to the application layer, to ensure that the touch event input by the user can be recognized by the application layer and the intended input achieved.
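  • The reporting rule of Step S140 can be sketched as follows (illustrative only; the set of "essential" kinds and the rectangular range check are assumptions): when the touch coordinates fall within the preset range, only the press, lift, and coordinate event information is passed upward; otherwise all event information is reported.

```python
# Event-information kinds that must always be reported for a navigation
# gesture to be recognized (press, lift, coordinates); everything else is
# the "redundant" information whose reporting can confuse the gesture
# algorithm. Names are illustrative.
ESSENTIAL = {"press", "lift", "x", "y"}

def events_to_report(events, touch_xy, preset_range):
    """Filter the event information of one touch before reporting upward.

    `events` is a list of (kind, value) pairs; `preset_range` is
    ((x_min, x_max), (y_min, y_max)) covering the navigation-gesture area.
    """
    (x_min, x_max), (y_min, y_max) = preset_range
    x, y = touch_xy
    if x_min <= x <= x_max and y_min <= y <= y_max:
        # Possible navigation gesture: report only essential information.
        return [(kind, v) for kind, v in events if kind in ESSENTIAL]
    # Not a navigation-gesture candidate: report everything as usual.
    return events
```

With a bottom-of-screen preset range, a touch inside it has its pressure and contact-diameter information stripped before reporting, while a touch elsewhere is reported in full.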
  • In the touch event processing method, when the navigation gesture function of the mobile terminal is turned on, the operating system monitors touch events on the touch screen; when a touch event on the touch screen is detected, the multiple types of event information corresponding to the touch event are acquired, the multiple types including coordinate event information, and the touch coordinates of the touch event are then determined according to the coordinate event information. When the touch coordinates are within the preset range, the event information corresponding to the navigation gesture function among the multiple types of event information is reported to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function. This avoids the situation where the navigation gesture function is turned on, a touch event within the touch coordinate range of the navigation gesture occurs, and redundant event information is reported to the application layer, causing the navigation gesture to fail.
  • FIG. 3 shows a schematic flowchart of a touch event processing method provided by another embodiment of the present application. This method is applied to the operating system of the above-mentioned mobile terminal.
  • the mobile terminal includes a touch screen.
  • the process shown in FIG. 3 will be described in detail below.
  • the touch event processing method may specifically include the following steps:
  • Step S210 When the navigation gesture function of the mobile terminal is turned on, the operating system monitors the touch event on the touch screen.
  • Step S220 When a touch event to the touch screen is monitored, multiple types of event information corresponding to the touch event are acquired, and the multiple types of event information include coordinate event information.
  • Step S230 Determine the touch coordinates of the touch event according to the coordinate event information.
  • steps S210 to S230 may refer to the content of the foregoing embodiment, and details are not described herein again.
  • Step S240 Determine the starting point coordinates of the touch starting point of the touch event according to the touch coordinates.
  • In this embodiment, the operating system may determine the starting point coordinates of the touch start point of the touch event according to the touch coordinates.
  • the touch coordinates of the touch event may include the touch coordinates of the touch point during the entire touch process of the touch event.
  • the touch screen system at the bottom of the operating system can obtain the start point coordinates of the touch start point of the touch event according to the touch coordinates corresponding to the touch event.
  • the navigation gesture in the navigation gesture function is usually a sliding gesture that slides from the edge area of the touch screen to the center of the screen.
  • The touch start point of the navigation gesture is usually located in the edge area. Therefore, the coordinates of the touch start point can be determined, and according to the starting point coordinates it can be determined whether the touch event may be an input event of a navigation gesture.
  • Step S250: When the starting point coordinates are within the preset range, report the event information corresponding to the navigation gesture function among the multiple types of event information to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
  • After the operating system obtains the starting point coordinates of the touch start point of the touch event, it can determine whether the starting point coordinates are within the preset range. If they are, the touch event may be an input event of a navigation gesture; if they are not, the touch event cannot be an input event of a navigation gesture.
  • the preset range may be the touch coordinate range corresponding to the navigation gesture of the navigation gesture function.
  • the touch coordinate range refers to the coordinate range corresponding to the touch point corresponding to the navigation gesture.
  • The preset range can be the coordinate range corresponding to the edge area of the touch screen, where the edge area is the area whose distance from the edge of the touch screen is less than a set distance. In this way, whether the touch event may be an input event of a navigation gesture can be determined by judging whether the coordinates of its starting point are within the coordinate range corresponding to the edge area.
  • In addition, since the sliding length corresponding to a navigation gesture usually needs to be greater than a certain length, the distance between the touch start point and the touch end point can be determined, and whether the touch event may be an input event of a navigation gesture can be further determined according to that distance, so as to confirm this more accurately.
  • The touch event processing method may further include: when the starting point coordinates are within the preset range, determining the end point coordinates of the touch event according to the touch coordinates; acquiring the distance between the starting point coordinates and the end point coordinates; and, if the distance is greater than a set threshold, reporting the event information corresponding to the navigation gesture function among the multiple types of event information to the application layer.
  • That is, the end point coordinates of the touch end point of the touch event are obtained through the coordinate information. If the distance between the start point coordinates and the end point coordinates is greater than the set threshold, the sliding length of the touch event is greater than a certain length, and the touch event may be an input event of a navigation gesture; if the distance is less than or equal to the set threshold, the touch event cannot be an input event of a navigation gesture. The specific numerical value of the set threshold is not limited here.
  • the event information corresponding to the navigation gesture function among the multiple kinds of event information is reported to the application layer, so as to avoid reporting redundant event information to the application layer and causing navigation gestures to fail.
  • the touch event processing method may further include: if the distance is less than or equal to the set threshold, reporting all of the event information to the application layer. It is understandable that when the distance is less than or equal to the set threshold, the touch event cannot be an input event of a navigation gesture. Therefore, all acquired event information can be reported to the application layer, ensuring that the touch event input by the user can be recognized by the application layer and the intended input is realized.
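The edge-start and sliding-distance checks described above can be sketched as follows. The thresholds, screen dimensions, coordinate layout, and function names are illustrative assumptions for this sketch, not values taken from this application.

```python
import math

# Illustrative thresholds (assumptions, not from the application):
EDGE_DISTANCE = 40        # edge area: points closer than this to any screen edge
MIN_SLIDE_DISTANCE = 120  # a navigation gesture must slide at least this far

def in_edge_area(x, y, width, height, edge=EDGE_DISTANCE):
    """Return True if (x, y) lies in the edge area of a width x height screen."""
    return (x < edge or y < edge or
            width - x < edge or height - y < edge)

def may_be_navigation_gesture(start, end, width, height):
    """The start point must be in the edge area and the slide long enough."""
    sx, sy = start
    ex, ey = end
    if not in_edge_area(sx, sy, width, height):
        return False  # start point outside the preset range: not a navigation gesture
    distance = math.hypot(ex - sx, ey - sy)
    return distance > MIN_SLIDE_DISTANCE
```

For example, a long swipe starting at the bottom edge of a 1080x2400 screen would pass both checks, while a swipe starting mid-screen, or a very short swipe from the edge, would not.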
  • Step S260: If the touch coordinates are located in a range other than the preset range, report all of the event information to the application layer.
  • when reporting the event information of a touch event to the application layer, the underlying touch screen system of the operating system can report all event information of the touch event to the application layer, ensuring that the touch event input by the user can be recognized by the application layer and the intended input is realized.
  • the touch event processing method determines whether the touch event may be an input event of a navigation gesture according to whether the coordinates of the touch start point of the touch event are within the preset range. When the touch start point coordinates are within the preset range, the event information corresponding to the navigation gesture function among the various event information of the touch event is reported to the application layer, so as to avoid the case where, with the navigation gesture function turned on, a touch event within the touch coordinate range of the navigation gesture causes redundant event information to be reported to the application layer and the navigation gesture to fail.
  • FIG. 4 shows a schematic flowchart of a touch event processing method provided by another embodiment of the present application. This method is applied to the operating system of the above-mentioned mobile terminal.
  • the mobile terminal includes a touch screen.
  • the process shown in FIG. 4 will be described in detail below.
  • the touch event processing method may specifically include the following steps:
  • Step S310: When the navigation gesture function of the mobile terminal is turned on, the operating system monitors touch events on the touch screen.
  • Step S320: When a touch event on the touch screen is monitored, multiple kinds of event information corresponding to the touch event are acquired, the multiple kinds of event information including coordinate event information.
  • Step S330: Determine the touch coordinates of the touch event according to the coordinate event information.
  • for step S310 to step S330, reference may be made to the content of the foregoing embodiment, which will not be repeated here.
  • Step S340: When all of the touch coordinates are within a preset range, report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
  • Step S350: When some of the touch coordinates are not within the preset range, report all of the event information to the application layer.
  • after the operating system obtains the touch coordinates of the touch event, it can determine from them whether the touch coordinates of all touch points are within the preset range. It is understandable that the preset range is the touch coordinate range corresponding to the navigation gesture. If the coordinates of all touch points are within the preset range, the touch event is most likely an input event of the navigation gesture; if some of the touch point coordinates are not within the preset range, the touch event cannot be an input event of a navigation gesture.
  • the event information corresponding to the navigation gesture function in the multiple event information is reported to the application layer to avoid reporting redundant event information to the application layer and causing the navigation gesture to become invalid.
  • all kinds of event information are reported to the application layer to ensure the normal reporting of input events on the touch screen.
  • the touch event processing method determines whether the touch event may be an input event of a navigation gesture according to whether the touch coordinates of all touch points of the touch event are within the preset range. When they are, the event information corresponding to the navigation gesture function among the multiple kinds of event information of the touch event is reported to the application layer; when some touch coordinates are not within the preset range, all event information of the touch event is reported to the application layer. This avoids the case where, with the navigation gesture function turned on, a touch event within the touch coordinate range of the navigation gesture causes redundant event information to be reported to the application layer and the navigation gesture to fail.
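The S340/S350 decision above can be sketched minimally as follows, assuming a rectangular preset range and dictionary-shaped event records (both of which are illustrative assumptions, not structures defined in this application):

```python
# Event types the gesture module is assumed to need (illustrative names).
NAV_EVENT_TYPES = {"coordinate", "press", "lift"}

def point_in_range(point, preset):
    """preset is assumed to be a rectangle ((x0, y0), (x1, y1))."""
    (x0, y0), (x1, y1) = preset
    x, y = point
    return x0 <= x <= x1 and y0 <= y <= y1

def select_events_to_report(touch_points, events, preset):
    """Report only navigation-related events if ALL touch points are in range
    (step S340); otherwise report every event unchanged (step S350)."""
    if all(point_in_range(p, preset) for p in touch_points):
        return [e for e in events if e["type"] in NAV_EVENT_TYPES]
    return events
```

A single out-of-range touch point is enough to fall back to reporting everything, which matches the "cannot be a navigation gesture" reasoning above.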
  • FIG. 5 shows a schematic flowchart of a touch event processing method according to another embodiment of the present application. This method is applied to the operating system of the above-mentioned mobile terminal.
  • the mobile terminal includes a touch screen.
  • the process shown in FIG. 5 will be described in detail below.
  • the touch event processing method may specifically include the following steps:
  • Step S410: Request the event information corresponding to the navigation gesture function from the application layer, and store the event information corresponding to the navigation gesture function.
  • the event information corresponding to the navigation gesture function is determined by the application layer according to the navigation gesture algorithm corresponding to the navigation gesture function.
  • the event information corresponding to the navigation gesture function, that is, the event information of touch events that the bottom layer of the operating system needs to report to the application layer when the navigation gesture function is turned on, can be obtained by the kernel layer of the operating system from the navigation gesture module in the application layer.
  • the kernel layer can initiate a request for the event information corresponding to the navigation gesture function to the navigation gesture module in the application layer. After receiving the request, the navigation gesture module in the application layer can return the event information corresponding to the navigation gesture function to the kernel layer.
  • after the kernel layer obtains the event information corresponding to the navigation gesture function, it can store that event information in the framework layer, and when reporting the event information of a touch event, the kernel layer can obtain the event types that need to be reported from the framework layer.
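The request-and-store flow between the kernel layer, the framework layer, and the application layer's navigation gesture module might be sketched as below. All class and method names here are hypothetical stand-ins, not interfaces defined in this application.

```python
class NavigationGestureModule:
    """Stand-in for the application-layer navigation gesture module."""
    def required_event_types(self):
        # The gesture algorithm decides which event types it needs.
        return {"coordinate", "press", "lift"}

class FrameworkLayer:
    """Stand-in storage for the event types the kernel layer should report."""
    def __init__(self):
        self.nav_event_types = None

class KernelLayer:
    def __init__(self, framework, gesture_module):
        self.framework = framework
        self.gesture_module = gesture_module

    def refresh_nav_event_types(self):
        # Request the list from the application layer and store it (step S410).
        self.framework.nav_event_types = self.gesture_module.required_event_types()

    def events_to_report(self, events):
        # Filter a touch event's information against the stored types.
        return [e for e in events if e["type"] in self.framework.nav_event_types]
```

The point of the indirection is that the kernel layer never hard-codes the gesture algorithm's needs; it only consults what was stored at request time.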
  • Step S420: When the navigation gesture function of the mobile terminal is turned on, the operating system monitors touch events on the touch screen.
  • Step S430: When a touch event on the touch screen is monitored, acquire multiple kinds of event information corresponding to the touch event, the multiple kinds of event information including coordinate event information.
  • Step S440: Determine the touch coordinates of the touch event according to the coordinate event information.
  • Step S450: When the touch coordinates are within a preset range, report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
  • for steps S430 to S450, reference may be made to the content of the foregoing embodiment, which will not be repeated here.
  • Step S460: If multiple touch events on the touch screen are monitored within a preset time period, and the touch coordinates of the monitored touch events are all within the preset range, request the event information corresponding to the navigation gesture function from the application layer again, and store the event information corresponding to the navigation gesture.
  • after reporting the event information corresponding to the navigation gesture function to the application layer when it is determined that a touch event may be an input event of a navigation gesture, the bottom layer of the operating system can continue to monitor subsequent events. If multiple touch events on the touch screen are detected within the preset time period, and the touch coordinates of the monitored touch events are all within the preset range, the user may have made navigation gestures multiple times within the preset time period without any being recognized. Therefore, the bottom layer of the operating system can obtain the event information corresponding to the navigation gesture function from the application layer again to determine whether that event information needs to be updated, so as to avoid the case where the event information required by the navigation gesture function has changed and navigation gestures can no longer be recognized.
  • for example, if the event information previously required by the navigation gesture function included press events, lift events, and coordinate information, but the navigation gesture algorithm is updated to also require the approximate event of the finger diameter, then as long as the bottom layer of the operating system reports only press events, lift events, and coordinate information, the navigation gesture module in the application layer cannot recognize navigation gestures, and the navigation gesture function cannot be realized. Therefore, the bottom layer of the operating system can reacquire the event information corresponding to the navigation gesture function to prevent subsequent navigation gestures from going unrecognized.
  • Step S470: Request the touch coordinate range of the navigation gesture corresponding to the navigation gesture function from the application layer, and update the preset range according to the obtained touch coordinate range.
  • if multiple touch events on the touch screen are detected within the preset time period, and the touch coordinates of the monitored touch events are all within the preset range, the navigation gesture algorithm may have been updated, causing multiple navigation gestures to go unrecognized. Therefore, after the bottom layer of the operating system updates the event information corresponding to the navigation gesture function, it can also update the touch coordinate range corresponding to the navigation gesture, that is, update the preset range. The bottom layer of the operating system can obtain the touch coordinate range of the navigation gesture corresponding to the navigation gesture function from the navigation gesture module of the application layer and update the preset range, preventing a change in the touch coordinate range of the navigation gesture from causing subsequent navigation gestures made by the user to go unrecognized.
  • the bottom layer of the operating system obtains the event information corresponding to the navigation gesture function from the application layer and stores it in the driver layer; each time the touch event of a navigation gesture is reported, the reporting is based on the event information corresponding to the navigation gesture function stored in the driver layer. In addition, when multiple touch events are monitored and their touch coordinates are within the preset range, the event information corresponding to the navigation gesture function and the preset range are updated, so as to prevent subsequent navigation gestures input by the user from going unrecognized and affecting the user experience.
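One way to sketch the refresh heuristic of steps S460/S470 is a sliding time window over in-range touch events. The window length and repeat count below are illustrative assumptions; the application does not specify concrete values.

```python
# Illustrative parameters (assumptions, not from the application):
WINDOW_SECONDS = 5.0   # "preset time period"
REPEAT_THRESHOLD = 3   # how many in-range events count as "multiple"

class RefreshMonitor:
    """Decide when to re-request event info and the preset range."""
    def __init__(self):
        self.timestamps = []

    def record_in_range_event(self, now):
        """Record one in-range touch event; return True if a refresh is due."""
        # Keep only events inside the sliding window.
        self.timestamps = [t for t in self.timestamps if now - t <= WINDOW_SECONDS]
        self.timestamps.append(now)
        if len(self.timestamps) >= REPEAT_THRESHOLD:
            self.timestamps.clear()
            return True  # caller re-requests event info and the preset range
        return False
```

When `record_in_range_event` returns True, the caller would perform the S460/S470 requests to the application layer's navigation gesture module.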
  • FIG. 6 shows a schematic flowchart of a touch event processing method provided by yet another embodiment of the present application. This method is applied to the operating system of the above-mentioned mobile terminal.
  • the mobile terminal includes a touch screen.
  • the process shown in FIG. 6 will be described in detail below.
  • the touch event processing method may specifically include the following steps:
  • Step S501: Display a navigation setting interface, the navigation setting interface including a function option of the navigation gesture function.
  • the mobile terminal may provide navigation function settings.
  • the operating system of the mobile terminal can control the display screen to display the navigation setting interface, and the navigation setting interface includes functional options for the navigation gesture function.
  • the function option may be in the form of a switch control A1, and the user can control the opening and closing of the navigation gesture function by operating the switch control A1.
  • Step S502: When an opening operation on the function option is detected, enable the navigation gesture function.
  • when the operating system detects the opening operation on the function option, it means that the user needs to turn on the navigation gesture function, so the navigation gesture function can be controlled to turn on; for example, when the opening operation on the switch control is detected, the navigation gesture function can be controlled to turn on.
  • the bottom layer of the operating system may issue an instruction to the application layer, and the instruction is used to control the activation of the navigation gesture module corresponding to the navigation gesture function.
  • Step S503: Display a selection interface of the navigation gesture, where the selection interface includes selection options corresponding to multiple navigation modes.
  • the navigation gesture function may have multiple navigation modes, so a navigation gesture selection interface can be displayed. The selection interface includes selection options corresponding to the multiple navigation modes, which are used for the user to select a navigation mode.
  • for example, as shown in FIG. 8, the navigation modes include two-side return, simple gesture, right-side return, and left-side return, and the user can select a navigation mode by selecting its corresponding selection option.
  • in the two-side return mode, a slide-up operation from either of the two side areas at the bottom of the touch screen corresponds to returning to the previous level, a slide-up operation from the middle area at the bottom corresponds to returning to the desktop, and a slide-up-and-stay operation from the middle area at the bottom corresponds to viewing recent tasks. In the simple gesture mode, sliding up from the bottom area of the touch screen returns to the desktop, and sliding up and staying from the bottom area views recent tasks. In the right-side return mode, a slide-up operation from the right area at the bottom of the touch screen returns to the previous level, a slide-up operation from the middle area at the bottom returns to the desktop, and a slide-up operation from the left area at the bottom views recent tasks. In the left-side return mode, a slide-up operation from the right area at the bottom of the touch screen views recent tasks, a slide-up operation from the middle area at the bottom returns to the desktop, and a slide-up operation from the left area at the bottom returns to the previous level.
  • the above navigation modes are only examples and do not limit the navigation modes of the navigation gesture function in the embodiments of the present application.
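The four navigation modes above can be summarized as a lookup table. The region names, action labels, and table structure are assumptions made for this sketch, not identifiers from this application.

```python
# Each mode maps a bottom-of-screen swipe region (optionally with a
# "slide up and stay" hold) to the action it triggers.
NAVIGATION_MODES = {
    "two_side_return": {"left": "back", "right": "back",
                        "middle": "home", "middle_hold": "recent_tasks"},
    "simple_gesture":  {"middle": "home", "middle_hold": "recent_tasks"},
    "right_return":    {"right": "back", "middle": "home", "left": "recent_tasks"},
    "left_return":     {"left": "back", "middle": "home", "right": "recent_tasks"},
}

def action_for_swipe(mode, region, hold=False):
    """Look up the action for a swipe from a bottom region in a given mode."""
    key = region + "_hold" if hold else region
    return NAVIGATION_MODES[mode].get(key)
```

A table like this makes the symmetry between the modes explicit: right-side and left-side return are mirror images, and two-side return treats both side regions as "back".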
  • Step S504: Determine the selected navigation mode according to the operation on the selection option, where different navigation modes correspond to different navigation gestures.
  • Step S505: Determine the navigation gesture corresponding to the navigation mode according to the navigation mode.
  • Step S506: Obtain the touch coordinate range of the navigation gesture corresponding to the navigation mode, and use the touch coordinate range as the preset range.
  • the navigation gestures corresponding to different navigation modes may be different. Therefore, the navigation gesture corresponding to the selected navigation mode can be determined, and the touch coordinate range of that navigation gesture can be determined as the preset range.
  • Step S507: When the navigation gesture function of the mobile terminal is turned on, the operating system monitors touch events on the touch screen.
  • Step S508: When a touch event on the touch screen is monitored, multiple kinds of event information corresponding to the touch event are acquired, the multiple kinds of event information including coordinate event information.
  • Step S509: Determine the touch coordinates of the touch event according to the coordinate event information.
  • Step S510: When the touch coordinates are within a preset range, report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
  • for steps S507 to S510, reference may be made to the content of the foregoing embodiment, and details are not described herein again.
  • Step S511: When the navigation gesture function of the mobile terminal is turned off, if the operating system detects a touch event on the touch screen, acquire the multiple kinds of event information corresponding to the touch event and report all of the event information to the application layer.
  • the touch event processing method provided in the embodiments of the present application provides a way to set the navigation gesture function, which is convenient for the user to manage it. Whether the touch event may be an input event of a navigation gesture is determined according to whether the touch coordinates of the touch event are within the preset range; when they are, the event information corresponding to the navigation gesture function among the various event information of the touch event is reported to the application layer, so as to avoid the case where, with the navigation gesture function turned on, a touch event within the touch coordinate range of the navigation gesture causes redundant event information to be reported to the application layer and the navigation gesture to fail.
  • FIG. 9 shows a structural block diagram of a touch event processing apparatus 400 provided by an embodiment of the present application.
  • the touch event processing device 400 is applied to the aforementioned operating system of the mobile terminal.
  • the touch event processing device 400 includes a touch monitoring module 410, an event acquisition module 420, a coordinate acquisition module 430, and an event reporting module 440.
  • the touch monitoring module 410 is used to monitor touch events on the touch screen when the navigation gesture function of the mobile terminal is turned on; the event acquisition module 420 is used to acquire multiple kinds of event information corresponding to a touch event when a touch event on the touch screen is monitored, the multiple kinds of event information including coordinate event information; the coordinate acquisition module 430 is configured to determine the touch coordinates of the touch event according to the coordinate event information; and the event reporting module 440 is used to report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer when the touch coordinates are within a preset range, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform processing operations corresponding to the navigation gesture function.
  • the event reporting module 440 is further configured to report the various event information to the application layer if the touch coordinates are located in a range other than the preset range.
  • the event reporting module 440 includes a starting point obtaining unit 441 and a reporting execution unit 442.
  • the starting point acquiring unit 441 is configured to determine the starting point coordinates of the touch starting point of the touch event according to the touch coordinates; the reporting execution unit 442 is configured to report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer when the starting point coordinates are within the preset range.
  • the reporting execution unit 442 may be specifically configured to: when the starting point coordinates are within the preset range, determine the end point coordinates of the touch event according to the touch coordinates; obtain the distance between the starting point coordinates and the end point coordinates; and, if the distance is greater than a set threshold, report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer.
  • the reporting execution unit 442 may be further configured to report all of the event information to the application layer if the distance is less than or equal to the set threshold.
  • the reporting execution unit 442 may also be specifically configured to report the event information corresponding to the navigation gesture function among the multiple kinds of event information to the application layer when all of the touch coordinates are within the preset range.
  • the device 400 for processing a touch event may further include: an event information acquisition module.
  • the event information acquisition module may be used to request the event information corresponding to the navigation gesture function from the application layer and to store it, where the event information corresponding to the navigation gesture function is determined by the application layer according to the navigation gesture algorithm corresponding to the navigation gesture function.
  • the device 400 for processing a touch event may further include an event information update module.
  • the event information update module can be used to request the event information corresponding to the navigation gesture function from the application layer again, and store it, if multiple touch events on the touch screen are detected within a preset time period and the touch coordinates of the monitored touch events are all within the preset range.
  • the device 400 for processing a touch event may further include: a range update module.
  • the range update module may be used to request the application layer to obtain the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and update the preset range according to the obtained touch coordinate range.
  • the touch event processing device 400 may further include: a setting interface display module and a navigation gesture activation module.
  • the setting interface display module is used to display a navigation setting interface before the operating system monitors touch events on the touch screen with the navigation gesture function of the mobile terminal turned on, the navigation setting interface including a function option of the navigation gesture function; the navigation gesture enabling module is used to enable the navigation gesture function when an opening operation on the function option is detected.
  • the touch event processing device 400 may further include: a selection interface display module and a navigation mode acquisition module.
  • the selection interface display module is configured to display a selection interface of the navigation gesture after the navigation gesture function is enabled upon detection of the opening operation on the function option, the selection interface including selection options corresponding to multiple navigation modes; the navigation mode acquisition module is used to determine the selected navigation mode according to the operation on the selection option, where different navigation modes correspond to different navigation gestures.
  • the touch event processing device 400 may further include: a navigation gesture acquisition module and a preset range determination module.
  • the navigation gesture acquisition module is used to determine the navigation gesture corresponding to the navigation mode according to the navigation mode; the preset range determination module is used to obtain the touch coordinate range of the navigation gesture corresponding to the navigation mode and use that touch coordinate range as the preset range.
  • the event reporting module 440 may also be used to: when the navigation gesture function of the mobile terminal is turned off, if the operating system detects a touch event on the touch screen, acquire the multiple kinds of event information corresponding to the touch event and report all of it to the application layer.
  • the multiple kinds of event information include coordinate event information, a press event, a lift event, press pressure event information, an approximate event of the diameter of the finger contact with the touch screen, and an approximate event of the finger diameter;
  • the event information corresponding to the navigation gesture function includes coordinate event information, the press event, and the lift event.
  • the preset range includes a coordinate range corresponding to an edge area of the touch screen, and the edge area is an area whose distance from the edge of the touch screen is less than a set distance.
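For reference, the event categories listed above correspond naturally to Linux multi-touch input event codes. The mapping below is an assumption based on the standard Linux input subsystem and is not stated in this application.

```python
# Assumed correspondence between the patent's event categories and Linux
# multi-touch input codes (illustrative; not specified in the application).
EVENT_CODES = {
    "coordinate":       ("ABS_MT_POSITION_X", "ABS_MT_POSITION_Y"),
    "press/lift":       ("BTN_TOUCH",),          # value 1 = press, 0 = lift
    "pressure":         ("ABS_MT_PRESSURE",),
    "contact_diameter": ("ABS_MT_TOUCH_MAJOR",),
    "finger_diameter":  ("ABS_MT_WIDTH_MAJOR",),
}

# Subset the navigation gesture module needs, per the paragraph above.
NAV_GESTURE_EVENTS = {"coordinate", "press/lift"}

def codes_for_navigation():
    """Flatten the input codes the bottom layer must report for navigation gestures."""
    return sorted(c for name in NAV_GESTURE_EVENTS for c in EVENT_CODES[name])
```

Under this assumed mapping, pressure and finger-diameter codes are exactly the "redundant event information" the method avoids reporting when a navigation gesture is likely.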
  • the coupling between the modules may be electrical, mechanical or other forms of coupling.
  • each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or software function modules.
  • when the navigation gesture function of the mobile terminal is turned on, the operating system monitors touch events on the touch screen. When a touch event on the touch screen is detected, it obtains the multiple kinds of event information corresponding to the touch event, including coordinate event information, and then determines the touch coordinates of the touch event according to the coordinate event information. When the touch coordinates are within the preset range, the event information corresponding to the navigation gesture function among the multiple kinds of event information is reported to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform processing operations corresponding to the navigation gesture function. This avoids the case where, with the navigation gesture function turned on, a touch event within the touch coordinate range of the navigation gesture causes redundant event information to be reported to the application layer and the navigation gesture to fail.
  • the mobile terminal 100 may be an electronic device capable of running application programs, such as a smart phone, a tablet computer, or an e-book.
  • the mobile terminal 100 in this application may include one or more of the following components: a processor 110, a memory 120, a touch screen 130, and one or more application programs, of which one or more application programs may be stored in the memory 120 and configured To be executed by one or more processors 110, one or more programs are configured to execute the methods described in the foregoing method embodiments.
  • the processor 110 may include one or more processing cores.
  • the processor 110 uses various interfaces and lines to connect various parts of the entire electronic device 100, and executes various functions of the electronic device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and calling data stored in the memory 120. Optionally, the processor 110 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA), and programmable logic array (PLA).
  • the processor 110 may be integrated with one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like.
  • the CPU mainly processes the operating system, user interface, and application programs; the GPU is used for rendering and drawing of display content; the modem is used for processing wireless communication. It can be understood that the above-mentioned modem may not be integrated into the processor 110, but may be implemented by a communication chip alone.
  • the memory 120 may include random access memory (RAM) or read-only memory (ROM).
  • the memory 120 may be used to store instructions, programs, codes, code sets or instruction sets.
  • the memory 120 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing the operating system and instructions for implementing at least one function (such as touch function, sound playback function, image playback function, etc.) , Instructions used to implement the following various method embodiments, etc.
  • the data storage area may also store data created by the terminal 100 during use (such as the phone book, audio and video data, and chat records).
  • the touch screen 130 can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch screen 130 with a finger, stylus, or any other suitable object or accessory), and drive the corresponding connection device according to a preset program.
  • the touch screen 130 may include a touch detection device and a touch controller.
  • the touch detection device detects the position of the user's touch, detects the signal brought by the touch operation, and transmits the signal to the touch controller;
  • the touch controller receives the touch information from the touch detection device, converts the touch information into contact coordinates, and sends them to the processor 110; it can also receive and execute commands sent by the processor 110.
  • the touch detection function of the touch screen 130 may be implemented with resistive, capacitive, infrared, surface acoustic wave, and other types.
  • FIG. 12 shows a structural block diagram of a computer-readable storage medium provided by an embodiment of the present application.
  • the computer-readable medium 800 stores program code, and the program code can be invoked by a processor to execute the method described in the foregoing method embodiment.
  • the computer-readable storage medium 800 may be an electronic memory such as flash memory, EEPROM (electrically erasable programmable read-only memory), EPROM, a hard disk, or ROM.
  • the computer-readable storage medium 800 includes a non-transitory computer-readable storage medium.
  • the computer-readable storage medium 800 has storage space for program code 810 that performs any of the method steps in the above methods. This program code can be read from or written into one or more computer program products.
  • the program code 810 may, for example, be compressed in a suitable form.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

This application discloses a touch event processing method and apparatus, a mobile terminal, and a storage medium. The touch event processing method is applied to an operating system of a mobile terminal that includes a touch screen. The method includes: when the navigation gesture function of the mobile terminal is enabled, the operating system monitors touch events on the touch screen; when a touch event on the touch screen is detected, obtaining multiple kinds of event information corresponding to the touch event, the multiple kinds of event information including coordinate event information; determining the touch coordinates of the touch event according to the coordinate event information; and when the touch coordinates fall within a preset range, reporting, to the application layer, the event information corresponding to the navigation gesture function among the multiple kinds of event information, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function. This method can effectively prevent navigation gestures from failing.

Description

Touch event processing method, apparatus, mobile terminal, and storage medium

Technical Field
This application relates to the field of mobile terminal technology, and more specifically to a touch event processing method and apparatus, a mobile terminal, and a storage medium.
Background
Mobile terminals such as tablet computers and mobile phones have become one of the most commonly used consumer electronic products in daily life. As the screens of mobile terminals keep growing, the physical buttons on them have gradually been removed. Since mobile terminals no longer have buttons for returning to the home screen, going back, viewing tasks, and so on, the navigation gesture function emerged.
Summary
In view of the above problems, this application proposes a touch event processing method and apparatus, a mobile terminal, and a storage medium.
In a first aspect, an embodiment of this application provides a touch event processing method, applied to an operating system of a mobile terminal, the mobile terminal including a touch screen. The method includes: when the navigation gesture function of the mobile terminal is enabled, monitoring, by the operating system, touch events on the touch screen; when a touch event on the touch screen is detected, obtaining multiple kinds of event information corresponding to the touch event, the multiple kinds of event information including coordinate event information; determining the touch coordinates of the touch event according to the coordinate event information; and when the touch coordinates fall within a preset range, reporting, to the application layer, the event information corresponding to the navigation gesture function among the multiple kinds of event information, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
In a second aspect, an embodiment of this application provides a touch event processing apparatus, applied to an operating system of a mobile terminal, the mobile terminal including a touch screen. The apparatus includes a touch monitoring module, an event obtaining module, a coordinate obtaining module, and an event reporting module. The touch monitoring module is configured to have the operating system monitor touch events on the touch screen when the navigation gesture function of the mobile terminal is enabled; the event obtaining module is configured to obtain, when a touch event on the touch screen is detected, multiple kinds of event information corresponding to the touch event, the multiple kinds of event information including coordinate event information; the coordinate obtaining module is configured to determine the touch coordinates of the touch event according to the coordinate event information; and the event reporting module is configured to report, to the application layer when the touch coordinates fall within a preset range, the event information corresponding to the navigation gesture function among the multiple kinds of event information, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
In a third aspect, an embodiment of this application provides an electronic device, including: one or more processors; a memory; and one or more application programs, where the one or more application programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are configured to perform the touch event processing method provided in the first aspect.
In a fourth aspect, an embodiment of this application provides a computer-readable storage medium storing program code that can be invoked by a processor to perform the touch event processing method provided in the first aspect.
In the solution provided by this application, when the navigation gesture function of the mobile terminal is enabled, the operating system monitors touch events on the touch screen. When a touch event on the touch screen is detected, multiple kinds of event information corresponding to the touch event are obtained, including coordinate event information, and the touch coordinates of the touch event are then determined from the coordinate event information. When the touch coordinates fall within a preset range, the event information corresponding to the navigation gesture function among the multiple kinds of event information is reported to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the reported event information is used to instruct the application layer to perform the corresponding processing operation. This prevents the situation in which, with the navigation gesture function enabled, a touch event within the touch coordinate range of a navigation gesture causes redundant event information to be reported to the application layer and the navigation gesture to fail.
Brief Description of the Drawings
In order to explain the technical solutions in the embodiments of this application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of this application; for those skilled in the art, other drawings can be obtained from them without creative effort.
FIG. 1 shows a schematic diagram of an interface provided by an embodiment of this application.
FIG. 2 shows a flowchart of a touch event processing method according to an embodiment of this application.
FIG. 3 shows a flowchart of a touch event processing method according to another embodiment of this application.
FIG. 4 shows a flowchart of a touch event processing method according to yet another embodiment of this application.
FIG. 5 shows a flowchart of a touch event processing method according to a further embodiment of this application.
FIG. 6 shows a flowchart of a touch event processing method according to a still further embodiment of this application.
FIG. 7 shows a schematic diagram of an interface provided in a still further embodiment of this application.
FIG. 8 shows a schematic diagram of another interface provided in a still further embodiment of this application.
FIG. 9 shows a block diagram of a touch event processing apparatus according to an embodiment of this application.
FIG. 10 shows a block diagram of the event reporting module in a touch event processing apparatus according to an embodiment of this application.
FIG. 11 is a block diagram of a mobile terminal for performing the touch event processing method according to an embodiment of this application.
FIG. 12 is a storage unit, according to an embodiment of this application, for storing or carrying program code that implements the touch event processing method according to an embodiment of this application.
Detailed Description
To enable those skilled in the art to better understand the solutions of this application, the technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings.
In electronic devices such as mobile phones and tablet computers, the display screen is generally used to display text, pictures, icons, video, and other content. With the development of touch technology, more and more electronic devices are equipped with touch display screens; with a touch display screen, the device can respond to the user's touch operations, such as dragging, single-click, double-click, and sliding, detected on or near the screen.
As users demand ever higher clarity and fineness of displayed content, more electronic devices adopt larger touch display screens. However, in the process of fitting a larger touch display screen, it was found that functional components arranged at the front of the device, such as the front camera, proximity sensor, and earpiece, limit the area to which the touch display screen can be extended.
An electronic device generally includes a front panel, a rear cover, and a frame. The front panel includes a top region, a middle screen region, and a lower button region. Typically, the top region houses the earpiece sound outlet, the front camera, and other functional components; the middle screen region holds the touch display screen; and the lower button region carries one to three physical buttons.
As the screens of mobile terminals keep growing, the lower button region has gradually been eliminated. Since the mobile terminal no longer has buttons for returning to the home screen, going back, viewing tasks, and so on, the navigation gesture function emerged. The navigation gesture function uses navigation gestures on the system interface or an application interface to accomplish what physical or virtual buttons once did, such as returning to the desktop, going back one level, and viewing recent tasks. For example, as shown in FIG. 1, the user can return to the desktop with an upward-slide gesture in the bottom region of the touch screen; of course, the specific navigation gestures are not limited here.
The navigation function is implemented by having the operating system of the mobile terminal report the event information of touch events to the upper-layer system (the application layer); the upper-layer system determines, from the event information, whether the touch event is a navigation gesture, and hence whether to trigger the navigation gesture function.
Through long-term research, the inventor found that, with the navigation gesture function of the mobile terminal enabled, the system sometimes reports redundant event information when reporting the event information of touch events to the application layer, causing navigation gestures to fail.
In view of the above problems, the inventor proposes the touch event processing method, apparatus, mobile terminal, and storage medium provided by the embodiments of this application, which can prevent redundant event information from being reported to the application layer — and the navigation gesture from failing — when a touch event occurs within the touch coordinate range of a navigation gesture while the navigation gesture function is enabled. The specific touch event processing method is described in detail in the following embodiments.
Referring to FIG. 2, FIG. 2 shows a schematic flowchart of the touch event processing method provided by an embodiment of this application. The method is used to prevent navigation gestures from failing because redundant event information is reported to the application layer when, with the navigation gesture function enabled, a touch event occurs within the touch coordinate range of a navigation gesture. In specific embodiments, the method is applied to the touch event processing apparatus 400 shown in FIG. 9 and the mobile terminal 100 (FIG. 11) configured with the apparatus 400. The following takes a mobile terminal as an example to describe the specific flow of this embodiment; it can be understood that the mobile terminal may be a smartphone, tablet computer, smart watch, or the like, which is not limited here. Specifically, the method is applied to the operating system of the mobile terminal, and the mobile terminal includes a touch screen. The flow shown in FIG. 2 is elaborated below; the touch event processing method may specifically include the following steps:
Step S110: when the navigation gesture function of the mobile terminal is enabled, the operating system monitors touch events on the touch screen.
In this embodiment, the mobile terminal may monitor the navigation gesture function and, upon detecting that it is enabled, apply the corresponding processing to touch events on the touch screen. The navigation gesture function is used to realize, according to navigation gestures detected in the system interface or an application interface, functions otherwise accomplished with physical or virtual buttons, such as returning to the desktop, going back one level, and viewing recent tasks. A navigation gesture is a preset touch gesture used to trigger returning to the desktop, going back one level, or viewing recent tasks; it may be an upward slide, a leftward slide, a rightward slide, and so on, which is not limited here.
In some implementations, the mobile terminal may provide a switch for turning the navigation gesture function on and off, for example a switch in the navigation settings interface; operating this switch enables or disables the navigation gesture function. Accordingly, the mobile terminal can detect the state of the switch and determine from it whether the navigation gesture function is enabled. Specifically, when the switch is on, the navigation gesture function is enabled; when the switch is off, the function is disabled.
In other implementations, the mobile terminal may include a navigation gesture module for implementing the navigation gesture function. This module may be a software module that determines, from the event information of touch events reported by the operating system, whether a touch event is a navigation gesture and performs the corresponding navigation gesture function. Accordingly, the operating system can monitor whether the navigation gesture module is running to determine whether the navigation gesture function is enabled: when the module is running, the function is enabled; when it is not, the function is disabled.
Illustratively, taking a mobile terminal running the Android operating system as an example, the principle of the navigation gesture function is introduced. From bottom to top, the Android framework comprises the kernel layer, the core library layer, the framework layer, and the application layer. The kernel layer provides core system services, including security, memory management, process management, the network protocol stack, and hardware drivers. The hardware drivers in the kernel layer are referred to as the driver layer, which includes the touch display screen driver, the camera driver, and so on. The core library layer includes the Android Runtime and native libraries. The Android Runtime provides most of the functionality available in the core class libraries of the Java programming language, consisting of the core libraries and the Dalvik virtual machine; every Android application is an instance in the Dalvik virtual machine, running in its own process. The native libraries serve the various components of the Android system and include the Media Framework, Surface Manager, SQLite (a relational database engine), FreeType (bitmap and vector font rendering), and so on, each exposed to developers through the framework layer of the Android system. The framework layer provides the series of class libraries needed to develop Android applications, enabling rapid application development and convenient component reuse, and also allowing individualized extension through inheritance; its services include component management, window management, system data source components, the UI framework, resource management, and installation package management. The application layer contains applications that interact directly with the user, as well as background service programs written in Java, including the desktop, contacts, phone, camera, image browser, games, maps, and web browser applications, plus other applications developed by developers.
Further, the navigation gesture module may reside in the application layer. After the lower layers of the Android system (the touch screen subsystem in the kernel layer) detect a touch event, the event information of the touch event is reported to the application layer, whose navigation gesture module can determine, from that information, whether the touch event is a navigation gesture used to trigger the navigation gesture function; if so, it performs the display control corresponding to the navigation gesture function, such as returning to the desktop, returning to the previous interface, or viewing recent tasks.
In some implementations, when the operating system detects that the navigation gesture function is enabled, it can monitor touch events on the touch screen. Specifically, the lower layer of the operating system (the touch screen subsystem in the kernel layer) can perform this monitoring; for example, a touch event on the touch screen is deemed detected when a press-down event is detected.
Step S120: when a touch event on the touch screen is detected, obtain multiple kinds of event information corresponding to the touch event, the multiple kinds of event information including coordinate event information.
In this embodiment, input events in the mobile terminal (such as events from peripherals like the touch screen) may be uniformly attached to the input subsystem of the kernel layer, which may include the touch screen subsystem. The input subsystem distinguishes different event types (type), event codes (code), and event values (value). The operating system processes input events as follows: the application layer sits at the top of the system and is mainly responsible for listening for, receiving, and processing the event information of input events reported from the bottom of the system.
Illustratively, taking the touch screen subsystem as an example, the event information of the touch screen events it monitors may include press-down events, lift-up events, coordinate event information, finger-pressure event information, an approximate event for the diameter of the finger's contact with the touch screen, and an approximate event for the diameter of the finger; the coordinate event information can further be divided into x-coordinate and y-coordinate event information. When the touch screen subsystem reports the event information of a touch event, the information it must report includes the press-down event, the lift-up event, and the coordinate event information; the optional information includes the finger-pressure event information, the approximate contact-diameter event information, and the approximate finger-diameter event information.
In some implementations, when the operating system detects a touch event on the touch screen, it can obtain the multiple kinds of event information corresponding to the touch event, which may include press-down events, lift-up events, coordinate event information, finger-pressure event information, the approximate contact-diameter event information, and the approximate finger-diameter event information. Of course, the specific event information corresponding to a touch event is not limited here.
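The mandatory/optional split described above can be sketched in a few lines of Python. This is an illustrative model only — the event-code names in `REQUIRED_CODES`/`OPTIONAL_CODES` and the dictionary-based event representation are assumptions made for the sketch, not part of the patent or of any real kernel API:

```python
# Event codes a touch event may carry, modeled loosely on the kinds of
# event information listed above (coordinates, press/lift, pressure, widths).
REQUIRED_CODES = {"down", "up", "x", "y"}                    # must always be reported
OPTIONAL_CODES = {"pressure", "touch_major", "width_major"}  # may be dropped

def filter_for_navigation(event_info: dict) -> dict:
    """Keep only the event information the navigation gesture function needs."""
    return {code: v for code, v in event_info.items() if code in REQUIRED_CODES}

event = {"down": 1, "x": 540, "y": 2330, "pressure": 37, "touch_major": 9}
print(filter_for_navigation(event))  # only the down/x/y entries survive
```

In this model, the filtered dictionary is what would be handed up when the touch falls in the gesture zone, while the full `event` dictionary would be reported otherwise.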
Step S130: determine the touch coordinates of the touch event according to the coordinate event information.
In this embodiment, after the operating system obtains the multiple kinds of event information of the touch event — that is, after the underlying touch screen subsystem obtains them — it can determine the touch coordinates of the touch event from the coordinate event information among them.
In some implementations, the coordinate information obtained by the touch screen subsystem may include the x- and y-coordinates of one or more touch points of the touch event. It can be understood that when the touch event corresponds to a click operation, the coordinate information may include the x- and y-coordinates of the click position; when the touch event corresponds to a sliding operation, the coordinate information may include the x- and y-coordinates of multiple touch points along the sliding track.
Step S140: when the touch coordinates fall within a preset range, report, to the application layer, the event information corresponding to the navigation gesture function among the multiple kinds of event information, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
In this embodiment, after obtaining the touch coordinates of the touch event, the touch screen subsystem at the bottom of the operating system can judge whether the touch coordinates fall within the preset range, to confirm whether the touch position lies within the touch range of a navigation gesture. If the touch coordinates fall within the preset range, the touch position lies within the touch range of a navigation gesture and the touch event may be a navigation gesture input event; therefore, when reporting the event information of the touch event to the application layer, the operating system (i.e., the underlying touch screen subsystem) can report only the event information corresponding to the navigation gesture function. It can be understood that when the navigation gesture function is enabled and the user is triggering it, the touch screen subsystem only needs to report the mandatory input event information — the press-down event, the lift-up event, and the coordinate information — and must not report the redundant pressure event information, contact-diameter event information, or finger-diameter event information: reporting redundant event information can cause the navigation gesture algorithm to misjudge and the navigation gesture to fail. Therefore, when the touch event may be a navigation gesture touch event, the press-down event, lift-up event, and coordinate information are reported to the application layer while the redundant event information is not, preventing navigation gesture failure. The preset range may be the screen coordinate range corresponding to the navigation gesture, which can also be understood as the screen coordinate range that triggers the navigation gesture; the specific preset range is not limited here.
After the event information corresponding to the navigation gesture function is reported to the application layer, the navigation gesture module in the application layer can determine from the reported event information whether this touch event is a navigation gesture input event. If it is, the control corresponding to the navigation gesture function can be performed, such as returning to the desktop, returning to the previous interface, or viewing recent tasks; if it is not, no such control is performed.
In some implementations, if the touch coordinates do not fall within the preset range, the touch position lies outside the touch range of a navigation gesture and the touch event cannot be a navigation gesture input event. In that case, when reporting the event information of the touch event to the application layer, the operating system (i.e., the underlying touch screen subsystem) can report all of the event information, ensuring that the user's touch input can be recognized by the application layer and the intended input purpose achieved.
In the touch event processing method provided by this embodiment, when the navigation gesture function of the mobile terminal is enabled, the operating system monitors touch events on the touch screen; when one is detected, multiple kinds of event information corresponding to it, including coordinate event information, are obtained, and the touch coordinates are determined from the coordinate event information. When the touch coordinates fall within the preset range, the event information corresponding to the navigation gesture function among the multiple kinds is reported to the application layer, where the preset range includes the touch coordinate range of the corresponding navigation gesture and the reported event information instructs the application layer to perform the corresponding processing operation. This prevents a touch event within the navigation gesture's touch coordinate range from causing redundant event information to be reported to the application layer — and the navigation gesture to fail — while the function is enabled.
Referring to FIG. 3, FIG. 3 shows a schematic flowchart of the touch event processing method provided by another embodiment of this application. The method is applied to the operating system of the above mobile terminal, which includes a touch screen. The flow shown in FIG. 3 is elaborated below; the touch event processing method may specifically include the following steps:
Step S210: when the navigation gesture function of the mobile terminal is enabled, the operating system monitors touch events on the touch screen.
Step S220: when a touch event on the touch screen is detected, obtain multiple kinds of event information corresponding to the touch event, the multiple kinds of event information including coordinate event information.
Step S230: determine the touch coordinates of the touch event according to the coordinate event information.
In this embodiment, for steps S210 to S230, refer to the content of the foregoing embodiment, which is not repeated here.
Step S240: determine, according to the touch coordinates, the start-point coordinates of the touch start point of the touch event.
In this embodiment, after obtaining the touch coordinates of the touch event, the operating system can determine from them the start-point coordinates of the touch start point. The touch coordinates may include the coordinates of the touch points throughout the entire touch process, from which the underlying touch screen subsystem can obtain the start-point coordinates of the touch start point. It can be understood that the navigation gestures of the navigation gesture function are usually sliding gestures from the edge region of the touch screen toward its center, so the touch start point of a navigation gesture normally lies in the edge region; the coordinates of the touch start point can therefore be determined and used to decide whether the touch event may be a navigation gesture input event.
Step S250: when the start-point coordinates fall within the preset range, report, to the application layer, the event information corresponding to the navigation gesture function among the multiple kinds of event information, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
In this embodiment, after obtaining the start-point coordinates of the touch start point of the touch event, the operating system can judge whether the start-point coordinates fall within the preset range: if they do, the touch event may be a navigation gesture input event; if they do not, the touch event cannot be one.
In some implementations, the preset range may be the touch coordinate range corresponding to the navigation gesture of the navigation gesture function, i.e., the coordinate range of the touch points corresponding to the navigation gesture.
As a specific implementation, since whether the touch event may be a navigation gesture input event is determined from its start-point coordinates, and a navigation gesture is usually a sliding gesture from the edge of the touch screen toward its middle, the preset range may be the coordinate range corresponding to the edge region of the touch screen, where the edge region is the area whose distance to the edge of the touch screen is less than a set distance. Whether the touch event may be a navigation gesture input event can then be determined by checking whether the start-point coordinates of the touch event lie within the coordinate range corresponding to the edge region.
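A minimal sketch of this edge-region test, assuming pixel coordinates with the origin at the top-left corner; the function name, the margin value, and the screen dimensions are hypothetical choices made for the illustration:

```python
def in_edge_region(x: int, y: int, screen_w: int, screen_h: int, margin: int) -> bool:
    """True if (x, y) lies within `margin` pixels of any screen edge.

    Models the 'edge region' above: the preset range is the band of
    coordinates whose distance to the screen border is below a set distance.
    """
    return (x < margin or y < margin or
            x > screen_w - margin or y > screen_h - margin)

# A touch starting at the bottom edge of a 1080x2340 screen:
print(in_edge_region(540, 2335, 1080, 2340, 20))  # True
print(in_edge_region(540, 1200, 1080, 2340, 20))  # False
```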
In some implementations, besides starting at the edge of the touch screen, a navigation gesture usually also requires the corresponding sliding length to exceed a certain length. The distance between the touch start point and the touch end point can therefore be determined and used to further decide whether the touch event may be a navigation gesture input event, so as to confirm this precisely. Accordingly, the touch event processing method may further include: when the start-point coordinates fall within the preset range, determining the end-point coordinates of the touch event according to the touch coordinates; obtaining the distance between the start-point coordinates and the end-point coordinates; and, if the distance is greater than a set threshold, reporting, to the application layer, the event information corresponding to the navigation gesture function among the multiple kinds of event information.
It can be understood that the end-point coordinates of the touch end point of the touch event are obtained from the coordinate information, and the distance between the start-point and end-point coordinates is computed: if the distance is greater than the set threshold, the sliding length of the touch event exceeds the required length and the touch event may be a navigation gesture input event; if the distance is less than or equal to the set threshold, the touch event cannot be a navigation gesture input event. The specific value of the set threshold is not limited here. Therefore, when the distance is greater than the set threshold — indicating that the touch event may be a navigation gesture input event — the event information corresponding to the navigation gesture function among the multiple kinds of event information is reported to the application layer, avoiding the navigation gesture failure caused by reporting redundant event information to the application layer.
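The length test just described can be written as a short helper. The function name and the threshold value are illustrative assumptions, not values taken from the patent:

```python
import math

def is_candidate_swipe(start: tuple, end: tuple, min_len: float) -> bool:
    """True if the stroke from `start` to `end` (both (x, y) tuples) is long
    enough to be a navigation gesture candidate."""
    return math.hypot(end[0] - start[0], end[1] - start[1]) > min_len

print(is_candidate_swipe((540, 2335), (540, 1800), 300))  # True: 535 px up-swipe
print(is_candidate_swipe((540, 2335), (540, 2300), 300))  # False: 35 px, tap-like
```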
Further, the touch event processing method may also include: if the distance is less than or equal to the set threshold, reporting the multiple kinds of event information to the application layer. It can be understood that when the distance is less than or equal to the set threshold, the touch event cannot be a navigation gesture input event, so all the obtained event information can be reported to the application layer, ensuring that the user's touch input can be recognized by the application layer and the intended input purpose achieved.
Step S260: if the touch coordinates fall within a range other than the preset range, report the multiple kinds of event information to the application layer.
In this embodiment, if the touch coordinates do not fall within the preset range, the touch position of the touch event lies outside the touch range of a navigation gesture and the touch event cannot be a navigation gesture input event; when reporting the event information of the touch event to the application layer, the operating system (i.e., the underlying touch screen subsystem) can therefore report all of the event information, ensuring that the user's touch input can be recognized by the application layer and the intended input purpose achieved.
In the touch event processing method provided by this embodiment, whether the touch event may be a navigation gesture input event is determined from whether the coordinates of its touch start point fall within the preset range. When the start-point coordinates fall within the preset range, the event information corresponding to the navigation gesture function among the touch event's multiple kinds of event information is reported to the application layer, preventing redundant event information from being reported — and the navigation gesture from failing — when a touch event occurs within the navigation gesture's touch coordinate range while the function is enabled.
Referring to FIG. 4, FIG. 4 shows a schematic flowchart of the touch event processing method provided by yet another embodiment of this application. The method is applied to the operating system of the above mobile terminal, which includes a touch screen. The flow shown in FIG. 4 is elaborated below; the touch event processing method may specifically include the following steps:
Step S310: when the navigation gesture function of the mobile terminal is enabled, the operating system monitors touch events on the touch screen.
Step S320: when a touch event on the touch screen is detected, obtain multiple kinds of event information corresponding to the touch event, the multiple kinds of event information including coordinate event information.
Step S330: determine the touch coordinates of the touch event according to the coordinate event information.
In this embodiment, for steps S310 to S330, refer to the content of the foregoing embodiments, which is not repeated here.
Step S340: when all of the touch coordinates fall within the preset range, report, to the application layer, the event information corresponding to the navigation gesture function among the multiple kinds of event information, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
Step S350: when any of the touch coordinates falls outside the preset range, report the multiple kinds of event information to the application layer.
In this embodiment, after obtaining the touch coordinates of the touch event, the operating system can determine from them whether the touch coordinates of all touch points lie within the preset range. It can be understood that the preset range is the touch coordinate range corresponding to the navigation gesture: if the touch coordinates of all touch points lie within the preset range, the touch event is very likely a navigation gesture input event, whereas if any touch point's coordinates lie outside the preset range, the touch event cannot be one. Therefore, when the touch coordinates of all touch points fall within the preset range, the event information corresponding to the navigation gesture function among the multiple kinds of event information is reported to the application layer, avoiding the navigation gesture failure caused by reporting redundant event information to the application layer; when any touch point's coordinates fall outside the preset range, all of the event information is reported to the application layer, ensuring normal reporting of input events on the touch screen.
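The all-points check of this embodiment reduces to an `all(...)` over the stroke's touch points. The sketch below is a hypothetical illustration; the bottom-band predicate and the 300-pixel band height are assumptions chosen for the example:

```python
def all_points_in_range(points, in_range) -> bool:
    """True only if every touch point (x, y) of the stroke satisfies `in_range`."""
    return all(in_range(x, y) for x, y in points)

# Using a bottom-band preset range on a 2340 px tall screen:
bottom_band = lambda x, y: y > 2340 - 300

print(all_points_in_range([(540, 2335), (540, 2100)], bottom_band))  # True
print(all_points_in_range([(540, 2335), (540, 1500)], bottom_band))  # False
```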
In the touch event processing method provided by this embodiment, whether the touch event may be a navigation gesture input event is determined from whether the touch coordinates of all its touch points lie within the preset range. When they all do, the event information corresponding to the navigation gesture function among the touch event's multiple kinds of event information is reported to the application layer; when any of them lies outside the preset range, all of the multiple kinds of event information are reported to the application layer. This prevents redundant event information from being reported — and the navigation gesture from failing — when a touch event occurs within the navigation gesture's touch coordinate range while the function is enabled.
Referring to FIG. 5, FIG. 5 shows a schematic flowchart of the touch event processing method provided by a further embodiment of this application. The method is applied to the operating system of the above mobile terminal, which includes a touch screen. The flow shown in FIG. 5 is elaborated below; the touch event processing method may specifically include the following steps:
Step S410: request, from the application layer, the event information corresponding to the navigation gesture function, and store the event information corresponding to the navigation gesture function, the event information corresponding to the navigation gesture function being determined by the application layer according to the navigation gesture algorithm corresponding to the navigation gesture function.
In this embodiment, the event information corresponding to the navigation gesture function — that is, the event information of touch events that the lower layer of the operating system needs to report to the application layer when the function is enabled — can be obtained by the kernel layer of the operating system from the navigation gesture module in the application layer. Specifically, the kernel layer can send the navigation gesture module in the application layer a request for the event information corresponding to the navigation gesture function; upon receiving the request, the navigation gesture module returns the event information to the kernel layer, which can store it in the framework layer. When reporting the event information of a touch event, the kernel layer can obtain from the framework layer the event types that need to be reported.
Step S420: when the navigation gesture function of the mobile terminal is enabled, the operating system monitors touch events on the touch screen.
Step S430: when a touch event on the touch screen is detected, obtain multiple kinds of event information corresponding to the touch event, the multiple kinds of event information including coordinate event information.
Step S440: determine the touch coordinates of the touch event according to the coordinate event information.
Step S450: when the touch coordinates fall within a preset range, report, to the application layer, the event information corresponding to the navigation gesture function among the multiple kinds of event information, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
In this embodiment, for steps S430 to S450, refer to the content of the foregoing embodiments, which is not repeated here.
Step S460: if touch events on the touch screen are detected multiple times within a preset duration, and the touch coordinates of the touch events detected multiple times fall within the preset range, request, from the application layer again, the event information corresponding to the navigation gesture function, and store the event information corresponding to the navigation gesture.
In this embodiment, after the lower layer of the operating system determines that a touch event may be a navigation gesture input event and reports the event information corresponding to the navigation gesture function to the application layer, it can monitor subsequent events. If touch events on the touch screen are detected multiple times within the preset duration, and the touch coordinates of those events fall within the preset range, the user may have made navigation gestures multiple times within the preset duration without their being recognized. The lower layer of the operating system can therefore obtain the event information corresponding to the navigation gesture function from the application layer again, to determine whether that event information needs updating — preventing the case in which the event information required by the navigation gesture function has been updated and navigation gestures can no longer be recognized. For example, the event information previously required by the navigation gesture function included the press-down event, the lift-up event, and the coordinate information, but after an update to the navigation gesture algorithm it also includes the approximate finger-diameter event; if the lower layer reports only the press-down event, lift-up event, and coordinate information, the navigation gesture module in the application layer cannot recognize the navigation gesture and the function cannot be realized. The lower layer can therefore re-obtain the event information corresponding to the navigation gesture function, preventing subsequent navigation gestures from going unrecognized.
Step S470: request, from the application layer, the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and update the preset range according to the obtained touch coordinate range.
In this embodiment, if touch events on the touch screen are detected multiple times within the preset duration with touch coordinates inside the preset range, the navigation gesture algorithm may have been updated, causing multiple navigation gestures to go unrecognized. After updating the event information corresponding to the navigation gesture function, the lower layer of the operating system can therefore also update the touch coordinate range corresponding to the navigation gesture, i.e., update the preset range: the lower layer can obtain from the navigation gesture module in the application layer the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and update the preset range accordingly, preventing the user's subsequent navigation gestures from going unrecognized because the gesture's touch coordinate range has changed.
In the touch event processing method provided by this embodiment, the lower layer of the operating system obtains the event information corresponding to the navigation gesture function from the application layer and stores it in the driver layer; each time a navigation gesture touch event is reported, the reporting follows the event information corresponding to the navigation gesture function stored in the driver layer. Moreover, when touch events are detected multiple times with touch coordinates inside the preset range, the event information corresponding to the navigation gesture function is updated, and the preset range is updated, preventing the user's subsequently input navigation gestures from going unrecognized and harming the user experience.
Referring to FIG. 6, FIG. 6 shows a schematic flowchart of the touch event processing method provided by a still further embodiment of this application. The method is applied to the operating system of the above mobile terminal, which includes a touch screen. The flow shown in FIG. 6 is elaborated below; the touch event processing method may specifically include the following steps:
Step S501: display a navigation settings interface, the navigation settings interface including a function option for the navigation gesture function.
In some implementations, the mobile terminal can provide navigation function settings. The operating system of the mobile terminal can control the display screen to show a navigation settings interface containing a function option for the navigation gesture function. For example, as shown in FIG. 7, the function option may take the form of a switch control A1; by operating the switch control A1, the user can turn the navigation gesture function on and off.
Step S502: when an enabling operation on the function option is detected, enable the navigation gesture function.
In some implementations, when the operating system detects an enabling operation on the function option, the user wants to turn on the navigation gesture function, so the function can be enabled; for example, detecting an enabling operation on the switch control enables the navigation gesture function. Specifically, the lower layer of the operating system can issue an instruction to the application layer for starting the navigation gesture module corresponding to the navigation gesture function.
Step S503: display a navigation gesture selection interface, the selection interface including selection options corresponding to multiple navigation modes.
In some implementations, there may be multiple navigation modes for the navigation gesture function. When setting the navigation gesture function, a navigation gesture selection interface can be displayed, containing selection options corresponding to the multiple navigation modes, with which the user selects a navigation mode. For example, as shown in FIG. 8, the navigation modes include two-side back, simple gestures, right-side back, and left-side back, and the user selects a navigation mode by choosing the corresponding selection option. In the two-side back mode, an upward slide from either side region at the bottom of the touch screen goes back one level, an upward slide from the middle region at the bottom returns to the desktop, and an upward slide with a pause from the middle region at the bottom shows recent tasks. In the simple-gestures mode, an upward slide from the bottom region of the touch screen returns to the desktop, and an upward slide with a pause from the bottom region shows recent tasks. In the right-side back mode, an upward slide from the right part of the bottom region goes back one level, an upward slide from the middle part of the bottom region returns to the desktop, and an upward slide from the left part of the bottom region shows recent tasks. In the left-side back mode, an upward slide from the right part of the bottom region shows recent tasks, an upward slide from the middle part of the bottom region returns to the desktop, and an upward slide from the left part of the bottom region goes back one level. Of course, the above navigation modes are only examples and do not limit the navigation modes of the navigation gesture function in the embodiments of this application.
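The zone-to-action mappings of the modes above can be sketched as a small dispatch function. The mode names, the equal-thirds zone split, and the screen width are hypothetical choices made for the illustration (the patent does not specify zone boundaries); only the right-side back and left-side back modes are modeled:

```python
def action_for_up_swipe(start_x: int, screen_w: int, mode: str) -> str:
    """Map an up-swipe starting at x = `start_x` to an action, per navigation mode.

    Assumed zone split: left / middle / right thirds of the bottom edge.
    """
    third = screen_w / 3
    if mode == "right_back":   # right: back, middle: home, left: recent tasks
        if start_x > 2 * third:
            return "back"
        return "home" if start_x > third else "recents"
    if mode == "left_back":    # mirror image: left: back, middle: home, right: recents
        if start_x < third:
            return "back"
        return "home" if start_x < 2 * third else "recents"
    raise ValueError(f"unknown mode: {mode}")

print(action_for_up_swipe(1000, 1080, "right_back"))  # back
print(action_for_up_swipe(100, 1080, "left_back"))    # back
```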
Step S504: determine the selected navigation mode according to the operation on the selection options, different navigation modes corresponding to different navigation gestures.
Step S505: determine, according to the navigation mode, the navigation gesture corresponding to the navigation mode.
Step S506: obtain the touch coordinate range of the navigation gesture corresponding to the navigation mode, and use the touch coordinate range as the preset range.
It can be understood that different navigation modes may correspond to different navigation gestures; the navigation gesture corresponding to the selected navigation mode can therefore be determined, and the touch coordinate range of that navigation gesture determined from the gesture and used as the preset range.
Step S507: when the navigation gesture function of the mobile terminal is enabled, the operating system monitors touch events on the touch screen.
Step S508: when a touch event on the touch screen is detected, obtain multiple kinds of event information corresponding to the touch event, the multiple kinds of event information including coordinate event information.
Step S509: determine the touch coordinates of the touch event according to the coordinate event information.
Step S510: when the touch coordinates fall within a preset range, report, to the application layer, the event information corresponding to the navigation gesture function among the multiple kinds of event information, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
In this embodiment, for steps S507 to S510, refer to the content of the foregoing embodiments, which is not repeated here.
Step S511: when the navigation gesture function of the mobile terminal is disabled, if the operating system detects a touch event on the touch screen, obtain the multiple kinds of event information corresponding to the touch event, and report the multiple kinds of event information to the application layer.
It can be understood that when the navigation gesture function of the mobile terminal is disabled, all of the event information of touch events can be reported to the application layer; therefore, if the operating system detects a touch event on the touch screen, it can report all of the event information of the touch event to the application layer.
The touch event processing method provided by this embodiment provides a way of setting the navigation gesture function, making it convenient for the user to manage the function. Moreover, whether the touch event may be a navigation gesture input event is determined from whether its touch coordinates fall within the preset range; when the touch coordinates fall within the preset range, the event information corresponding to the navigation gesture function among the touch event's multiple kinds of event information is reported to the application layer, preventing redundant event information from being reported — and the navigation gesture from failing — when a touch event occurs within the navigation gesture's touch coordinate range while the function is enabled.
Referring to FIG. 9, it shows a structural block diagram of a touch event processing apparatus 400 provided by an embodiment of this application. The touch event processing apparatus 400 is applied to the operating system of the above mobile terminal and includes a touch monitoring module 410, an event obtaining module 420, a coordinate obtaining module 430, and an event reporting module 440. The touch monitoring module 410 is configured to have the operating system monitor touch events on the touch screen when the navigation gesture function of the mobile terminal is enabled; the event obtaining module 420 is configured to obtain, when a touch event on the touch screen is detected, multiple kinds of event information corresponding to the touch event, the multiple kinds of event information including coordinate event information; the coordinate obtaining module 430 is configured to determine the touch coordinates of the touch event according to the coordinate event information; and the event reporting module 440 is configured to report, to the application layer when the touch coordinates fall within a preset range, the event information corresponding to the navigation gesture function among the multiple kinds of event information, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform the processing operation corresponding to the navigation gesture function.
In some implementations, the event reporting module 440 is further configured to report the multiple kinds of event information to the application layer if the touch coordinates fall within a range other than the preset range.
In some implementations, referring to FIG. 10, the event reporting module 440 includes a start-point obtaining unit 441 and a reporting execution unit 442. The start-point obtaining unit 441 is configured to determine, according to the touch coordinates, the start-point coordinates of the touch start point of the touch event; the reporting execution unit 442 is configured to report, to the application layer when the start-point coordinates fall within the preset range, the event information corresponding to the navigation gesture function among the multiple kinds of event information.
In this implementation, the reporting execution unit 442 may specifically be configured to: when the start-point coordinates fall within the preset range, determine the end-point coordinates of the touch event according to the touch coordinates; obtain the distance between the start-point coordinates and the end-point coordinates; and, if the distance is greater than a set threshold, report, to the application layer, the event information corresponding to the navigation gesture function among the multiple kinds of event information.
Further, the reporting execution unit 442 may also be configured to report the multiple kinds of event information to the application layer if the distance is less than or equal to the set threshold.
In some implementations, the reporting execution unit 442 may also specifically be configured to report, to the application layer when all of the touch coordinates fall within the preset range, the event information corresponding to the navigation gesture function among the multiple kinds of event information.
In some implementations, the touch event processing apparatus 400 may further include an event information obtaining module, which may be configured to request, from the application layer, the event information corresponding to the navigation gesture function and store it, the event information corresponding to the navigation gesture function being determined by the application layer according to the navigation gesture algorithm corresponding to the navigation gesture function.
In this implementation, the touch event processing apparatus 400 may further include an event information updating module, which may be configured to request, from the application layer again, the event information corresponding to the navigation gesture function, and store the event information corresponding to the navigation gesture, if touch events on the touch screen are detected multiple times within a preset duration and the touch coordinates of the touch events detected multiple times fall within the preset range.
Further, the touch event processing apparatus 400 may further include a range updating module, which may be configured to request, from the application layer, the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and update the preset range according to the obtained touch coordinate range.
In some implementations, the touch event processing apparatus 400 may further include a settings interface display module and a navigation gesture enabling module. The settings interface display module is configured to display, before the operating system monitors touch events on the touch screen with the navigation gesture function of the mobile terminal enabled, a navigation settings interface containing a function option for the navigation gesture function; the navigation gesture enabling module is configured to enable the navigation gesture function when an enabling operation on the function option is detected.
In this implementation, the touch event processing apparatus 400 may further include a selection interface display module and a navigation mode obtaining module. The selection interface display module is configured to display, after the navigation gesture function is enabled upon detecting the enabling operation on the function option, a navigation gesture selection interface containing selection options corresponding to multiple navigation modes; the navigation mode obtaining module is configured to determine the selected navigation mode according to the operation on the selection options, different navigation modes corresponding to different navigation gestures.
Further, the touch event processing apparatus 400 may further include a navigation gesture obtaining module and a preset range determining module. The navigation gesture obtaining module is configured to determine, according to the navigation mode, the navigation gesture corresponding to the navigation mode; the preset range determining module is configured to obtain the touch coordinate range of the navigation gesture corresponding to the navigation mode and use the touch coordinate range as the preset range.
In some implementations, the event reporting module 440 may also be configured to: when the navigation gesture function of the mobile terminal is disabled, if the operating system detects a touch event on the touch screen, obtain the multiple kinds of event information corresponding to the touch event and report the multiple kinds of event information to the application layer.
In some implementations, the multiple kinds of event information include coordinate event information, a press-down event, a lift-up event, pressing-pressure event information, an approximate event of the diameter of the finger's contact with the touch screen, and an approximate event of the diameter of the finger; the event information corresponding to the navigation gesture function includes the coordinate event information, the press-down event, and the lift-up event.
In some implementations, the preset range includes the coordinate range corresponding to the edge region of the touch screen, the edge region being the area whose distance to the edge of the touch screen is less than a set distance.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the apparatus and modules described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
In the several embodiments provided in this application, the coupling between the modules may be electrical, mechanical, or of other forms.
In addition, the functional modules in the embodiments of this application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
To sum up, when the navigation gesture function of the mobile terminal is enabled, the operating system monitors touch events on the touch screen; when a touch event on the touch screen is detected, multiple kinds of event information corresponding to it, including coordinate event information, are obtained, and the touch coordinates of the touch event are then determined from the coordinate event information. When the touch coordinates fall within the preset range, the event information corresponding to the navigation gesture function among the multiple kinds of event information is reported to the application layer, where the preset range includes the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and the reported event information is used to instruct the application layer to perform the corresponding processing operation. This prevents a touch event within the navigation gesture's touch coordinate range from causing redundant event information to be reported to the application layer — and the navigation gesture to fail — while the function is enabled.
Referring to FIG. 11, it shows a structural block diagram of an electronic device provided by an embodiment of this application. The mobile terminal 100 may be a smartphone, tablet computer, e-book reader, or other electronic device capable of running application programs. The mobile terminal 100 in this application may include one or more of the following components: a processor 110, a memory 120, a touch screen 130, and one or more application programs, where the one or more application programs may be stored in the memory 120 and configured to be executed by the one or more processors 110, and the one or more programs are configured to perform the methods described in the foregoing method embodiments.
The processor 110 may include one or more processing cores. The processor 110 uses various interfaces and lines to connect the various parts of the entire electronic device 100, and performs the various functions of the electronic device 100 and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 120 and calling the data stored in the memory 120. Optionally, the processor 110 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA), or programmable logic array (PLA). The processor 110 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, and application programs; the GPU is responsible for rendering and drawing display content; the modem handles wireless communication. It can be understood that the modem may also not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
The memory 120 may include random access memory (RAM) or read-only memory (ROM). The memory 120 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 120 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing the operating system, instructions for implementing at least one function (such as the touch function, sound playback function, and image playback function), instructions for implementing the various method embodiments described below, and so on. The data storage area may also store data created by the terminal 100 during use (such as the phone book, audio and video data, and chat records).
The touch screen 130 can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch screen 130 with a finger, stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch screen 130 may include a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch detection function of the touch screen 130 may be implemented with resistive, capacitive, infrared, surface acoustic wave, and other types.
Referring to FIG. 12, it shows a structural block diagram of a computer-readable storage medium provided by an embodiment of this application. The computer-readable medium 800 stores program code that can be invoked by a processor to perform the methods described in the foregoing method embodiments.
The computer-readable storage medium 800 may be an electronic memory such as flash memory, EEPROM (electrically erasable programmable read-only memory), EPROM, a hard disk, or ROM. Optionally, the computer-readable storage medium 800 includes a non-transitory computer-readable storage medium. The computer-readable storage medium 800 has storage space for program code 810 that performs any of the method steps in the above methods. This program code can be read from or written into one or more computer program products. The program code 810 may, for example, be compressed in a suitable form.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions recorded in the foregoing embodiments, or replace some of their technical features with equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of this application.

Claims (20)

  1. A touch event processing method, applied to an operating system of a mobile terminal, the mobile terminal comprising a touch screen, the method comprising:
    when a navigation gesture function of the mobile terminal is enabled, monitoring, by the operating system, touch events on the touch screen;
    when a touch event on the touch screen is detected, obtaining multiple kinds of event information corresponding to the touch event, the multiple kinds of event information comprising coordinate event information;
    determining touch coordinates of the touch event according to the coordinate event information; and
    when the touch coordinates fall within a preset range, reporting, to an application layer, event information corresponding to the navigation gesture function among the multiple kinds of event information, wherein the preset range comprises a touch coordinate range of a navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform a processing operation corresponding to the navigation gesture function.
  2. The method according to claim 1, further comprising:
    if the touch coordinates fall within a range other than the preset range, reporting the multiple kinds of event information to the application layer.
  3. The method according to claim 1, wherein the reporting, to the application layer when the touch coordinates fall within the preset range, of the event information corresponding to the navigation gesture function among the multiple kinds of event information comprises:
    determining, according to the touch coordinates, start-point coordinates of a touch start point of the touch event; and
    when the start-point coordinates fall within the preset range, reporting, to the application layer, the event information corresponding to the navigation gesture function among the multiple kinds of event information.
  4. The method according to claim 3, wherein the reporting, to the application layer when the start-point coordinates fall within the preset range, of the event information corresponding to the navigation gesture function among the multiple kinds of event information comprises:
    when the start-point coordinates fall within the preset range, determining end-point coordinates of the touch event according to the touch coordinates;
    obtaining a distance between the start-point coordinates and the end-point coordinates; and
    if the distance is greater than a set threshold, reporting, to the application layer, the event information corresponding to the navigation gesture function among the multiple kinds of event information.
  5. The method according to claim 4, further comprising:
    if the distance is less than or equal to the set threshold, reporting the multiple kinds of event information to the application layer.
  6. The method according to claim 1, wherein the reporting, to the application layer when the touch coordinates fall within the preset range, of the event information corresponding to the navigation gesture function among the multiple kinds of event information comprises:
    when all of the touch coordinates fall within the preset range, reporting, to the application layer, the event information corresponding to the navigation gesture function among the multiple kinds of event information.
  7. The method according to any one of claims 1 to 6, further comprising:
    requesting, from the application layer, the event information corresponding to the navigation gesture function, and storing the event information corresponding to the navigation gesture function, wherein the event information corresponding to the navigation gesture function is determined by the application layer according to a navigation gesture algorithm corresponding to the navigation gesture function.
  8. The method according to claim 7, wherein, after the reporting, to the application layer when the touch coordinates fall within the preset range, of the event information corresponding to the navigation gesture function among the multiple kinds of event information, the method further comprises:
    if touch events on the touch screen are detected multiple times within a preset duration, and the touch coordinates of the touch events detected multiple times fall within the preset range, requesting, from the application layer again, the event information corresponding to the navigation gesture function, and storing the event information corresponding to the navigation gesture.
  9. The method according to claim 8, further comprising:
    requesting, from the application layer, the touch coordinate range of the navigation gesture corresponding to the navigation gesture function, and updating the preset range according to the obtained touch coordinate range.
  10. The method according to any one of claims 1 to 9, wherein, before the monitoring, by the operating system when the navigation gesture function of the mobile terminal is enabled, of touch events on the touch screen, the method further comprises:
    displaying a navigation settings interface, the navigation settings interface comprising a function option for the navigation gesture function; and
    when an enabling operation on the function option is detected, enabling the navigation gesture function.
  11. The method according to claim 10, wherein, after the enabling of the navigation gesture function when the enabling operation on the function option is detected, the method further comprises:
    displaying a navigation gesture selection interface, the selection interface comprising selection options corresponding to multiple navigation modes; and
    determining a selected navigation mode according to an operation on the selection options, different navigation modes corresponding to different navigation gestures.
  12. The method according to claim 11, further comprising:
    determining, according to the navigation mode, a navigation gesture corresponding to the navigation mode; and
    obtaining a touch coordinate range of the navigation gesture corresponding to the navigation mode, and using the touch coordinate range as the preset range.
  13. The method according to any one of claims 1 to 12, further comprising:
    when the navigation gesture function of the mobile terminal is disabled, if the operating system detects a touch event on the touch screen, obtaining the multiple kinds of event information corresponding to the touch event, and reporting the multiple kinds of event information to the application layer.
  14. The method according to any one of claims 1 to 13, wherein the multiple kinds of event information comprise coordinate event information, a press-down event, a lift-up event, pressing-pressure event information, an approximate event of a diameter of a finger's contact with the touch screen, and an approximate event of a diameter of the finger; and
    the event information corresponding to the navigation gesture function comprises the coordinate event information, the press-down event, and the lift-up event.
  15. The method according to any one of claims 1 to 14, wherein the preset range comprises a coordinate range corresponding to an edge region of the touch screen, the edge region being an area whose distance to an edge of the touch screen is less than a set distance.
  16. A touch event processing apparatus, applied to an operating system of a mobile terminal, the mobile terminal comprising a touch screen, the apparatus comprising a touch monitoring module, an event obtaining module, a coordinate obtaining module, and an event reporting module, wherein:
    the touch monitoring module is configured to have the operating system monitor touch events on the touch screen when a navigation gesture function of the mobile terminal is enabled;
    the event obtaining module is configured to obtain, when a touch event on the touch screen is detected, multiple kinds of event information corresponding to the touch event, the multiple kinds of event information comprising coordinate event information;
    the coordinate obtaining module is configured to determine touch coordinates of the touch event according to the coordinate event information; and
    the event reporting module is configured to report, to an application layer when the touch coordinates fall within a preset range, event information corresponding to the navigation gesture function among the multiple kinds of event information, wherein the preset range comprises a touch coordinate range of a navigation gesture corresponding to the navigation gesture function, and the event information corresponding to the navigation gesture function is used to instruct the application layer to perform a processing operation corresponding to the navigation gesture function.
  17. The apparatus according to claim 16, wherein the event reporting module is further configured to report the multiple kinds of event information to the application layer if the touch coordinates fall within a range other than the preset range.
  18. The apparatus according to claim 16, wherein the event reporting module comprises a start-point obtaining unit and a reporting execution unit, wherein:
    the start-point obtaining unit is configured to determine, according to the touch coordinates, start-point coordinates of a touch start point of the touch event; and
    the reporting execution unit is configured to report, to the application layer when the start-point coordinates fall within the preset range, the event information corresponding to the navigation gesture function among the multiple kinds of event information.
  19. A mobile terminal, comprising:
    one or more processors;
    a memory; and
    one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are configured to perform the method according to any one of claims 1 to 15.
  20. A computer-readable storage medium, wherein program code is stored in the computer-readable storage medium, and the program code can be invoked by a processor to perform the method according to any one of claims 1 to 15.
PCT/CN2019/109993 2019-10-08 2019-10-08 Touch event processing method and apparatus, mobile terminal, and storage medium WO2021068112A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2019/109993 WO2021068112A1 (zh) 2019-10-08 2019-10-08 Touch event processing method and apparatus, mobile terminal, and storage medium
CN201980099360.2A CN114270298A (zh) 2019-10-08 2019-10-08 Touch event processing method and apparatus, mobile terminal, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/109993 WO2021068112A1 (zh) 2019-10-08 2019-10-08 Touch event processing method and apparatus, mobile terminal, and storage medium

Publications (1)

Publication Number Publication Date
WO2021068112A1 true WO2021068112A1 (zh) 2021-04-15

Family

ID=75437785

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/109993 WO2021068112A1 (zh) 2019-10-08 2019-10-08 Touch event processing method and apparatus, mobile terminal, and storage medium

Country Status (2)

Country Link
CN (1) CN114270298A (zh)
WO (1) WO2021068112A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116991302B (zh) * 2023-09-22 2024-03-19 Honor Device Co., Ltd. Method for running an application compatibly with a gesture navigation bar, graphical interface, and related apparatus

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102023735A (zh) * 2009-09-21 2011-04-20 Lenovo (Beijing) Co., Ltd. Touch input device, electronic device, and mobile phone
US20120206399A1 (en) * 2011-02-10 2012-08-16 Alcor Micro, Corp. Method and System for Processing Signals of Touch Panel
CN102819331A (zh) * 2011-06-07 2012-12-12 Lenovo (Beijing) Co., Ltd. Mobile terminal and touch input method thereof
CN103257820A (zh) * 2012-02-20 2013-08-21 Lenovo (Beijing) Co., Ltd. Control method and electronic device
CN105487705A (zh) * 2015-11-20 2016-04-13 Nubia Technology Co., Ltd. Mobile terminal, input processing method, and user equipment
CN109766043A (zh) * 2018-12-29 2019-05-17 Huawei Technologies Co., Ltd. Operation method of electronic device and electronic device


Also Published As

Publication number Publication date
CN114270298A (zh) 2022-04-01

Similar Documents

Publication Publication Date Title
US10908703B2 (en) User terminal device and method for controlling the user terminal device thereof
CN110663018B (zh) Application launch in a multi-display device
WO2021035884A1 (zh) Screen projection method and apparatus, terminal, and storage medium
EP3842905B1 (en) Icon display method and apparatus, terminal and storage medium
KR102021048B1 (ko) Method for controlling user input and electronic device thereof
US9400590B2 (en) Method and electronic device for displaying a virtual button
KR101278346B1 (ko) Event recognition
EP3435209B1 (en) Method for recognizing a screen-off gesture, and storage medium and terminal thereof
WO2021092768A1 (zh) Touch event processing method and apparatus, mobile terminal, and storage medium
TWI512601B (zh) Electronic device, control method thereof, and computer program product
CN104007894A (zh) Portable device and multi-application operation method thereof
US9360989B2 (en) Information processing device, and method for changing execution priority
US10466894B2 (en) Method, device, storage medium and mobile terminal for recognizing an off-screen gesture
WO2019201140A1 (zh) Application display method and apparatus, storage medium, and electronic device
US10019148B2 (en) Method and apparatus for controlling virtual screen
US11681410B2 (en) Icon management method and terminal device
WO2019047231A1 (zh) Touch operation response method and apparatus
WO2019047226A1 (zh) Touch operation response method and apparatus
WO2019047234A1 (zh) Touch operation response method and apparatus
EP2490115A1 (en) Electronic device, controlling method thereof and computer program product
WO2021068112A1 (zh) Touch event processing method and apparatus, mobile terminal, and storage medium
CN107092433B (zh) Touch control method and apparatus for a touch-screen all-in-one machine
WO2019072169A1 (zh) False-touch prevention detection method, apparatus, and terminal
US20150153925A1 (en) Method for operating gestures and method for calling cursor
US9026691B2 (en) Semi-autonomous touch I/O device controller operation under control of host

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19948535

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19948535

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 07.10.2022)
