WO2020168786A1 - Touch operation response method and apparatus, storage medium, and terminal - Google Patents

Touch operation response method and apparatus, storage medium, and terminal

Info

Publication number
WO2020168786A1
Authority
WO
WIPO (PCT)
Prior art keywords
window
touch operation
touch
area
pen
Prior art date
Application number
PCT/CN2019/123366
Other languages
English (en)
French (fr)
Inventor
杨蒙
Original Assignee
广州视源电子科技股份有限公司
广州视臻信息科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广州视源电子科技股份有限公司 and 广州视臻信息科技有限公司
Publication of WO2020168786A1 publication Critical patent/WO2020168786A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • This application relates to the field of computer technology, for example, to a touch operation response method, device, storage medium, and terminal.
  • In many cases, the display interface of a smart display device needs to display content through multiple display windows at the same time. These display windows may overlap, that is, some display windows of the display interface may be covered by one or more other display windows.
  • The present application provides a touch operation response method, apparatus, storage medium, and terminal, which can solve the problem that the touch penetration function is complex to operate and difficult to implement. The technical solution is as follows:
  • This application provides a touch operation response method, including: displaying a first window and a second window on a display interface, the first window and the second window having an overlapping area; receiving a touch operation in the overlapping area; when the touch operation is of a first type, responding to the touch operation on the first window; and when the touch operation is of a second type, responding to the touch operation on the second window, the first type being different from the second type.
  • This application also provides a touch operation response device, including:
  • a window display module configured to display a first window and a second window on a display interface, the first window and the second window having an overlapping area
  • An operation receiving module configured to receive a touch operation in the overlapping area
  • the first response module is configured to respond to the touch operation on the first window when the touch operation is of the first type
  • the second response module is configured to respond to the touch operation on the second window when the touch operation is of the second type, and the first type is different from the second type.
  • the present application also provides a computer storage medium, wherein the computer storage medium stores a plurality of instructions, and the instructions are suitable for being loaded by a processor and executing the above method steps.
  • the present application also provides a terminal, including: a processor and a memory; wherein the memory stores a computer program, and the computer program is adapted to be loaded by the processor and execute the above method steps.
  • When the solution of the present application is executed, the terminal displays a first window and a second window having an overlapping area on the display interface and receives a touch operation in the overlapping area. When the touch operation is of the first type, the touch operation is responded to on the first window; when the touch operation is of the second type, the touch operation is responded to on the second window.
  • Windows at different layers of the overlapping windows can thus respond simply according to the operation type of the input touch operation. At the same time, there is no need to modify the operation distribution logic: matching the operation type of the touch operation is enough to make the lower window respond to the touch operation. This simplifies the implementation of the touch penetration function, reduces its implementation complexity, and allows one system to respond in windows of different layers under different operation types, improving user experience.
  • FIG. 1 is a schematic diagram of the display effect of a first window and a second window provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of an example of a first window and a second window provided by an embodiment of the present application
  • FIG. 3 is a schematic structural diagram of a system architecture provided by an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of a touch operation response method provided by an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of a touch operation response method provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an example of a first window provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of an example of a second window provided by an embodiment of the present application.
  • FIGS. 8a and 8b are schematic diagrams of effects of a multi-finger touch operation provided by an embodiment of the present application.
  • FIG. 9 is a schematic flowchart of a touch operation response method provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of an example of a first window provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of an example of a second window provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of the effect of a thick pen touch event provided by an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of a touch operation response device provided by an embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of a touch operation response device provided by an embodiment of the present application.
  • FIG. 15 is a schematic structural diagram of a terminal provided by an embodiment of the present application.
  • First window: the window displayed on the display interface by an application when the user opens that application through the terminal. The application can be any application in the set of existing applications. When multiple applications are open, each application corresponds to one window, and the window selected by the user among them is the first window.
  • For example, if the current display interface displays application A, application B, and application C, and the user selects application A, the display window corresponding to application A is the first window.
  • Optionally, the display window corresponding to the application whose opening time is closest to the current time may also be determined as the first window; that is, when multiple display windows are open, the display window opened last is the first window.
  • A window itself does not have any visible content; it provides a basic container for an application's views. A view is the content filling that container, for example an image, text, a shape, or a combination thereof. In the embodiments of this application, a window can be understood as a display window containing a view.
  • A view for receiving finger touch operations can be added to the second window. The finger touch operation can be a single-finger touch operation or a multi-finger touch operation.
  • Second window: the window created for the first window, which is overlaid on the first window.
  • That the second window covers the first window can be understood as the first window being opened first and the second window being opened later. The two windows can overlap completely or partially. There are many ways to display the two windows on the display interface; one feasible way is shown in FIG. 1.
  • The second window may be created actively by the user; alternatively, the user may actively open the application corresponding to the second window; alternatively, the opening of the second window may be triggered while the first window is in use (for example, PowerPoint (PPT) automatically starts the annotation application during playback, thereby opening a transparent window that covers the PPT window); or a second window may be popped up by a trigger (such as inserting a USB drive, or pressing a shortcut key).
  • The annotation application is an instant writing program intended to supplement and extend certain information; it runs as a background service. It provides functions such as pen, eraser, share, close, and page up/down. For a PPT, page up/down and close playback are functions that control the PPT, while the other functions relate to writing, for example, pen: writing; eraser: erasing; share: saving the content locally, and so on; close: closing the annotation. The corresponding display interface is shown in FIG. 2: after the annotation application is opened, the display interface shows "You are already in annotation mode", and the corresponding function icons float over the PPT.
  • Finger touch operation: an operation generated by the user touching the terminal's display screen with a finger.
  • The display screen has a touch sensing function and is also called a "touch screen" or "touch panel"; it is an inductive liquid crystal display device that can receive input signals from contacts. When a graphic button on the screen is touched, the tactile feedback system on the screen can drive various connected devices according to pre-programmed routines; it can replace a mechanical button panel and produce vivid audio-visual effects through the liquid crystal display.
  • By technical principle, touch screens can be divided into five basic categories: vector pressure sensing, resistive, capacitive, infrared, and surface acoustic wave touch screens.
  • According to the working principle of the touch screen and the medium through which information is transmitted, touch screens are divided into four types: resistive, capacitive, infrared, and surface acoustic wave.
  • It should be noted that when the first window and the second window partially overlap, the finger touch operation is input in the overlapping area of the two windows.
  • First quantity range: any set range, with values between 0 and a preset quantity threshold, or between the preset quantity threshold and 10.
  • Second quantity range: any range different from the first quantity range, likewise with values between 0 and the preset quantity threshold, or between the preset quantity threshold and 10. The first quantity range may be greater than or smaller than the second quantity range.
  • For example, when the preset quantity threshold is 1, a first quantity range of 0 to 1 (that is, 1) indicates a single-finger touch, and a second quantity range of 2 to 10 indicates a multi-finger touch.
  • As another example, when the preset quantity threshold is 2, a first quantity range of 0 to 2 indicates a single-finger or two-finger touch, and a second quantity range of 3 to 10 indicates a touch with three or more fingers.
  • In the embodiments of this application, the description takes the preset quantity threshold as 1, the first quantity range as 0 to 1, and the second quantity as greater than 1, indicating a single-finger touch and a multi-finger touch respectively.
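  • As a minimal illustrative sketch of such a range check (the threshold value of 1 below simply reproduces the single-finger vs. multi-finger example above; the class, method, and constant names are assumptions, not taken from the application):

```java
// Illustrative helper: classify a touch by its number of touch points.
// PRESET_COUNT_THRESHOLD = 1 mirrors the single-finger / multi-finger example above;
// it is an assumed constant, not defined by the application.
public final class TouchTypeClassifier {
    private static final int PRESET_COUNT_THRESHOLD = 1;

    public enum Type { FIRST_TYPE, SECOND_TYPE }

    public static Type classifyByPointerCount(int pointerCount) {
        // First quantity range: 0..threshold  -> respond on the first (lower) window.
        // Second quantity range: > threshold  -> respond on the second (upper) window.
        return (pointerCount <= PRESET_COUNT_THRESHOLD) ? Type.FIRST_TYPE : Type.SECOND_TYPE;
    }
}
```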
  • Pen touch operation: an operation generated by the user touching the terminal screen with a pen tip. It should be noted that when the first window and the second window partially overlap, the pen touch operation is input in the overlapping area of the two windows.
  • First area range: any set value range that is less than or equal to a preset area threshold. The preset area threshold is an empirical value based on statistics of pen-tip contact areas.
  • Second area range: any range different from the first area range and greater than the preset area threshold.
  • Of course, the first area range may instead be greater than the preset area threshold and the second area range less than or equal to it. For example, if the contact area falls in a first range below the preset area threshold, a fine-pen touch is determined; if it falls in a second range above the preset area threshold, a thick-pen touch is determined.
  • In the embodiments of this application, the description assumes that the first area range is less than or equal to the preset area threshold and the second area range is greater than the preset area threshold.
  • Attribute identifier: an identifier used to uniquely identify a window, which can be a TAG, an Identification (ID), or the like.
  • In the embodiments of this application, an attribute identifier is added to the second window; it is used to intercept a touch operation distributed to the lower-layer first window and return it, based on the attribute identifier, to the second window, thereby triggering the upper-layer second window to respond to the touch operation.
  • FIG. 3 is a schematic diagram of a system architecture provided by an embodiment of this application.
  • the system architecture may include a terminal 1000, a first window 2000 and a second window 3000.
  • the first window 2000 and the second window 3000 are overlapped and displayed on the current display interface of the terminal 1000, and the second window 3000 covers On the first window 2000, the overlapping area of the two windows is 4000.
  • The second window 3000 covers the first window 2000 of the display interface; the first window 2000 is any display window selected on the display interface, and the second window 3000 may be a display window corresponding to any application other than the first application corresponding to the first window.
  • the terminal 1000 may include a tablet computer, a personal computer (PC), a smart phone, a palmtop computer, and a mobile Internet device (MID) and other terminal devices with data computing and processing functions.
  • the terminal 1000 receives a touch operation in the overlap area 4000;
  • the touch operation may include a finger touch operation, a pen touch operation, and the like.
  • the terminal 1000 before receiving a touch operation in the overlap area 4000, the terminal 1000 obtains a window creation instruction input for the first window 2000 on the display interface, and creates a second window based on the window creation instruction.
  • the terminal 1000 before receiving the touch operation in the overlapping area, the terminal 1000 obtains a view creation instruction input for the overlapping area 4000 on the display interface, and creates an operation receiving view in the overlapping area 4000;
  • the terminal 1000 receives a touch operation input on the operation receiving view.
  • the terminal 1000 before receiving the touch operation in the overlapping area, obtains a window creation instruction input for the first window 2000 selected on the display interface, and creates the second window 3000 based on the window creation instruction.
  • the window creation instruction may be input by the user for the first window 2000, indicating that the created second window 3000 is overlaid on the first window 2000.
  • The user may input the window creation instruction by performing a touch operation on the first window 2000 (for example, right-clicking "Create Window" on the first window 2000, or clicking a function key on the first window 2000 to start the second application corresponding to the second window 3000), or by a voice signal "create a second window 3000 for the first window 2000" received through the voice receiving device of the terminal 1000.
  • the second window 3000 can also be opened actively, such as triggering the opening of the second window 3000 when the first window 2000 is in use.
  • the terminal 1000 receives a finger touch operation in the overlap area, acquires the number of touch points of the finger touch operation, and when the number of touch points meets a first number range, distributes the finger touch operation to all The first window 2000 is described, and the finger touch operation is responded to on the first window 2000.
  • the terminal 1000 synchronously distributes the finger touch operation to the second window 3000, but does not respond on the second window 3000.
  • Alternatively, the terminal 1000 receives a pen touch operation in the overlapping area and acquires the touch contact area of the pen touch operation; when the touch contact area meets the first area range, the pen touch operation is distributed to the first window 2000 and responded to on the first window 2000.
  • When the touch operation is of the second type, the terminal 1000 responds to the touch operation on the second window 3000, the first type being different from the second type.
  • Specifically, when the number of touch points meets the second quantity range, the terminal 1000 distributes the finger touch operation to the second window 3000 and responds to the finger touch operation on the second window 3000, the second quantity range being different from the first quantity range.
  • Alternatively, when the touch contact area meets the second area range, the terminal 1000 distributes the pen touch operation to the second window 3000 and responds to the pen touch operation on the second window 3000, the second area range being different from the first area range.
  • In one specific implementation, the terminal 1000 acquires the touch position of the pen touch operation and inputs text information at that touch position on the second window.
  • Optionally, the terminal 1000 obtains an identifier addition request input for the second window 3000 and adds an attribute identifier to the second window 3000 based on the request. Responding on the second window 3000 may specifically be: intercepting the touch operation distributed to the first window 2000, and, when the touch operation is returned to the second window 3000 corresponding to the attribute identifier, triggering the second window 3000 to respond to the touch operation.
  • When the solution of the embodiments of the present application is executed, the terminal displays a first window and a second window having an overlapping area on the display interface and receives a touch operation in the overlapping area. When the touch operation is of the first type, the touch operation is responded to on the first window; when the touch operation is of the second type, the touch operation is responded to on the second window.
  • Windows at different layers of the overlapping windows can thus respond simply according to the operation type of the input touch operation, without modifying the operation distribution logic. This simplifies the implementation of the touch penetration function, reduces its implementation complexity, and allows one system to respond in windows of different layers under different operation types, improving user experience.
  • The touch operation response method provided by the embodiments of the present application is described in detail below with reference to FIG. 4 to FIG. 12.
  • The touch operation response device in the embodiments of the present application may be the terminal shown in FIG. 3.
  • FIG. 4 provides a schematic flowchart of a touch operation response method according to an embodiment of the present application.
  • the method of the embodiment of the present application may include the following steps:
  • S101 Display a first window and a second window on a display interface, where the first window and the second window have an overlapping area;
  • the first window and the second window displayed are different windows.
  • The first window and the second window have an overlapping area; the two windows may overlap partially or completely. That is, there is a hierarchical relationship between the two windows: if the second window covers the first window, the second window is the upper-layer window and the first window is the lower-layer window.
  • the two windows can correspond to different applications. Two windows can be opened by opening two applications, or the second window can be created after opening the first window.
  • the method of creating the second window please refer to the system architecture embodiment, which will not be repeated here.
  • the two windows are displayed on the same display interface. Specifically, the two windows can be displayed according to preset display rules.
  • the preset display rules may be window display size, window display position, window display style (static, dynamic), etc.
  • The touch operation is the operation input by the user in the overlapping area of the two windows. The touch operation may include touch information such as the number of touch points, touch-point fingerprints, touch-point pressure values, touch-point contact areas, and touch tracks.
  • After the touch operation response device receives the touch operation, it acquires the touch information carried in the touch operation. Optionally, an operation receiving view is created in the overlapping area in advance, and the touch operation is input on that operation receiving view.
  • The first type can be any operation type and can be classified based on the touch information of the touch operation. For example, when the touch information includes the number of touch points, the first type may be equal to 1, greater than 1, greater than or equal to 1 and less than or equal to 3, and so on.
  • When the touch information includes the touch contact area, the first type may be greater than or equal to a preset area threshold, less than the preset area threshold, within a preset area range, or outside the preset area range, and so on.
  • the input touch operation is responded to on the first window. If the first window is a lower-layer window, the response is performed on the lower-layer window, thereby realizing the touch penetration function.
  • the second type is different from the first type and is also classified according to touch information.
  • the second type and the first type may be divided based on the same touch information, or may be divided based on different touch information.
  • For example, the second type and the first type may both be divided based on the number of touch points: the first type corresponds to a first preset quantity, the second type corresponds to a second preset quantity, and the first preset quantity is different from the second preset quantity. Alternatively, the second type may be divided based on the number of touch points while the first type is divided based on the touch contact area.
  • the input touch operation is responded to on the second window. If the second window is an upper-layer window, the response is performed on the upper-layer window, thereby realizing the basic response of the touch operation on the window.
  • When the solution of the embodiments of the present application is executed, the terminal displays a first window and a second window having an overlapping area on the display interface and receives a touch operation in the overlapping area. When the touch operation is of the first type, the touch operation is responded to on the first window; when the touch operation is of the second type, the touch operation is responded to on the second window.
  • Windows at different layers of the overlapping windows can thus respond simply according to the operation type of the input touch operation, without modifying the operation distribution logic. This simplifies the implementation of the touch penetration function, reduces its implementation complexity, and allows one system to respond in windows of different layers under different operation types, improving user experience.
  • FIG. 5 provides a schematic flowchart of a touch operation response method according to an embodiment of this application.
  • the touch operation response method is applied to a terminal as an example.
  • the touch operation response method may include the following steps:
  • S201 Display a first window and a second window on a display interface, where the first window and the second window have an overlapping area;
  • S202 Obtain an identifier addition request input for the second window, and add an attribute identifier on the second window based on the identifier addition request.
  • The attribute identifier is used to uniquely identify the second window and may include a TAG, an ID, or the like. Adding the attribute identifier to the second window makes it easy to find the second window, so that the second window can quickly be determined as the target window.
  • The user performs an editing operation on the second window to generate the identifier addition request; after receiving the request, the terminal adds the attribute identifier to the second window. The attribute identifier can be added to any display area of the second window, such as the upper-left corner. The added attribute identifier can be found in an underlying identifier library, or it can be generated on the spot.
  • For example, each window has an attribute identifier, and each attribute identifier is represented by a set of binary codes, which can include multiple bits (such as 32 bits); the attribute identifier of the second window is found in the underlying identifier library and marked on the second window. As shown in Table 1, one kind of attribute identifier for each window is stored in the underlying identifier library, and the attribute identifier of the second window can be obtained as 001011...1 by looking up Table 1.
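  • The application does not name a concrete API for the attribute identifier; on Android, one plausible realization is a View tag, sketched below (the tag value and helper names are purely illustrative assumptions):

```java
import android.view.View;

// Sketch: mark the second window's root view with an attribute identifier (TAG)
// so that touch operations can later be routed back to it.
// The identifier value "second_window_001011" is illustrative only.
public final class WindowTagHelper {
    public static final String SECOND_WINDOW_TAG = "second_window_001011";

    public static void addAttributeIdentifier(View secondWindowRoot) {
        secondWindowRoot.setTag(SECOND_WINDOW_TAG);
    }

    public static View findSecondWindow(View decorView) {
        // findViewWithTag() searches the view hierarchy for the tagged window view.
        return decorView.findViewWithTag(SECOND_WINDOW_TAG);
    }
}
```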
  • S203 Acquire a view creation instruction input for an overlapping area on the display interface, and create an operation receiving view in the overlapping area;
  • Opening a window actually means opening an Activity, and the view can be created in that Activity.
  • The created view is used to receive the finger touch operations input by the user. It should be noted that the view is located on the second window, in the overlapping area of the two windows (the first window and the second window).
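  • One of many possible ways to add such an operation-receiving view in the second window's Activity is sketched below; the transparent view and its layout parameters are assumptions for illustration, not the application's prescribed implementation:

```java
import android.app.Activity;
import android.graphics.Color;
import android.os.Bundle;
import android.view.View;
import android.view.ViewGroup;
import android.widget.FrameLayout;

// Sketch: the second window's Activity adds a transparent view covering the
// overlapping area; this view is the "operation receiving view" that later
// receives the finger or pen touch operations.
public class SecondWindowActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        View operationReceivingView = new View(this);
        operationReceivingView.setBackgroundColor(Color.TRANSPARENT);

        // MATCH_PARENT assumes the two windows overlap completely; for a partial
        // overlap, the width/height/position would be set to the overlapping area.
        addContentView(operationReceivingView,
                new FrameLayout.LayoutParams(
                        ViewGroup.LayoutParams.MATCH_PARENT,
                        ViewGroup.LayoutParams.MATCH_PARENT));
    }
}
```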
  • S204 Receive a finger touch operation input on the operation receiving view, and acquire the number of touch points of the finger touch operation;
  • After the user generates a finger touch operation on the operation receiving view, the terminal can sense the generated touch operation and respond.
  • Finger touch operations can be single-finger or multi-finger touch operations, which can be distinguished by the number of touch points. In a specific implementation, the distinction between single-finger and multi-finger is mainly based on the pointerCount of the underlying operation: if the current pointerCount is greater than or equal to 2, a multi-finger touch is determined, and if the current pointerCount is 1, a single-finger touch is determined.
  • Specifically, getPointerCount() in the MotionEvent class can be used to obtain the number of touch points: a return value of 1 means one finger is pressing the screen, and a return value of 2 means two fingers are pressing the screen at the same time.
  • In addition to the number of touch points, the finger touch operation may also carry information such as the state of the touch points, the coordinates of the touch points, the touch pressure value, and the touch fingerprint.
  • The touch-point state of the current operation can be obtained through event.getAction(); the single-point press, release, and move actions are MotionEvent.ACTION_DOWN, ACTION_UP, and ACTION_MOVE, respectively.
  • The coordinates of the touch points can be obtained through event.getX() and event.getY(). If there are multiple touch points, event.getX(0) and event.getY(0) return the coordinates of the first point, event.getX(1) and event.getY(1) return the coordinates of the second point, and so on for further points.
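  • The calls named above are standard Android MotionEvent methods; a small sketch of reading them in one place (no particular view class is assumed):

```java
import android.view.MotionEvent;

// Sketch: reading the touch information mentioned above from a MotionEvent.
public final class TouchInfoReader {

    public static void read(MotionEvent event) {
        int pointerCount = event.getPointerCount(); // 1 = single finger, >= 2 = multi-finger
        int action = event.getActionMasked();       // ACTION_DOWN / ACTION_UP / ACTION_MOVE ...

        float firstX = event.getX(0);               // coordinates of the first touch point
        float firstY = event.getY(0);
        if (pointerCount >= 2) {
            float secondX = event.getX(1);          // coordinates of the second touch point
            float secondY = event.getY(1);
        }
        float pressure = event.getPressure(0);      // touch pressure of the first point
    }
}
```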
  • S205 When the number of touch points meets a first number range, distribute the finger touch operation to the first window, and respond to the finger touch operation on the first window.
  • The first quantity range is any set range, with values between 0 and the preset quantity threshold, or between the preset quantity threshold and 10. In the embodiments of this application, a first quantity range of 0 to 1 is taken as an example for description.
  • When the number of touch points meets the first quantity range, the finger touch operation is distributed to the first window at the lower layer of the two overlapping windows. Operation distribution refers to the process in which, after a touch operation (MotionEvent) is generated, the system passes it to a specific view (View); here, the finger touch operation is passed to the first window. The first window contains the operation-receiving view, that is, the finger touch operation is passed to the operation-receiving view in the first window.
  • Operation distribution can be understood as follows: the user touches the display screen to generate a touch operation (MotionEvent), which is received by the Activity, and the Activity then passes the operation on. The transmission process is: Activity -> Window -> DecorView (the DecorView is the bottom container of the current interface and is a ViewGroup) -> dispatchTouchEvent() of the ViewGroup is executed, where dispatchTouchEvent() is used to distribute operations.
  • Responding to the finger touch operation on the first window can be understood as the first window processing, or consuming, the finger touch operation. Specifically, onTouchEvent() is called in dispatchTouchEvent(), and whether the finger touch operation is consumed is determined by the return value: returning false means the operation is not consumed, and in the same operation (finger touch operation) sequence the window (the view in the window) will not receive the operation again; returning true indicates consumption.
  • In the embodiments of this application, the terminal distributes the finger touch operation to the upper-layer second window and the lower-layer first window respectively; for the second window, onTouchEvent() is called in dispatchTouchEvent() and returns false. At this time, onTouchEvent() in dispatchTouchEvent() is called for the first window, which responds to the finger touch operation and returns true, thereby realizing the penetration response of the finger touch operation on the overlapping windows.
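  • The return-value behaviour described above can be sketched as a custom view placed in the second window; whether a non-consumed operation then reaches the lower window relies on the system-level dispatch the description assumes, and all names other than the framework callbacks are illustrative:

```java
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

// Sketch: operation-receiving view in the upper (second) window.
// Single-finger operations are reported as not consumed (return false) so the
// lower (first) window can respond, as the description assumes; multi-finger
// operations are handled here (return true).
public class OperationReceivingView extends View {
    public OperationReceivingView(Context context) {
        super(context);
    }

    @Override
    public boolean dispatchTouchEvent(MotionEvent event) {
        if (event.getPointerCount() <= 1) {
            // First quantity range: let the touch "penetrate" to the first window.
            return false;
        }
        return super.dispatchTouchEvent(event);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Second quantity range: respond on the second window (e.g. turn a page).
        return true;
    }
}
```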
  • For example, assume the two windows completely overlap. If the position of the operation (that is, the coordinates of the touch point) corresponds to "Open" on a red envelope in the first window, and the same position corresponds to the song title "Ugly Eight Monsters" in the second window, then the red envelope is opened, realizing the red-envelope receiving function, and the song "Ugly Eight Monsters" is not played.
  • Optionally, when the number of touch points meets the first quantity range, the terminal may also distribute the finger touch operation to the second window at the same time, and can respond to the finger touch operation on the first window and the second window simultaneously.
  • For example, assume the two windows completely overlap. If the position of the touch operation (that is, the position of the touch point) corresponds to "Open" on a red envelope in the first window, and the same position corresponds to the song title "Ugly Eight Monsters" in the second window, then the song "Ugly Eight Monsters" is played and the red envelope is opened simultaneously, realizing the red-envelope receiving function.
  • The second quantity range is any range different from the first quantity range, with values likewise between 0 and the preset quantity threshold, or between the preset quantity threshold and 10; the first quantity range may be greater than or smaller than the second quantity range. In the embodiments of this application, a second quantity greater than 1 is taken as an example for description.
  • When the number of touch points meets the second quantity range, the finger touch operation is distributed to the second window at the upper layer of the two overlapping windows; the directional distribution of the finger touch operation can be performed based on the attribute identifier of the second window. In a specific implementation, onInterceptTouchEvent() can be called in dispatchTouchEvent() to intercept the finger touch operation.
  • The finger touch operation responded to on the second window may include multiple operations such as page turning, writing, roaming, and opening another page. For example, for a multi-finger touch, the e-book page-turning operation is performed on the second window, as shown in FIG. 8b, and the red-envelope grabbing on the first window is not responded to.
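  • A hedged sketch of this interception path, written as a container in the second window (the multi-finger condition comes from the example above; class names are illustrative):

```java
import android.content.Context;
import android.view.MotionEvent;
import android.widget.FrameLayout;

// Sketch: container in the second window that intercepts multi-finger operations
// so its own onTouchEvent() responds (e.g. page turning), while single-finger
// operations are left alone for the penetration path described above.
public class SecondWindowContainer extends FrameLayout {
    public SecondWindowContainer(Context context) {
        super(context);
    }

    @Override
    public boolean onInterceptTouchEvent(MotionEvent event) {
        // Intercept when the number of touch points falls in the second range (> 1).
        return event.getPointerCount() > 1;
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Respond on the second window: page turning, writing, roaming, etc.
        return true;
    }
}
```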
  • When the solution of the embodiments of the present application is executed, the terminal receives a finger touch operation input on the second window, the upper layer of the two displayed overlapping windows, and obtains the number of touch points of the finger touch operation. When the number of touch points meets the first quantity range, the finger touch operation is distributed to the first window at the lower layer of the two overlapping windows and responded to on the first window; when the number of touch points meets the second quantity range, the finger touch operation is distributed to the second window at the upper layer and responded to on the second window.
  • On one hand, by matching the number of touch points against the set quantity ranges, the finger touch operation can be distributed directly to the lower-layer window, which simplifies the implementation of the touch penetration function and thus reduces its implementation complexity. On the other hand, windows at different layers can respond to the finger touch operation depending on the number of touch points, the response functions of the different window layers can be realized within one system, more types of application windows are applicable, and the interactivity between overlapping windows is increased.
  • FIG. 9 provides a schematic flowchart of a touch operation response method according to an embodiment of the present application.
  • the touch operation response method is applied to the terminal as an example.
  • the touch operation response method may include the following steps:
  • S301 Display a first window and a second window on a display interface, where the first window and the second window have an overlapping area;
  • S302 Obtain an identifier addition request input for the second window, and add an attribute identifier on the second window based on the identifier addition request.
  • S303 Acquire a view creation instruction input for an overlapping area on the display interface, and create an operation receiving view in the overlapping area;
  • Opening a window actually means opening an Activity, and the view can be created in that Activity.
  • The created view is used to receive the pen touch operations input by the user. It should be noted that the view is located on the second window, in the overlapping area of the two windows (the first window and the second window).
  • After the user generates a pen touch operation on the operation receiving view, the terminal can sense the generated operation and respond.
  • The pen touch operation can be a thick-pen touch operation or a fine-pen touch operation, which can be distinguished by the touch contact area. In a specific implementation, the distinction is made by comparing the touch contact area with a preset area threshold: if the current touch contact area is less than or equal to the preset area threshold, a fine-pen touch is determined, and if it is greater than the preset area threshold, a thick-pen touch is determined.
  • In addition to the touch contact area, the pen touch operation may also carry information such as the state of the touch point, the coordinates of the touch point, and the touch pressure value.
  • The touch-point state of the current operation can be obtained through event.getAction(); the press, release, and move actions are MotionEvent.ACTION_DOWN, ACTION_UP, and ACTION_MOVE, respectively.
  • The coordinates of the touch point can be obtained through event.getX() and event.getY(). If there are multiple touch points, event.getX(0) and event.getY(0) return the coordinates of the first point, event.getX(1) and event.getY(1) return the coordinates of the second point, and so on for further points.
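  • The description does not name an API for obtaining the touch contact area; on Android, MotionEvent exposes approximations such as getTouchMajor()/getTouchMinor() (or the normalized getSize()), which a sketch like the following could compare against a preset threshold. The threshold value here is an assumed placeholder:

```java
import android.view.MotionEvent;

// Sketch: distinguish fine-pen and thick-pen touches by an approximate contact area.
// PRESET_AREA_THRESHOLD is an assumed, device-dependent placeholder; the real value
// would be an empirical statistic of pen-tip contact areas, as the description states.
public final class PenTouchClassifier {
    private static final float PRESET_AREA_THRESHOLD = 900f; // px^2, illustrative only

    public static boolean isThickPen(MotionEvent event) {
        // Approximate the contact patch as an ellipse from the touch major/minor axes.
        float area = (float) (Math.PI / 4.0
                * event.getTouchMajor(0) * event.getTouchMinor(0));
        return area > PRESET_AREA_THRESHOLD;   // second area range -> thick-pen touch
    }
}
```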
  • It should be noted that the input thick-pen or fine-pen touch operation is input in the overlapping area of the two overlapping windows.
  • The first area range is any set value range, which may be less than or equal to the preset area threshold, or greater than the preset area threshold; in the embodiments of this application, the first area range is taken to be less than or equal to the preset area threshold as an example.
  • When the touch contact area meets the first area range, the pen touch operation is distributed to the first window at the lower layer of the two overlapping windows. Operation distribution refers to the process in which, after a touch operation (MotionEvent) is generated, the system passes it to a specific view (View); here, the pen touch operation is passed to the first window. The first window contains the operation-receiving view, that is, the pen touch operation is passed to the operation-receiving view in the first window.
  • Operation distribution can be understood as follows: the user touches the display screen to generate a touch operation (MotionEvent), which is received by the Activity, and the Activity then passes the operation on. The transmission process is: Activity -> Window -> DecorView (the DecorView is the bottom container of the current interface and is a ViewGroup) -> dispatchTouchEvent() of the ViewGroup is executed, where dispatchTouchEvent() is used to distribute operations.
  • Responding to the pen touch operation on the first window can be understood as the first window processing, or consuming, the pen touch operation. Specifically, onTouchEvent() is called in dispatchTouchEvent(), and whether the pen touch operation is consumed is determined by the return value: returning false means the operation is not consumed, and in the same operation (pen touch operation) sequence the window (the view in the window) will not receive the operation again; returning true indicates consumption.
  • In the embodiments of this application, the terminal distributes the pen touch operation to the upper-layer second window and the lower-layer first window respectively; for the second window, onTouchEvent() is called in dispatchTouchEvent() and returns false. At this time, onTouchEvent() in dispatchTouchEvent() is called for the first window, which responds to the pen touch operation and returns true, thereby realizing the penetration response of the pen touch operation on the overlapping windows.
  • For example, assume the two windows completely overlap. If the position of the operation (that is, the touch point) corresponds to the link "Http://www.abc.com" on the first window (the PPT display window), and the same position belongs to the information writing area on the second window (the annotation application display window), then the page corresponding to the link is opened and no information is entered on the second window.
  • Optionally, when the touch contact area meets the first area range, the terminal may also distribute the pen touch operation to the second window at the same time, and can respond to the pen touch operation on the first window and the second window simultaneously.
  • For example, assume the two windows completely overlap. If the position of the operation (that is, the coordinates of the touch point) corresponds to the link "Http://www.abc.com" on the first window (the PPT display window), and the same position belongs to the information writing area on the second window (the annotation application display window), then for a fine-pen touch the page corresponding to the link is opened and information is written on the second window synchronously.
  • The second area range is any range different from the first area range; it may be less than or equal to the preset area threshold or greater than it. In the embodiments of this application, since the first area range is less than or equal to the preset area threshold, the second area range is greater than the preset area threshold.
  • When the touch contact area meets the second area range, the pen touch operation is distributed to the second window at the upper layer of the two overlapping windows; the directional distribution of the pen touch operation can be performed based on the attribute identifier of the second window. In a specific implementation, onInterceptTouchEvent() can be called in dispatchTouchEvent() to intercept the pen touch operation.
  • The pen touch operation responded to on the second window may include multiple operations such as page turning, writing, roaming, and opening another page. For example, with the first window shown in FIG. 10 and the second window shown in FIG. 11, for a thick-pen touch, as shown in FIG. 12, an input box and a virtual keyboard pop up on the second window so that the user can input text information, while the link on the first window is not responded to.
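  • The description mentions popping up an input box at the touch position; a hedged sketch of one way this could look on Android follows (the view, its size, and the keyboard call are assumptions, not the application's prescribed implementation):

```java
import android.app.Activity;
import android.content.Context;
import android.view.MotionEvent;
import android.view.inputmethod.InputMethodManager;
import android.widget.EditText;
import android.widget.FrameLayout;

// Sketch: on a thick-pen touch, place an input box at the touch position on the
// second window and show the soft keyboard so the user can enter text information.
public final class PenTextInputHelper {
    public static void showInputBoxAt(Activity activity, FrameLayout secondWindowRoot,
                                      MotionEvent event) {
        EditText inputBox = new EditText(activity);

        FrameLayout.LayoutParams params = new FrameLayout.LayoutParams(
                600, FrameLayout.LayoutParams.WRAP_CONTENT); // width is illustrative
        params.leftMargin = (int) event.getX();              // touch position of the pen
        params.topMargin = (int) event.getY();
        secondWindowRoot.addView(inputBox, params);

        inputBox.requestFocus();
        InputMethodManager imm =
                (InputMethodManager) activity.getSystemService(Context.INPUT_METHOD_SERVICE);
        imm.showSoftInput(inputBox, InputMethodManager.SHOW_IMPLICIT);
    }
}
```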
  • When the solution of the embodiments of the present application is executed, the terminal receives a pen touch operation input on the second window, the upper layer of the two displayed overlapping windows, and obtains the touch contact area of the pen touch operation. When the touch contact area meets the first area range, the pen touch operation is distributed to the first window at the lower layer of the two overlapping windows and responded to on the first window; when the touch contact area meets the second area range, the pen touch operation is distributed to the second window at the upper layer and responded to on the second window.
  • On one hand, by matching the touch contact area against the set area ranges, the pen touch operation can be distributed directly to the lower-layer window, which simplifies the implementation of the touch penetration function and thus reduces its implementation complexity. On the other hand, windows at different layers can respond to the pen touch operation depending on the touch contact area, the response functions of the different window layers can be realized within one system, more types of application windows are applicable, and the interactivity between overlapping windows is increased.
  • FIG. 13 shows a schematic structural diagram of a touch operation response device provided by an exemplary embodiment of the present application.
  • the touch operation response device can be implemented as all or a part of the terminal through software, hardware or a combination of both.
  • the device includes a window display module 11, an operation receiving module 12, a first response module 13 and a second response module 14.
  • the window display module 11 is configured to display a first window and a second window on a display interface, where the first window and the second window have an overlapping area;
  • the operation receiving module 12 is configured to receive a touch operation in the overlapping area
  • the first response module 13 is configured to respond to the touch operation on the first window when the touch operation is of the first type
  • the second response module 14 is configured to respond to the touch operation on the second window when the touch operation is of the second type, and the first type is different from the second type.
  • the operation receiving module 12 is specifically configured to:
  • the first response module is specifically used for:
  • the second response module is specifically used for:
  • when the number of touch points meets a second quantity range, distribute the finger touch operation to the second window and respond to the finger touch operation on the second window, the second quantity range being different from the first quantity range.
  • the operation receiving module is specifically used for:
  • the first response module is specifically used for:
  • the pen touch operation is distributed to the first window, and the pen touch operation is responded to on the first window.
  • the second response module is specifically used for:
  • when the touch contact area meets a second area range, distribute the pen touch operation to the second window and respond to the pen touch operation on the second window, the second area range being different from the first area range.
  • the second response module is specifically used for:
  • the device 1 further includes:
  • the view creation module 15 is configured to obtain a view creation instruction input for an overlap area on the display interface, and create an operation receiving view in the overlap area;
  • the operation receiving module 12 is specifically configured to:
  • the touch operation input on the operation receiving view is received.
  • the device 1 further includes:
  • the identifier adding module 16 is configured to obtain an identifier addition request input for the second window, and add an attribute identifier on the second window based on the identifier addition request;
  • the second response module 14 is specifically configured to:
  • It should be noted that when the touch operation response device provided in the above embodiments executes the touch operation response method, the division into the above functional modules is only used for illustration. In practical applications, the above functions may be assigned to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
  • The touch operation response device provided in the foregoing embodiments and the touch operation response method embodiments belong to the same concept; the implementation process of the touch operation response method is described in the method embodiments and will not be repeated here.
  • When the solution of the embodiments of the present application is executed, the terminal displays a first window and a second window having an overlapping area on the display interface and receives a touch operation in the overlapping area. When the touch operation is of the first type, the touch operation is responded to on the first window; when the touch operation is of the second type, the touch operation is responded to on the second window.
  • Windows at different layers of the overlapping windows can thus respond according to the operation type of the input touch operation, and the touch operation can be distributed to the lower window of the overlapping windows without modifying the operation distribution logic, which simplifies the implementation of the touch penetration function and further reduces its implementation complexity.
  • the embodiment of the present application also provides a computer storage medium.
  • the computer storage medium may store a plurality of instructions, and the instructions are suitable for being loaded by a processor and executing the method steps of the embodiments shown in FIGS. 4-12.
  • the specific execution process please refer to the specific description of the embodiment shown in FIG. 4 to FIG. 12, which is not repeated here.
  • The present application also provides a computer program product storing at least one instruction, and the at least one instruction is loaded and executed by the processor to implement the touch operation response method described in each of the above embodiments.
  • the terminal 1000 may include: at least one processor 1001, at least one network interface 1004, a user interface 1003, a memory 1005, and at least one communication bus 1002.
  • the communication bus 1002 is used to implement connection and communication between these components.
  • the user interface 1003 may include a display screen (Display) and a camera (Camera), and the optional user interface 1003 may also include a standard wired interface and a wireless interface.
  • the network interface 1004 may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface).
  • the processor 1001 may include one or more processing cores.
  • The processor 1001 uses various interfaces and lines to connect the parts of the entire terminal 1000, and executes the various functions of the terminal 1000 and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 1005 and by calling the data stored in the memory 1005.
  • Optionally, the processor 1001 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA).
  • the processor 1001 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like.
  • the CPU mainly processes the operating system, user interface, and application programs; the GPU is used to render and draw the content that the display needs to display; the modem is used to process wireless communication. It is understandable that the above-mentioned modem may not be integrated into the processor 1001, but may be implemented by a chip alone.
  • The memory 1005 may include a random access memory (RAM) or a read-only memory (ROM).
  • the memory 1005 includes a non-transitory computer-readable storage medium.
  • the memory 1005 may be used to store instructions, programs, codes, code sets or instruction sets.
  • the memory 1005 may include a storage program area and a storage data area, where the storage program area may store instructions for implementing the operating system and instructions for at least one function (such as touch function, sound playback function, image playback function, etc.), Instructions used to implement the foregoing method embodiments, etc.; the storage data area can store the data involved in the foregoing method embodiments.
  • the memory 1005 may also be at least one storage device located far away from the foregoing processor 1001.
  • the memory 1005, which is a computer storage medium may include an operating system, a network communication module, a user interface module, and a touch operation response application program.
  • The user interface 1003 is mainly used to provide an input interface for the user and obtain the data input by the user; and the processor 1001 may be used to call the touch operation response application stored in the memory 1005 and specifically perform the following operations:
  • display a first window and a second window on a display interface, the first window and the second window having an overlapping area; receive a touch operation in the overlapping area; when the touch operation is of a first type, respond to the touch operation on the first window; and when the touch operation is of a second type, respond to the touch operation on the second window, the first type being different from the second type.
  • In an embodiment, when receiving the touch operation in the overlapping area, the processor 1001 specifically performs the following operations:
  • when the number of touch points meets a second quantity range, distribute the finger touch operation to the second window and respond to the finger touch operation on the second window, the second quantity range being different from the first quantity range.
  • In an embodiment, when receiving the touch operation in the overlapping area, the processor 1001 specifically performs the following operations:
  • when the touch contact area meets a first area range, distribute the pen touch operation to the first window and respond to the pen touch operation on the first window; when the touch contact area meets a second area range, distribute the pen touch operation to the second window and respond to the pen touch operation on the second window, the second area range being different from the first area range.
  • In an embodiment, when responding to the pen touch operation on the second window, the processor 1001 performs the following operations:
  • In an embodiment, before receiving the touch operation in the overlapping area, the processor 1001 further performs the following operations:
  • In an embodiment, when receiving the touch operation in the overlapping area, the processor 1001 specifically performs the following operations:
  • the touch operation input on the operation receiving view is received.
  • the processor 1001 further performs the following operations:
  • When the solution of the embodiments of the present application is executed, the terminal displays a first window and a second window having an overlapping area on the display interface and receives a touch operation in the overlapping area. When the touch operation is of the first type, the touch operation is responded to on the first window; when the touch operation is of the second type, the touch operation is responded to on the second window.
  • Windows at different layers of the overlapping windows can thus respond according to the operation type of the input touch operation, and the touch operation can be distributed to the lower window of the overlapping windows without modifying the operation distribution logic, which simplifies the implementation of the touch penetration function and further reduces its implementation complexity.
  • The program can be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments.
  • The storage medium can be a magnetic disk, an optical disc, a read-only memory, a random access memory, or the like.

Abstract

A touch operation response method and apparatus, a storage medium, and a terminal, belonging to the field of computer technology. The method includes: displaying a first window and a second window on a display interface, the first window and the second window having an overlapping area (S101); receiving a touch operation in the overlapping area (S102); when the touch operation is of a first type, responding to the touch operation on the first window (S103); and when the touch operation is of a second type, responding to the touch operation on the second window, the first type being different from the second type (S104). The implementation process of the touch penetration function can thereby be simplified, and the implementation complexity of the touch penetration function reduced.

Description

Touch operation response method and apparatus, storage medium, and terminal
This disclosure claims priority to Chinese patent application No. 201910122474.6, filed with the China Patent Office on February 19, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of computer technology, for example, to a touch operation response method and apparatus, a storage medium, and a terminal.
Background
Various smart display devices, especially large-screen display devices, are used more and more frequently in daily life and work. In many cases, the display interface of a smart display device needs to display content through multiple display windows at the same time, and these display windows may overlap, that is, some display windows of the display interface may be covered by one or more other display windows.
For the Android system, in an application scenario with two overlapping display windows, in order to control windows at different layers in different scenarios by touching the upper-layer window, and in particular to control the lower-layer window (that is, the touch penetration function), the operation distribution logic of the system usually needs to be modified so that the touch event corresponding to the touch operation can be responded to on the lower-layer window. Such a touch penetration function, however, is complex to operate and difficult to implement.
Summary
This application provides a touch operation response method and apparatus, a storage medium, and a terminal, which can solve the problem that the touch penetration function is complex to operate and difficult to implement. The technical solution is as follows:
This application provides a touch operation response method, including:
displaying a first window and a second window on a display interface, the first window and the second window having an overlapping area;
receiving a touch operation in the overlapping area;
when the touch operation is of a first type, responding to the touch operation on the first window; and
when the touch operation is of a second type, responding to the touch operation on the second window, the first type being different from the second type.
This application further provides a touch operation response apparatus, including:
a window display module, configured to display a first window and a second window on a display interface, the first window and the second window having an overlapping area;
an operation receiving module, configured to receive a touch operation in the overlapping area;
a first response module, configured to respond to the touch operation on the first window when the touch operation is of a first type; and
a second response module, configured to respond to the touch operation on the second window when the touch operation is of a second type, the first type being different from the second type.
This application further provides a computer storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor to execute the above method steps.
This application further provides a terminal, including a processor and a memory, where the memory stores a computer program adapted to be loaded by the processor to execute the above method steps.
When the solution of this application is executed, the terminal displays a first window and a second window having an overlapping area on the display interface and receives a touch operation in the overlapping area. When the touch operation is of the first type, the touch operation is responded to on the first window; when the touch operation is of the second type, the touch operation is responded to on the second window. Windows at different layers of the overlapping windows can thus respond simply according to the operation type of the input touch operation. At the same time, there is no need to modify the operation distribution logic: matching the operation type of the touch operation is enough to make the lower window respond to the touch operation. This simplifies the implementation of the touch penetration function, reduces its implementation complexity, and allows one system to respond in windows of different layers under different operation types, improving user experience.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of the display effect of a first window and a second window provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of an example of a first window and a second window provided by an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a system architecture provided by an embodiment of the present application;
FIG. 4 is a schematic flowchart of a touch operation response method provided by an embodiment of the present application;
FIG. 5 is a schematic flowchart of a touch operation response method provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of an example of a first window provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of an example of a second window provided by an embodiment of the present application;
FIGS. 8a and 8b are schematic diagrams of the effect of a multi-finger touch operation provided by an embodiment of the present application;
FIG. 9 is a schematic flowchart of a touch operation response method provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of an example of a first window provided by an embodiment of the present application;
FIG. 11 is a schematic diagram of an example of a second window provided by an embodiment of the present application;
FIG. 12 is a schematic diagram of the effect of a thick-pen touch event provided by an embodiment of the present application;
FIG. 13 is a schematic structural diagram of a touch operation response apparatus provided by an embodiment of the present application;
FIG. 14 is a schematic structural diagram of a touch operation response apparatus provided by an embodiment of the present application;
FIG. 15 is a schematic structural diagram of a terminal provided by an embodiment of the present application.
Detailed Description
The embodiments of the present application will be further described in detail below with reference to the accompanying drawings.
When the following description refers to the drawings, unless otherwise indicated, the same numerals in different drawings denote the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present application as detailed in the appended claims.
In the description of the present application, it should be understood that the terms "first", "second", and the like are used for descriptive purposes only and cannot be understood as indicating or implying relative importance. A person of ordinary skill in the art can understand the specific meanings of the above terms in the present application according to the specific circumstances. In addition, in the description of the present application, unless otherwise stated, "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate the three cases of A alone, both A and B, and B alone. The character "/" generally indicates an "or" relationship between the preceding and following associated objects.
First, some terms involved in the embodiments of the present application are explained:
First window: the window displayed on the display interface by an application when the user opens that application through the terminal. The application can be any application in the set of existing applications. When multiple applications are open, each application corresponds to one window, and the window selected by the user among them is the first window.
For example, if the current display interface displays application A, application B, and application C, and the user selects application A, the display window corresponding to application A is the first window.
Optionally, among the multiple applications opened by the user, the display window corresponding to the application whose opening time is closest to the current time may also be determined as the first window; that is, when multiple display windows are open, the display window opened last is the first window.
Generally, although there are many "window" types in the Android system, only three major categories are frequently used: one is managed by a system process and is called a "system window"; another is generated by an application to display its UI and is called an "application window"; and the third is a "child window", which cannot exist alone and must be attached to a parent window, such as a dialog. The windows mentioned in the embodiments of the present application are "application windows".
In addition, a window itself does not have any visible content; it provides a basic container for an application's views. A view is the content filling that container, for example an image, text, a shape, or a combination thereof. In the embodiments of the present application, a window can be understood as a display window containing a view.
In the display architecture of the Android system, each view to be displayed depends on a window of a certain type; views of windows of different types differ in display hierarchy, position, display/destruction process, and so on.
Of course, a view can also be added to a window. For example, a view can be added directly through addView() of the window's root view manager, WindowManager. In the embodiments of the present application, a view for receiving finger touch operations can be added to the second window. The finger touch operation can be a single-finger touch operation or a multi-finger touch operation.
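As a minimal illustrative sketch of such an addView() call (not part of the original description; all layout parameters below are assumptions):

```java
import android.app.Activity;
import android.graphics.Color;
import android.graphics.PixelFormat;
import android.view.View;
import android.view.WindowManager;

// Sketch: adding a finger-touch-receiving view via WindowManager.addView(),
// as mentioned above; the layout parameters are illustrative assumptions.
public final class AddViewExample {
    public static View addTouchReceivingView(Activity activity) {
        View touchView = new View(activity);
        touchView.setBackgroundColor(Color.TRANSPARENT);

        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.TYPE_APPLICATION,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                PixelFormat.TRANSLUCENT);

        activity.getWindowManager().addView(touchView, params);
        return touchView;
    }
}
```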
第二窗口:指针对第一窗口所创建的窗口,且该窗口覆盖在第一窗口上。
第二窗口覆盖在第一窗口上,可以理解为,先开启的第一窗口,后开启的第二窗口。两窗口可以完全重叠,也可以部分重叠。两窗口在显示界面的展示方式有多种,一种可行的展示方式如图1所示。
其中,第二窗口的创建方式可以为用户主动创建的;另一种可以为用户主动开启第二窗口对应的应用程序;还一种可以为当第一窗口在使用时触发第二窗口的开启(如PowerPoint(PPT)在播放中时自动开启批注应用,从而开启一透明窗口覆盖在PPT窗口上);又或者为通过触发方式弹出一个第二窗口(如插入U盘弹出,通过快捷键弹出)。
其中，批注应用为一个即时书写程序，意在对某些信息进行补充与扩展，以后台服务方式运行。在批注应用中，存在笔、橡皮、分享、关闭、上下翻页等功能。针对PPT而言，上下翻页以及关闭播放为控制PPT的功能，其他信息为书写相关信息，例如，笔：书写；橡皮：清除；分享：将内容保存到本地等；关闭：关闭批注。其对应的显示界面如图2所示，开启批注应用后，显示界面显示“你已处于批注模式”，且所对应的功能图标悬浮在PPT上。
手指触摸操作:指用户用手指在终端的显示屏上进行触摸而生成的操作。
所述显示屏具有触控感应功能,又称为“触控屏”、“触控面板”,是一种可接收触头等输入讯号的感应式液晶显示装置。当接触了屏幕上的图形按钮时,屏幕上的触觉反馈系统可根据预先编程的程式驱动各种连结装置,可用以取代机械式的按钮面板,并借由液晶显示画面制造出生动的影音效果。
从技术原理来区别触摸屏，可分为五个基本种类：矢量压力传感技术触摸屏、电阻技术触摸屏、电容技术触摸屏、红外线技术触摸屏、表面声波技术触摸屏。
按照触摸屏的工作原理和传输信息的介质,把触摸屏分为四种,它们分别为电阻式、电容感应式、红外线式以及表面声波式。
需要说明的是,当第一窗口与第二窗口部分重叠时,所输入的手指触摸操作是针对两窗口的重叠区域进行输入。
第一数量范围:所设定的任一范围,取值在0~预设数量阈值之间,或者预设数量阈值~10之间。
第二数量范围:区别于第一数量范围的任一范围,同样取值在0~预设数量阈值之间,或者预设数量阈值~10之间。第一数量范围可以大于第二数量范围,也可小于第二数量范围。
例如,预设数量阈值为1时,若第一数量范围为0~1(即1),表示单指触摸,第二数量范围为2~10,表示多指触摸。
又例如,预设数量阈值为2,若第一数量范围为0~2,表示单指触摸或两指触摸,第二数量范围为3~10,表示三指触摸或更多指触摸。
在本申请实施例中，以预设数量阈值为1、第一数量范围为0~1、第二数量范围大于1为例，分别表示单指触摸和多指触摸进行说明。
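As a concrete illustration of the first/second number ranges above, the following Java sketch classifies a touch by its pointer count against a preset threshold. The class name, enum values and the constructor-supplied threshold are illustrative assumptions, not part of the original disclosure.

```java
// Minimal sketch: classify a touch by pointer count against a preset threshold.
// TouchClassifier / TouchType and the threshold parameter are illustrative assumptions.
public final class TouchClassifier {
    public enum TouchType { FIRST_TYPE, SECOND_TYPE }

    private final int presetCountThreshold;

    public TouchClassifier(int presetCountThreshold) {
        this.presetCountThreshold = presetCountThreshold;
    }

    /** First number range: count <= threshold; second number range: count > threshold. */
    public TouchType classify(int pointerCount) {
        return (pointerCount <= presetCountThreshold)
                ? TouchType.FIRST_TYPE   // e.g. single-finger -> respond in the lower (first) window
                : TouchType.SECOND_TYPE; // e.g. multi-finger  -> respond in the upper (second) window
    }
}
```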
笔头触控操作:指用户用笔头在终端的显示屏上进行触控而生成的操作。
需要说明的是,当第一窗口与第二窗口部分重叠时,所输入的笔头触控操作是针对两窗口的重叠区域进行输入。
第一面积范围:所设定的任一取值范围,且小于或者等于预设面积阈值。预设面积阈值为根据笔头接触面积统计的经验值。
第二面积范围:区别于第一面积范围的任一范围,且大于预设面积阈值。
当然,也可以为第一面积范围为大于预设面积阈值,第二面积范围为小于或者等于预设面积阈值。
例如,第一面积范围为小于预设面积阈值,则确定为细笔触控,第二面积范围大于预设面积阈值,则确定为粗笔触控。
在本申请实施例中,以第一面积范围小于或者等于预设面积阈值,第二面积范围大于预设面积阈值进行说明。
属性标识:用于唯一识别窗口的标识,可以为TAG标签、Identification(ID)等。
在本申请实施例中,为第二窗口添加属性标识,用于将分配至下层的第一窗口的触摸操作拦截,并基于属性标识回传至第二窗口,从而触发在上层的第二窗口响应该触摸操作。
请参见图3,为本申请实施例提供的一种系统架构的示意图。如图3所示,所述系统架构可以包括终端1000,第一窗口2000以及第二窗口3000,第一窗口2000以及第二窗口3000重叠显示在终端1000的当前显示界面,且第二窗口3000覆盖在第一窗口2000上,两窗口的重叠区域为4000。
所述第二窗口3000覆盖在所述显示界面的第一窗口2000上,所述第一窗口2000为所述显示界面上所选择的任一显示窗口,所述第二窗口3000可以为除第一窗口对应的第一应用以外的任一应用对应的显示窗口。
其中，所述终端1000可以包括平板电脑、个人计算机(PC)、智能手机、掌上电脑以及移动互联网设备(MID)等具备数据运算处理功能的终端设备。
如图3所示,终端1000接收在所述重叠区域4000的触摸操作;
其中,所述触摸操作可以包括手指触摸操作,笔头触控操作等。
可选的,在接收在所述重叠区域4000的触摸操作之前,终端1000获取针对显示界面上第一窗口2000输入的窗口创建指令,基于所述窗口创建指令创建第二窗口。
可选的,所述接收在所述重叠区域的触摸操作之前,终端1000获取针对所述显示界面上的重叠区域4000输入的视图创建指令,在所述重叠区域4000创建操作接收视图;
终端1000接收在所述操作接收视图上输入的触摸操作。
可选的,在接收在所述重叠区域的触摸操作之前,终端1000获取针对显示界面上所选择的第一窗口2000输入的窗口创建指令,基于所述窗口创建指令创建第二窗口3000。
其中,窗口创建指令可以为用户针对第一窗口2000输入的,表明所创建的第二窗口3000是覆盖在第一窗口2000上的。而用户输入窗口创建指令的方式可以为在第一窗口2000上进行触控操作(如在第一窗口2000上点击鼠标右键的“窗口创建”,或者在第一窗口2000上点击功能键开启第二窗口3000对应的第二应用),还可以为通过终端1000的语音接收设备接收语音信号“针对第一窗口2000创建第二窗口3000”等。
当然,第二窗口3000除了被动开启外,还可以主动开启,如当第一窗口2000在使用时触发第二窗口3000的开启。
当所述触摸操作为第一类型时,在所述第一窗口2000上响应所述触摸操作;
具体可以为,终端1000接收在所述重叠区域的手指触摸操作,获取所述手指触摸操作的触摸点数量,当所述触摸点数量满足第一数量范围时,将所述手指触摸操作分发至所述第一窗口2000,并在所述第一窗口2000上响应所述手指触摸操作。
当然,当所述触摸点数量满足第一数量范围时,终端1000同步将所述手指触摸操作分发至所述第二窗口3000,但在第二窗口3000上不进行响应。
还可以为,终端1000接收在所述重叠区域的笔头触控操作,获取所述笔头触控操作的触控接触面积,当所述触控接触面积满足第一面积范围时,将所述笔头触控操作分发至所述第一窗口2000,并在所述第一窗口2000上响应所述笔头触控操作。
当所述触摸操作为第二类型时,终端1000在所述第二窗口3000上响应所述触摸操作,所述第一类型与所述第二类型不同。
具体可以为,当所述触摸点数量满足第二数量范围时,终端1000将所述手指触摸操作分发至所述第二窗口3000,并在所述第二窗口3000上响应所述手指触摸操作,所述第二数量范围与所述第一数量范围不同。
还可以为,当所述触控接触面积满足第二面积范围时,终端1000将所述笔头触控操作分发至所述第二窗口3000,并在所述第二窗口3000上响应所述笔头触控操作,所述第二面积范围与所述第一面积范围不同。
其中，在一种具体的方式中，终端1000获取所述笔头触控操作的触控位置，并在所述第二窗口上的所述触控位置处输入文本信息。
可选的,终端1000获取针对所述第二窗口3000输入的标识添加请求,基于所述标识添加请求在所述第二窗口3000上添加属性标识,当在第二窗口3000上进行响应时,具体可以为拦截分发至所述第一窗口2000的所述触摸操作,并在将所述触摸操作回传至所述属性标识对应的第二窗口3000时,触发在所述第二窗口3000上响应所述触摸操作。
本申请实施例的方案在执行时,终端在显示界面显示具有重叠区域的第一窗口和第二窗口,并接收在该重叠区域的触摸操作,当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作;当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作。通过输入的触摸操作的不同的操作类型就可以实现在重叠窗口中不同层窗口进行响应,同时,不需要修改操作分发逻辑,只需要基于触摸操作的操作类型的匹配就可以实现在下层窗口响应触摸操作,简化了触摸穿透功能的实现过程,进而减小了触摸穿透功能的实现复杂度,并可以通过一套系统在不同的操作类型下在不同层窗口进行响应,提升用户体验。
下面将结合附图4-附图12,对本申请实施例提供的触摸操作响应方法进行详细介绍。其中,本申请实施例中的触摸操作响应装置可以是图3所示的终端。
请参见图4,为本申请实施例提供了一种触摸操作响应方法的流程示意图。如图4所示,本申请实施例的所述方法可以包括以下步骤:
S101,在显示界面显示第一窗口和第二窗口,所述第一窗口和所述第二窗口具有重叠区域;
所显示的第一窗口和第二窗口为不同的窗口，且第一窗口与第二窗口具有重叠区域，两窗口可以部分重叠，也可以完全重叠。也就是说，两窗口之间具有层次关系，若第二窗口覆盖在第一窗口上，则第二窗口为上层窗口，第一窗口为下层窗口。
两窗口可以分别对应不同的应用。可通过开启两应用从而开启两窗口,也可以是开启第一窗口后,再创建第二窗口。创建第二窗口的方式具体可参见系统架构实施例,此处不再赘述。
在两窗口都开启后,则在同一显示界面显示这两个窗口。而具体可按照预设的显示规则对两窗口进行显示。
其中,所述预设的显示规则可以为窗口显示大小、窗口显示位置、窗口显示风格(静态、动态)等。
S102,接收在所述重叠区域的触摸操作;
所述触摸操作即为用户针对两窗口的重叠区域所输入的操作。该触摸操作中可以包括触摸点数量、触摸点指纹、触摸点触摸压力值、触摸点接触面积、触摸轨迹等触摸信息。当触摸操作响应装置接收到该触摸操作时,获取该触摸操作中的触摸信息。
需要说明的是,在该重叠区域预先创建有操作接收视图,所输入的触摸操作即为在该操作接收视图上输入。
S103,当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作;
第一类型可以是任意操作类型,具体可基于触摸操作的触摸信息进行划分。
例如，当触摸信息包括触摸点数量时，第一类型可以为等于1、大于1、大于等于1且小于等于3，等等。
又例如，当触摸信息包括触摸接触面积时，第一类型可以为大于或者等于预设面积阈值、小于预设面积阈值、在预设的面积阈值范围内或在预设的面积阈值范围外等。
通过将触摸操作的触摸信息与第一类型进行比对,若比对一致,则在第一窗口上响应所输入的触摸操作。若第一窗口为下层窗口,则为在下层窗口上进行响应,从而实现了触摸穿透功能。
S104,当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作,所述第一类型与所述第二类型不同。
相应的,第二类型与第一类型不同,同样是根据触摸信息进行划分。当然,第二类型与第一类型可以是基于相同的触摸信息划分,也可以是基于不同的触摸信息划分。
例如,第二类型与第一类型可以是都基于触摸点数量进行划分。第一类型为第一预设数量,第二类型为第二预设数量,且第一预设数量与第二预设数量不同。
又例如,第二类型可以是基于触摸点数量划分,第一类型可以是基于触摸接触面积划分。
通过将触摸操作的触摸信息与第二类型进行比对,若比对一致,则在第二窗口上响应所输入的触摸操作。若第二窗口为上层窗口,则为在上层窗口上进行响应,从而实现了触摸操作在窗口上的基本响应。
在本申请实施例中,终端在显示界面显示具有重叠区域的第一窗口和第二窗口,并接收在该重叠区域的触摸操作,当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作;当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作。通过输入的触摸操作的不同的操作类型就可以实现在重叠窗口中不同层窗口进行响应,同时,不需要修改操作分发逻辑,只需要基于触摸操作的操作类型的匹配就可以实现在下层窗口响应触摸操作,简化了触摸穿透功能的实现过程,进而减小了触摸穿透功能的实现复杂度,并可以通过一套系统在不同的操作类型下在不同层窗口进行响应,提升用户体验。
请参见图5,为本申请实施例提供了一种触摸操作响应方法的流程示意图。本实施例以触摸操作响应方法应用于终端中来举例说明。该触摸操作响应方法可以包括以下步骤:
S201,在显示界面显示第一窗口和第二窗口,所述第一窗口和所述第二窗口具有重叠区域;
具体可参见S101,此处不再赘述。
S202,获取针对所述第二窗口输入的标识添加请求,基于所述标识添加请求在所述第二窗口上添加属性标识;
属性标识用于唯一识别第二窗口,可以包括TAG标识、ID等。对第二窗口添加属性标识是为了方便对第二窗口进行搜索,从而可以快速确定该第二窗口为定向窗口。
用户对第二窗口进行编辑操作,从而生成标识添加请求,终端在接收到该请求后,对第二窗口添加属性标识,该属性标识可以添加在第二窗口的任何显示区域,如左上角。而所添加的属性标识可以是在底层标识库中查找到的,也可以是当前生成的。
其中一种可行的实现方式为,每个窗口都有一属性标识,每个属性标识采用一组二进制码表示,可以包括多位(如32位),在底层标识库中查找第二窗口的属性标识,并在第二窗口上进行标记。
例如,如表1所示为在底层标识库中存储的一种各窗口的属性标识,通过查找表1可得到第二窗口的属性标识为001011…1。
表1
窗口 属性标识
第一窗口 100001…1
第二窗口 001011…1
第三窗口 110011…0
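One hedged way to realize the attribute tag (属性标识) described above in code, assuming the second window is backed by a root View that the application controls, is Android's View.setTag()/findViewWithTag(). The helper class name and the tag string "second_window" below are assumptions for illustration only.

```java
import android.view.View;
import android.view.ViewGroup;

// Sketch only: mark the second window's root view with an attribute tag and
// locate it again later. The tag value "second_window" is an assumption.
final class WindowTagHelper {
    static final String SECOND_WINDOW_TAG = "second_window";

    static void markSecondWindow(View secondWindowRoot) {
        secondWindowRoot.setTag(SECOND_WINDOW_TAG);
    }

    static View findSecondWindow(ViewGroup container) {
        // Returns the tagged view if it is attached under this container, else null.
        return container.findViewWithTag(SECOND_WINDOW_TAG);
    }
}
```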
S203,获取针对所述显示界面上的重叠区域输入的视图创建指令,在所述重叠区域创建操作接收视图;
打开一个窗口,实际上就是打开了一个Activity,在Activity中创建视图(View)可以包括如下两种方式:
其一,调用Activity的onCreate方法,并执行setContentView,从而创建View对象;其二,获取一个WindowManager,并调用其addView方法,将视图交给WindowManagerService进行管理。
所创建的视图用于接收用户输入的手指触摸操作。需要说明的是,该视图位于第二窗口上两窗口(第一窗口与第二窗口)的重叠区域。
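A minimal sketch of the second route (WindowManager.addView) is given below: a translucent operation-receiving view is added over the overlap region of the two windows. The overlap rectangle passed in, the window type TYPE_APPLICATION and the installer class name are assumptions made for illustration, not the patent's exact implementation.

```java
import android.app.Activity;
import android.graphics.PixelFormat;
import android.graphics.Rect;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

// Sketch: add a translucent operation-receiving view that covers the overlap
// region of the first and second windows.
final class OverlapViewInstaller {
    static View install(Activity activity, Rect overlapArea, View touchReceivingView) {
        WindowManager wm = activity.getWindowManager();
        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                overlapArea.width(), overlapArea.height(),
                WindowManager.LayoutParams.TYPE_APPLICATION,      // assumed window type
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                PixelFormat.TRANSLUCENT);
        lp.gravity = Gravity.TOP | Gravity.START;
        lp.x = overlapArea.left;                                  // position the view over the overlap area
        lp.y = overlapArea.top;
        wm.addView(touchReceivingView, lp);                       // hand the view to WindowManagerService
        return touchReceivingView;
    }
}
```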
S204,接收在所述操作接收视图上输入的手指触摸操作,获取所述手指触摸操作的触摸点数量;
由于所创建的操作接收视图是用于接收触摸操作,因此,当用户针对该视图进行手指触摸时,终端才能感应到所产生的触摸操作并进行响应。
手指触摸操作可以为单指触摸操作,也可为多指触摸操作,可通过触摸点数量进行区分。
在一种可行的实现方式中,区分单指与多指主要是根据底层操作的pointerCount来进行区分。若当前pointerCount大于等于2,则确定为多指触摸,若当前pointerCount为1,则确定为单指触摸。
其中,在安卓系统中,可采用MotionEvent类中的getPointerCount获取触摸点的数量,若返回1,表明一个手指按压了屏幕,若返回2,表明两个手指同时按压了屏幕。
可选的,手指触摸操作还可包括触摸点状态、触摸点坐标、触摸压力值、触摸指纹等信息。
其中,可通过event.getAction()来获取当前的操作的触摸点状态,单点按下、松开和移动的操作分别是:MotionEvent.ACTION_DOWN、ACTION_UP、ACTION_MOVE。
可通过event.getX()、event.getY()获取触摸点坐标。若有多个触摸点,通过event.getX(0)、event.getY(0)来获取第一个点的坐标值,通过event.getX(1)、event.getY(1)来获取第二个点的坐标值。如果有更多的点,依次类推。
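The MotionEvent accessors referred to above (getPointerCount, getActionMasked, per-pointer getX/getY) can be combined as in the following sketch; it is an illustrative reading of the standard API rather than the patent's own code, and the class name and log tag are assumptions.

```java
import android.util.Log;
import android.view.MotionEvent;

// Sketch: read the action, pointer count and per-pointer coordinates of a touch operation.
final class TouchInfoReader {
    private static final String TAG = "TouchInfoReader"; // assumed log tag

    static void dump(MotionEvent event) {
        int action = event.getActionMasked();        // ACTION_DOWN / ACTION_MOVE / ACTION_UP ...
        int pointerCount = event.getPointerCount();  // 1 = single finger, >= 2 = multi-finger
        for (int i = 0; i < pointerCount; i++) {
            Log.d(TAG, "action=" + action
                    + " pointer=" + i
                    + " x=" + event.getX(i)
                    + " y=" + event.getY(i));
        }
    }
}
```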
S205,当所述触摸点数量满足第一数量范围时,将所述手指触摸操作分发至所述第一窗口,在所述第一窗口上响应所述手指触摸操作。
第一数量范围:所设定的任一范围,取值在0~预设数量阈值之间,或者预设数量阈值~10之间。在本申请实施例中,以第一数量范围为0~1为例进行描述。
当检测到当前的触摸点数量为1时，确定为单指触摸，则将该手指触摸操作分发到两重叠窗口中位于下层的第一窗口。
其中,操作分发是指当一个触摸操作(MotionEvent)产生之后,系统需要把它传递给一个具体的视图(View)的过程,在本申请实施例中,则是将该手指触摸操作传递给第一窗口。可以理解的是,第一窗口中包含接收操作的视图,那么也就是将该手指触摸操作传递给第一窗口中接收操作的视图。
在安卓系统中,操作分发可以理解为:用户接触显示屏产生触摸操作(MotionEvent),该操作由Activity接收,Activity接收后将该操作进行传递,传递过程为:Activity->Window->DecorView(DecorView是当前界面的底层容器,是一个ViewGroup)->执行ViewGroup的dispatchTouchEvent(),其中,dispatchTouchEvent()用来进行操作的分发。
在第一窗口上响应手指触摸操作，可以理解为，在第一窗口上对该手指触摸操作进行处理或消耗。
在安卓系统中，可通过在dispatchTouchEvent()中调用onTouchEvent()，并通过返回结果确定该手指触摸操作是否被消耗，若返回false，表示不消耗，且在同一个操作（手指触摸操作）序列中该窗口（窗口中的视图）不会再次接收到该操作；若返回true，表明消耗。
在一种具体的实现方式中，终端将该手指触摸操作分别分发至位于上层的第二窗口以及位于下层的第一窗口，针对第二窗口，在dispatchTouchEvent()中调用onTouchEvent()，并返回false，此时，触发针对第一窗口在dispatchTouchEvent()中调用onTouchEvent()，响应该手指触摸操作并返回true，从而实现了手指触摸操作在重叠窗口上的穿透响应。
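A hedged sketch of the second-window side of this scheme is given below: the operation-receiving view of the upper (second) window consumes only multi-finger events and returns false for single-finger events, so that, under the both-windows dispatch described above, the single-finger event can be consumed by the lower (first) window instead. The class name and the fixed threshold of one pointer are assumptions.

```java
import android.content.Context;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.View;

// Sketch: operation-receiving view of the upper (second) window.
// Single-finger events are not consumed (return false); multi-finger events
// are handled and consumed here.
public class PassThroughTouchView extends View {
    public PassThroughTouchView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getPointerCount() <= 1) {
            return false;                    // do not consume: leave the event to the first window
        }
        handleMultiFingerGesture(event);     // e.g. page turning in the second window
        return true;                         // consume multi-finger events in the second window
    }

    private void handleMultiFingerGesture(MotionEvent event) {
        // Placeholder for second-window handling (page turn, roam, etc.).
    }
}
```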
例如，若第一窗口如图6所示，第二窗口如图7所示，两窗口完全重叠，当用户在第二窗口上输入手指触摸操作时，获取该操作的位置（即触摸点的坐标），若该位置在第一窗口上对应红包的“開”处，且该位置在第二窗口上对应歌曲名“丑八怪”，若为单指触摸，则打开该红包，实现收红包的功能，且不播放歌曲“丑八怪”。
可选的,当所述触摸点数量满足第一数量范围时,终端将所述手指触摸操作同时也分发至第二窗口,并可在第一窗口上以及第二窗口上同时响应该手指触摸操作。
例如,若第一窗口如图6所示,第二窗口如图7所示,两窗口完全重叠,当用户在第二窗口上输入手指触摸操作时,获取该触摸操作的位置(即触摸点的坐标),若该位置在第一窗口上对应红包的“開”处,且该位置在第二窗口上对应歌曲名“丑八怪”,若为单指触摸,则播放歌曲“丑八怪”,且同步打开该红包,实现收红包的功能。
S206,当所述触摸点数量满足第二数量范围时,将所述手指触摸操作分发至所述第二窗口,所述第二数量范围与所述第一数量范围不同;
第二数量范围为区别于第一数量范围的任一范围，同样取值在0~预设数量阈值之间，或者预设数量阈值~10之间。第一数量范围可以大于第二数量范围，也可小于第二数量范围。在本申请实施例中，以第二数量范围大于1为例进行描述。
当检测到当前的触摸点数量大于1时,确定为多点触摸,则将该手指触摸操作分发到两重叠窗口中位于上层的第二窗口。
其中,可基于第二窗口的属性标识进行手指触摸操作的定向分发。
S207,拦截分发至所述第一窗口的所述手指触摸操作,并在将所述手指触摸操作回传至所述属性标识对应的第二窗口时,触发在所述第二窗口上响应所述手指触摸操作。
在将手指触摸操作分发至第二窗口后,并对分发至第一窗口的手指触摸操作进行拦截即回传,结束操作分发流程,从而可触发在第二窗口上对该手指触摸操作进行响应。其中,对于安卓系统,可在dispatchTouchEvent()中调用onInterceptTouchEvent()来拦截该手指触摸操作。
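On the same assumptions, one possible in-application realization of this interception step uses onInterceptTouchEvent in a parent container, as in the sketch below. The container class and the routing call back to the tagged second-window view are illustrative placeholders, not the patent's exact dispatch logic.

```java
import android.content.Context;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.widget.FrameLayout;

// Sketch: a container that intercepts multi-finger events before they reach
// the children belonging to the first window, so the tagged second window
// can respond instead.
public class InterceptingContainer extends FrameLayout {
    public InterceptingContainer(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public boolean onInterceptTouchEvent(MotionEvent ev) {
        // Intercept multi-finger events; single-finger events keep flowing
        // towards the child views (i.e. towards the first window).
        return ev.getPointerCount() > 1;
    }

    @Override
    public boolean onTouchEvent(MotionEvent ev) {
        respondInSecondWindow(ev);   // forward to the view tagged as the second window
        return true;
    }

    private void respondInSecondWindow(MotionEvent ev) {
        // Placeholder: e.g. findViewWithTag("second_window").dispatchTouchEvent(ev);
    }
}
```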
在第二窗口上响应的手指触摸操作可以包括翻页、书写、漫游、开启另一页面等多个操作。
例如,若第一窗口如图6所示,第二窗口如图8a所示,当用户采用两个手指触摸第二窗口时,则在第二窗口上进行电子书翻页操作,如图8b所示,而对第一窗口上的抢红包不进行响应。
在本申请实施例中，终端接收在所显示的两重叠窗口中位于上层的第二窗口输入的手指触摸操作，并获取该手指触摸操作的触摸点数量，当触摸点数量满足第一数量范围时，将手指触摸操作分发至两重叠窗口中位于下层的第一窗口，在第一窗口上响应该手指触摸操作。当触摸点数量满足不同于第一数量范围的第二数量范围时，将手指触摸操作分发至两重叠窗口中位于上层的第二窗口，在第二窗口上响应该手指触摸操作。一方面，只需要通过监测触摸点的数量，并在该触摸点数量满足要求时，就可以将该手指触摸操作直接分发至下层窗口，简化了触摸穿透功能的实现过程，进而减小了触摸穿透功能的实现复杂度；另一方面，可以通过区分手指触摸操作的触摸点数量而在不同层窗口响应手指触摸操作，通过一套系统可以实现不同层窗口的响应功能，所适用的应用窗口种类较多，且增加了重叠窗口之间的交互性。
请参见图9,为本申请实施例提供了一种触控操作响应方法的流程示意图。本实施例以触控操作响应方法应用于终端中来举例说明。该触控操作响应方法可以包括以下步骤:
S301,在显示界面显示第一窗口和第二窗口,所述第一窗口和所述第二窗口具有重叠区域;
具体可参见S101,此处不再赘述。
S302,获取针对所述第二窗口输入的标识添加请求,基于所述标识添加请求在所述第二窗口上添加属性标识;
具体可参见S202,此处不再赘述。
S303,获取针对所述显示界面上的重叠区域输入的视图创建指令,在所述重叠区域创建操作接收视图;
具体可参见S203,此处不再赘述。
S304,接收在所述操作接收视图上输入的笔头触控操作,获取所述笔头触控操作的触控接触面积;
打开一个窗口,实际上就是打开了一个Activity,在Activity中创建视图(View)可以包括如下两种方式:
其一,调用Activity的onCreate方法,并执行setContentView,从而创建View对象;其二,获取一个WindowManager,并调用其addView方法,将视图交给WindowManagerService进行管理。
所创建的视图用于接收用户输入的笔头触控操作。需要说明的是,该视图位于第二窗口上两窗口(第一窗口与第二窗口)的重叠区域。
由于所创建的操作接收视图是用于接收操作，因此，当用户针对该视图进行触控时，终端才能感应到所产生的操作并进行响应。
笔头触控操作可以为粗笔触控操作,也可为细笔触控操作,可通过触控接触面积进行区分。
在一种可行的实现方式中,通过比较触控接触面积与预设面积阈值的大小进行区分。若当前触控接触面积小于或者等于预设面积阈值,则确定为细笔触控,若当前触控接触面积大于预设面积阈值,则确定为粗笔触控。
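The standard MotionEvent API does not expose an absolute contact area; a common approximation is getSize() (or getTouchMajor()). The sketch below classifies thin vs. thick pen strokes on that assumption, and the normalized threshold value 0.1f is purely illustrative.

```java
import android.view.MotionEvent;

// Sketch: distinguish thin-pen vs. thick-pen input by approximate contact area.
// MotionEvent.getSize() returns a normalized, device-specific value, so the
// threshold below is an illustrative assumption only.
final class PenWidthClassifier {
    private static final float PRESET_AREA_THRESHOLD = 0.1f; // assumed value

    static boolean isThinPen(MotionEvent event) {
        // First area range: contact area <= preset threshold -> thin pen,
        // dispatched to the first (lower) window.
        return event.getSize(0) <= PRESET_AREA_THRESHOLD;
    }

    static boolean isThickPen(MotionEvent event) {
        // Second area range: contact area > preset threshold -> thick pen,
        // responded to in the second (upper) window.
        return !isThinPen(event);
    }
}
```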
可选的,笔头触控操作还可包括触控点状态、触控点坐标、触控压力值等信息。
其中,可通过event.getAction()来获取当前的操作的触控点状态,按下、松开和移动的操作分别是:MotionEvent.ACTION_DOWN、ACTION_UP、ACTION_MOVE。
可通过event.getX()、event.getY()获取触控点坐标。若有多个触控点,通过event.getX(0)、event.getY(0)来获取第一个点的坐标值,通过event.getX(1)、event.getY(1)来获取第二个点的坐标值。如果有更多的点,依次类推。
需要说明的是,本申请实施例的应用场景是两重叠的窗口,因此,所输入的粗笔触控操作是针对两重叠窗口的重叠区域。
S305,当所述触控接触面积满足第一面积范围时,将所述笔头触控操作分发至所述第一窗口,在所述第一窗口上响应所述笔头触控操作。
第一面积范围为所设定的任一取值范围,小于或者等于预设面积阈值,或者大于预设面积阈值。在本申请实施例中,以第一面积范围小于或者等于预设面积阈值为例进行描述。
当检测到当前的触控接触面积满足第一面积范围时,确定为细笔触控,则将该笔头触控操作分发到两重叠窗口中位于下层的第一窗口。
其中,操作分发是指当一个触摸操作(MotionEvent)产生之后,系统需要把它传递给一个具体的视图(View)的过程,在本申请实施例中,则是将该笔头触控操作传递给第一窗口。可以理解的是,第一窗口中包含接收操作的视图,那么也就是将该笔头触控操作传递给第一窗口中接收操作的视图。
在安卓系统中,操作分发可以理解为:用户接触显示屏产生触摸操作(MotionEvent),该操作由Activity接收,Activity接收后将该操作进行传递,传递过程为:Activity->Window->DecorView(DecorView是当前界面的底层容器,是一个ViewGroup)->执行ViewGroup的dispatchTouchEvent(),其中,dispatchTouchEvent()用来进行操作的分发。
在第一窗口上响应笔头触控操作,可以理解为,在第一窗口上对该笔头触控操作进行处理或消耗。
在安卓系统中，可通过在dispatchTouchEvent()中调用onTouchEvent()，并通过返回结果确定该笔头触控操作是否被消耗，若返回false，表示不消耗，且在同一个操作（笔头触控操作）序列中该窗口（窗口中的视图）不会再次接收到该操作；若返回true，表明消耗。
在一种具体的实现方式中，终端将该笔头触控操作分别分发至位于上层的第二窗口以及位于下层的第一窗口，针对第二窗口，在dispatchTouchEvent()中调用onTouchEvent()，并返回false，此时，触发针对第一窗口在dispatchTouchEvent()中调用onTouchEvent()，响应该笔头触控操作并返回true，从而实现了笔头触控操作在重叠窗口上的穿透响应。
例如，若第一窗口如图10所示，第二窗口如图11所示，两窗口完全重叠，当用户在第二窗口上输入笔头触控操作时，获取该操作的位置（即触控点的坐标），若该位置在第一窗口（PPT显示窗口）上对应链接“Http://www.abc.com”处，且该位置属于第二窗口（批注应用显示窗口）上一信息书写区域，若为细笔触控，则打开该链接对应的页面，且不在第二窗口上输入任何信息。
可选的,当所述触控接触面积满足第一面积范围时,终端将所述笔头触控操作同时也分发至第二窗口,并可在第一窗口上以及第二窗口上同时响应该笔头触控操作。
例如,若第一窗口如图10所示,第二窗口如图11所示,两窗口完全重叠,当用户在第二窗口上输入笔头触控操作时,获取该操作的位置(即触摸点的坐标),若该位置在第一窗口(PPT显示窗口)上对应链接“Http://www.abc.com”处,且该位置属于第二窗口(批注应用显示窗口)上一信息书写区域,若为细笔触控,则打开该链接对应的页面,且同步在第二窗口上进行信息书写。
S306,当所述触控接触面积满足第二面积范围时,将所述笔头触控操作分发至所述第二窗口,所述第二面积范围与所述第一面积范围不同;
第二面积范围为区别于第一面积范围的任一范围,可以小于或者等于预设面积阈值,也可以大于预设面积阈值。在本申请实施例中,当第一面积范围小于或者等于预设面积阈值时,第二面积范围大于预设面积阈值。
当检测到当前触控接触面积满足第二面积范围时,确定为粗笔触控,则将该笔头触控操作分发到两重叠窗口中位于上层的第二窗口。
其中,可基于第二窗口的属性标识进行笔头触控操作的定向分发。
S307,拦截分发至所述第一窗口的所述笔头触控操作,并在将所述笔头触控操作回传至所述属性标识对应的第二窗口时,获取所述笔头触控操作的触控位置;
在将笔头触控操作分发至第二窗口后,并对分发至第一窗口的笔头触控操作进行拦截即回传,结束操作分发流程,从而可触发在第二窗口上对该笔头触控操作进行响应。其中,对于安卓系统,可在dispatchTouchEvent()中调用onInterceptTouchEvent()来拦截该笔头触控操作。
在第二窗口上响应的笔头触控操作可以包括翻页、书写、漫游、开启另一页面等多个操作。
S308,在所述第二窗口上的所述触控位置处输入文本信息。
例如,若第一窗口如图10所示,第二窗口如图12所示,当用户粗笔触控第二窗口时,则在第二窗口上开启输入框以及弹出虚拟键盘,从而方便用户输入文本信息,同时,对第一窗口上的链接不进行响应。
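A hedged sketch of step S308 follows: an input box is placed at the recorded thick-pen touch position inside the second window and the soft keyboard is shown so text can be entered. The FrameLayout container and the margin-based positioning are assumptions about the second window's layout.

```java
import android.content.Context;
import android.view.inputmethod.InputMethodManager;
import android.widget.EditText;
import android.widget.FrameLayout;

// Sketch: create an input box at the touch position of the thick-pen operation
// and pop up the soft keyboard for text entry in the second window.
final class TextEntryHelper {
    static void showInputAt(FrameLayout secondWindowRoot, float touchX, float touchY) {
        Context context = secondWindowRoot.getContext();
        EditText input = new EditText(context);

        FrameLayout.LayoutParams lp = new FrameLayout.LayoutParams(
                FrameLayout.LayoutParams.WRAP_CONTENT,
                FrameLayout.LayoutParams.WRAP_CONTENT);
        lp.leftMargin = (int) touchX;   // position the input box at the touch location
        lp.topMargin = (int) touchY;
        secondWindowRoot.addView(input, lp);

        input.requestFocus();
        InputMethodManager imm =
                (InputMethodManager) context.getSystemService(Context.INPUT_METHOD_SERVICE);
        if (imm != null) {
            imm.showSoftInput(input, InputMethodManager.SHOW_IMPLICIT);
        }
    }
}
```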
在本申请实施例中，终端接收在所显示的两重叠窗口中位于上层的第二窗口输入的笔头触控操作，并获取该笔头触控操作的触控接触面积，当触控接触面积满足第一面积范围时，将笔头触控操作分发至两重叠窗口中位于下层的第一窗口，在第一窗口上响应该笔头触控操作。当触控接触面积满足第二面积范围时，将笔头触控操作分发至两重叠窗口中位于上层的第二窗口，在第二窗口上响应该笔头触控操作。一方面，只需要通过监测触控接触面积的大小，并在该触控接触面积满足要求时，就可以将该笔头触控操作直接分发至下层窗口，简化了触摸穿透功能的实现过程，进而减小了触摸穿透功能的实现复杂度；另一方面，可以通过区分笔头触控操作的触控接触面积而在不同层窗口响应笔头触控操作，通过一套系统可以实现不同层窗口的响应功能，所适用的应用窗口种类较多，且增加了重叠窗口之间的交互性。
下述为本申请装置实施例,可以用于执行本申请方法实施例。对于本申请装置实施例中未披露的细节,请参照本申请方法实施例。
请参见图13,其示出了本申请一个示例性实施例提供的触摸操作响应装置的结构示意图。该触摸操作响应装置可以通过软件、硬件或者两者的结合实现成为终端的全部或一部分。该装置包括窗口显示模块11、操作接收模块12、第一响应模块13和第二响应模块14。
窗口显示模块11,用于在显示界面显示第一窗口和第二窗口,所述第一窗口和所述第二窗口具有重叠区域;
操作接收模块12,用于接收在所述重叠区域的触摸操作;
第一响应模块13,用于当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作;
第二响应模块14,用于当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作,所述第一类型与所述第二类型不同。
可选的,所述操作接收模块12,具体用于:
接收在所述重叠区域的手指触摸操作,获取所述手指触摸操作的触摸点数量;
所述第一响应模块,具体用于:
当所述触摸点数量满足第一数量范围时,将所述手指触摸操作分发至所述第一窗口,并在所述第一窗口上响应所述手指触摸操作。
所述第二响应模块,具体用于:
当所述触摸点数量满足第二数量范围时,将所述手指触摸操作分发至所述第二窗口,并在所述第二窗口上响应所述手指触摸操作,所述第二数量范围与所述第一数量范围不同。
可选的,所述操作接收模块,具体用于:
接收在所述重叠区域的笔头触控操作,获取所述笔头触控操作的触控接触面积;
所述第一响应模块,具体用于:
当所述触控接触面积满足第一面积范围时,将所述笔头触控操作分发至所述第一窗口,并在所述第一窗口上响应所述笔头触控操作。
所述第二响应模块,具体用于:
当所述触控接触面积满足第二面积范围时,将所述笔头触控操作分发至所述第二窗口,并在所述第二窗口上响应所述笔头触控操作,所述第二面积范围与所述第一面积范围不同。
可选的,所述第二响应模块,具体用于:
获取所述笔头触控操作的触控位置;
在所述第二窗口上的所述触控位置处输入文本信息。
可选的,如图14所示,所述装置1还包括:
视图创建模块15,用于获取针对所述显示界面上的重叠区域输入的视图创建指令,在所述重叠区域创建操作接收视图;
所述操作接收模块12,具体用于:
接收在所述操作接收视图上输入的触摸操作。
可选的,如图14所示,所述装置1还包括:
标识添加模块16,用于获取针对所述第二窗口输入的标识添加请求,基于所述标识添加请求在所述第二窗口上添加属性标识;
所述第二响应模块14,具体用于:
拦截分发至所述第一窗口的所述触摸操作,并在将所述触摸操作回传至所述属性标识对应的第二窗口时,触发在所述第二窗口上响应所述触摸操作。
需要说明的是,上述实施例提供的触摸操作响应装置在执行触摸操作响应方法时,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将设备的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的触摸操作响应装置与触摸操作响应方法实施例属于同一构思,其体现实现过程详见方法实施例,这里不再赘述。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
在本申请实施例中,终端在显示界面显示具有重叠区域的第一窗口和第二窗口,并接收在该重叠区域的触摸操作,当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作;当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作。通过输入的触摸操作的不同的操作类型就可以实现在重叠窗口中不同层窗口进行响应,同时,不需要修改操作分发逻辑就可以实现将触摸操作分发至重叠窗口中的下层窗口,简化了触摸穿透功能的实现过程,进而减小了触摸穿透功能的实现复杂度。
本申请实施例还提供了一种计算机存储介质,所述计算机存储介质可以存储有多条指令,所述指令适于由处理器加载并执行如上述图4-图12所示实施例的方法步骤,具体执行过程可以参见图4-图12所示实施例的具体说明,在此不进行赘述。
本申请还提供了一种计算机程序产品，该计算机程序产品存储有至少一条指令，所述至少一条指令由处理器加载并执行以实现如上各个实施例所述的触摸操作响应方法。
请参见图15,为本申请实施例提供了一种终端的结构示意图。如图15所示,所述终端1000可以包括:至少一个处理器1001,至少一个网络接口1004,用户接口1003,存储器1005,至少一个通信总线1002。
其中,通信总线1002用于实现这些组件之间的连接通信。
其中,用户接口1003可以包括显示屏(Display)、摄像头(Camera),可选用户接口1003还可以包括标准的有线接口、无线接口。
其中,网络接口1004可选的可以包括标准的有线接口、无线接口(如WI-FI接口)。
其中，处理器1001可以包括一个或者多个处理核心。处理器1001利用各种接口和线路连接整个终端1000内的各个部分，通过运行或执行存储在存储器1005内的指令、程序、代码集或指令集，以及调用存储在存储器1005内的数据，执行终端1000的各种功能和处理数据。可选的，处理器1001可以采用数字信号处理(Digital Signal Processing，DSP)、现场可编程门阵列(Field-Programmable Gate Array，FPGA)、可编程逻辑阵列(Programmable Logic Array，PLA)中的至少一种硬件形式来实现。处理器1001可集成中央处理器(Central Processing Unit，CPU)、图像处理器(Graphics Processing Unit，GPU)和调制解调器等中的一种或几种的组合。其中，CPU主要处理操作系统、用户界面和应用程序等；GPU用于负责显示屏所需要显示的内容的渲染和绘制；调制解调器用于处理无线通信。可以理解的是，上述调制解调器也可以不集成到处理器1001中，单独通过一块芯片进行实现。
其中,存储器1005可以包括随机存储器(Random Access Memory,RAM),也可以包括只读存储器(Read-Only Memory)。可选的,该存储器1005包括非瞬时性计算机可读介质(non-transitory computer-readable storage medium)。存储器1005可用于存储指令、程序、代码、代码集或指令集。存储器1005可包括存储程序区和存储数据区,其中,存储程序区可存储用于实现操作系统的指令、用于至少一个功能的指令(比如触控功能、声音播放功能、图像播放功能等)、用于实现上述各个方法实施例的指令等;存储数据区可存储上面各个方法实施例中涉及到的数据等。存储器1005可选的还可以是至少一个位于远离前述处理器1001的存储装置。如图15所示,作为一种计算机存储介质的存储器1005中可以包括操作系统、网络通信模块、用户接口模块以及触摸操作响应应用程序。
在图15所示的终端1000中,用户接口1003主要用于为用户提供输入的接口,获取用户输入的数据;而处理器1001可以用于调用存储器1005中存储的触摸操作响应应用程序,并具体执行以下操作:
在显示界面显示第一窗口和第二窗口,所述第一窗口和所述第二窗口具有重叠区域;
接收在所述重叠区域的触摸操作;
当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作;
当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作,所述第一类型与所述第二类型不同。
在一个实施例中,所述处理器1001在执行接收在所述重叠区域的触摸操作时,具体执行以下操作:
接收在所述重叠区域的手指触摸操作,获取所述手指触摸操作的触摸点数量;
所述处理器1001在执行当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作时,具体执行以下操作:
当所述触摸点数量满足第一数量范围时,将所述手指触摸操作分发至所述第一窗口,并在所述第一窗口上响应所述手指触摸操作;
所述处理器1001在执行当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作时,具体执行以下操作:
当所述触摸点数量满足第二数量范围时,将所述手指触摸操作分发至所述第二窗口,并在所述第二窗口上响应所述手指触摸操作,所述第二数量范围与所述第一数量范围不同。
在一个实施例中,所述处理器1001在执行接收在所述重叠区域的触摸操作时,具体执行以下操作:
接收在所述重叠区域的笔头触控操作,获取所述笔头触控操作的触控接触面积;
所述处理器1001在执行当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作时,具体执行以下操作:
当所述触控接触面积满足第一面积范围时,将所述笔头触控操作分发至所述第一窗口,并在所述第一窗口上响应所述笔头触控操作。
所述处理器1001在执行当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作时,具体执行以下操作:
当所述触控接触面积满足第二面积范围时,将所述笔头触控操作分发至所述第二窗口,并在所述第二窗口上响应所述笔头触控操作,所述第二面积范围与所述第一面积范围不同。
在一个实施例中,所述处理器1001在执行在所述第二窗口上响应所述笔头触控操作时,具体执行以下操作:
获取所述笔头触控操作的触控位置;
在所述第二窗口上的所述触控位置处输入文本信息。
在一个实施例中,所述处理器1001在执行接收在所述重叠区域的触摸操作之前,还执行以下操作:
获取针对所述显示界面上的重叠区域输入的视图创建指令,在所述重叠区域创建操作接收视图;
所述处理器1001在执行接收在所述重叠区域的触摸操作时,具体执行以下操作:
接收在所述操作接收视图上输入的触摸操作。
在一个实施例中,所述处理器1001还执行以下操作:
获取针对所述第二窗口输入的标识添加请求,基于所述标识添加请求在所述第二窗口上添加属性标识;
所述处理器1001在执行在所述第二窗口上响应所述触摸操作时,具体执行以下操作:
拦截分发至所述第一窗口的所述触摸操作,并在将所述触摸操作回传至所述属性标识对应的第二窗口时,触发在所述第二窗口上响应所述触摸操作。
在本申请实施例中,终端在显示界面显示具有重叠区域的第一窗口和第二窗口,并接收在该重叠区域的触摸操作,当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作;当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作。通过输入的触摸操作的不同的操作类型就可以实现在重叠窗口中不同层窗口进行响应,同时,不需要修改操作分发逻辑就可以实现将触摸操作分发至重叠窗口中的下层窗口,简化了触摸穿透功能的实现过程,进而减小了触摸穿透功能的实现复杂度。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,是可以通过计算机程序来指令相关的硬件来完成,所述的程序可存储于一计算机可读取存储介质中,该程序在执行时,可包括如上述各方法的实施例的流程。其中,所述的存储介质可为磁碟、光盘、只读存储记忆体或随机存储记忆体等。

Claims (14)

  1. 一种触摸操作响应方法,包括:
    在显示界面显示第一窗口和第二窗口,所述第一窗口和所述第二窗口具有重叠区域;
    接收在所述重叠区域的触摸操作;
    当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作;
    当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作,所述第一类型与所述第二类型不同。
  2. 根据权利要求1所述的触摸操作响应方法,其中,所述接收在所述重叠区域的触摸操作,包括:
    接收在所述重叠区域的手指触摸操作,获取所述手指触摸操作的触摸点数量;
    所述当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作,包括:
    当所述触摸点数量满足第一数量范围时,将所述手指触摸操作分发至所述第一窗口,并在所述第一窗口上响应所述手指触摸操作;
    所述当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作,包括:
    当所述触摸点数量满足第二数量范围时,将所述手指触摸操作分发至所述第二窗口,并在所述第二窗口上响应所述手指触摸操作,所述第二数量范围与所述第一数量范围不同。
  3. 根据权利要求1所述的触摸操作响应方法,其中,所述接收在所述重叠区域的触摸操作,包括:
    接收在所述重叠区域的笔头触控操作,获取所述笔头触控操作的触控接触面积;
    所述当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作,包括:
    当所述触控接触面积满足第一面积范围时,将所述笔头触控操作分发至所述第一窗口,并在所述第一窗口上响应所述笔头触控操作。
    所述当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作,包括:
    当所述触控接触面积满足第二面积范围时,将所述笔头触控操作分发至所述第二窗口,并在所述第二窗口上响应所述笔头触控操作,所述第二面积范围与所述第一面积范围不同。
  4. 根据权利要求3所述的触摸操作响应方法,其中,所述在所述第二窗口上响应所述笔头触控操作,包括:
    获取所述笔头触控操作的触控位置;
    在所述第二窗口上的所述触控位置处输入文本信息。
  5. 根据权利要求1所述的触摸操作响应方法,其中,所述接收在所述重叠区域的触摸操作之前,还包括:
    获取针对所述显示界面上的重叠区域输入的视图创建指令,在所述重叠区域创建操作接收视图;
    所述接收在所述重叠区域的触摸操作,包括:
    接收在所述操作接收视图上输入的触摸操作。
  6. 根据权利要求1所述的触摸操作响应方法,还包括:
    获取针对所述第二窗口输入的标识添加请求,基于所述标识添加请求在所述第二窗口上添加属性标识;
    所述在所述第二窗口上响应所述触摸操作,包括:
    拦截分发至所述第一窗口的所述触摸操作,并在将所述触摸操作回传至所述属性标识对应的第二窗口时,触发在所述第二窗口上响应所述触摸操作。
  7. 一种触摸操作响应装置,包括:
    窗口显示模块,用于在显示界面显示第一窗口和第二窗口,所述第一窗口和所述第二窗口具有重叠区域;
    操作接收模块,用于接收在所述重叠区域的触摸操作;
    第一响应模块,用于当所述触摸操作为第一类型时,在所述第一窗口上响应所述触摸操作;
    第二响应模块,用于当所述触摸操作为第二类型时,在所述第二窗口上响应所述触摸操作,所述第一类型与所述第二类型不同。
  8. 根据权利要求7所述的触摸操作响应装置,其中,所述操作接收模块,具体用于:
    接收在所述重叠区域的手指触摸操作,获取所述手指触摸操作的触摸点数量;
    所述第一响应模块,具体用于:
    当所述触摸点数量满足第一数量范围时,将所述手指触摸操作分发至所述第一窗口,并在所述第一窗口上响应所述手指触摸操作。
    所述第二响应模块,具体用于:
    当所述触摸点数量满足第二数量范围时,将所述手指触摸操作分发至所述第二窗口,并在所述第二窗口上响应所述手指触摸操作,所述第二数量范围与所述第一数量范围不同。
  9. 根据权利要求7所述的触摸操作响应装置,其中,所述操作接收模块,具体用于:
    接收在所述重叠区域的笔头触控操作,获取所述笔头触控操作的触控接触面积;
    所述第一响应模块,具体用于:
    当所述触控接触面积满足第一面积范围时,将所述笔头触控操作分发至所述第一窗口,并在所述第一窗口上响应所述笔头触控操作。
    所述第二响应模块,具体用于:
    当所述触控接触面积满足第二面积范围时,将所述笔头触控操作分发至所述第二窗口,并在所述第二窗口上响应所述笔头触控操作,所述第二面积范围与所述第一面积范围不同。
  10. 根据权利要求9所述的触摸操作响应装置,其中,所述第二响应模块,具体用于:
    获取所述笔头触控操作的触控位置;
    在所述第二窗口上的所述触控位置处输入文本信息。
  11. 根据权利要求7所述的触摸操作响应装置,其中,所述装置还包括:
    视图创建模块,用于获取针对所述显示界面上的重叠区域输入的视图创建指令,在所述重叠区域创建操作接收视图;
    所述操作接收模块,具体用于:
    接收在所述操作接收视图上输入的触摸操作。
  12. 根据权利要求7所述的触摸操作响应装置,其中,所述装置还包括:
    标识添加模块,用于获取针对所述第二窗口输入的标识添加请求,基于所述标识添加请求在所述第二窗口上添加属性标识;
    所述第二响应模块,具体用于:
    拦截分发至所述第一窗口的所述触摸操作,并在将所述触摸操作回传至所述属性标识对应的第二窗口时,触发在所述第二窗口上响应所述触摸操作。
  13. 一种计算机存储介质,其中,所述计算机存储介质存储有多条指令,所述指令适于由处理器加载并执行如权利要求1~6任意一项的方法步骤。
  14. 一种终端,包括:处理器和存储器;其中,所述存储器存储有计算机程序,所述计算机程序适于由所述处理器加载并执行如权利要求1~6任意一项的方法步骤。
PCT/CN2019/123366 2019-02-19 2019-12-05 触摸操作响应方法、装置、存储介质及终端 WO2020168786A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910122474.6 2019-02-19
CN201910122474.6A CN109885244A (zh) 2019-02-19 2019-02-19 一种触摸操作响应方法、装置、存储介质及终端

Publications (1)

Publication Number Publication Date
WO2020168786A1 true WO2020168786A1 (zh) 2020-08-27

Family

ID=66928508

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/123366 WO2020168786A1 (zh) 2019-02-19 2019-12-05 触摸操作响应方法、装置、存储介质及终端

Country Status (2)

Country Link
CN (1) CN109885244A (zh)
WO (1) WO2020168786A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109885244A (zh) * 2019-02-19 2019-06-14 广州视源电子科技股份有限公司 一种触摸操作响应方法、装置、存储介质及终端
CN110908580B (zh) * 2019-11-11 2021-11-02 广州视源电子科技股份有限公司 控制应用的方法和装置
CN111190674B (zh) * 2019-12-23 2021-08-10 广州朗国电子科技有限公司 统一处理触摸穿透方法、装置、存储介质及一体机设备
CN112306331B (zh) * 2020-10-26 2021-10-22 广州朗国电子科技股份有限公司 触摸穿透处理方法、装置、存储介质及一体机
CN116301441A (zh) * 2023-05-24 2023-06-23 北京翼鸥教育科技有限公司 屏幕触控指令执行方法、装置、电子设备及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170038428A (ko) * 2015-09-30 2017-04-07 엘지이노텍 주식회사 터치 윈도우
CN107038112A (zh) * 2016-10-13 2017-08-11 腾讯科技(北京)有限公司 应用界面的调试方法及装置
CN108829327A (zh) * 2018-05-07 2018-11-16 广州视源电子科技股份有限公司 交互智能设备的书写方法和装置
CN109885244A (zh) * 2019-02-19 2019-06-14 广州视源电子科技股份有限公司 一种触摸操作响应方法、装置、存储介质及终端

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105988689A (zh) * 2015-02-11 2016-10-05 阿里巴巴集团控股有限公司 一种信息展示方法及装置
KR102468120B1 (ko) * 2016-01-27 2022-11-22 삼성전자 주식회사 뷰 계층(뷰 레이어)들을 이용하여 입력을 처리하는 방법 및 전자장치
CN106445278A (zh) * 2016-08-31 2017-02-22 冠捷显示科技(厦门)有限公司 一种透明悬浮窗的控制方法
JP6747378B2 (ja) * 2017-05-17 2020-08-26 京セラドキュメントソリューションズ株式会社 表示入力装置およびそれを備えた画像形成装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170038428A (ko) * 2015-09-30 2017-04-07 엘지이노텍 주식회사 터치 윈도우
CN107038112A (zh) * 2016-10-13 2017-08-11 腾讯科技(北京)有限公司 应用界面的调试方法及装置
CN108829327A (zh) * 2018-05-07 2018-11-16 广州视源电子科技股份有限公司 交互智能设备的书写方法和装置
CN109885244A (zh) * 2019-02-19 2019-06-14 广州视源电子科技股份有限公司 一种触摸操作响应方法、装置、存储介质及终端

Also Published As

Publication number Publication date
CN109885244A (zh) 2019-06-14

Similar Documents

Publication Publication Date Title
WO2020168786A1 (zh) 触摸操作响应方法、装置、存储介质及终端
US10908703B2 (en) User terminal device and method for controlling the user terminal device thereof
US11467715B2 (en) User interface display method, terminal and non-transitory computer-readable storage medium for splitting a display using a multi-finger swipe
US9996176B2 (en) Multi-touch uses, gestures, and implementation
US11650716B2 (en) Operation methods of a smart interactive tablet, storage medium and related equipment
US9658766B2 (en) Edge gesture
US20160342779A1 (en) System and method for universal user interface configurations
US20120304107A1 (en) Edge gesture
US20120304131A1 (en) Edge gesture
WO2021072926A1 (zh) 文件共享方法、装置、系统、交互智能设备、源端设备及存储介质
WO2021164460A1 (zh) 触摸响应方法、装置、电子设备及存储介质
US20140123036A1 (en) Touch screen display process
US10855481B2 (en) Live ink presence for real-time collaboration
WO2020248547A1 (zh) 一种窗口最小化方法、装置、存储介质及交互智能平板
CN108170338A (zh) 信息处理方法、装置、电子设备及存储介质
CN112948049B (zh) 多内容并行显示的方法、装置、终端及存储介质
US10466863B1 (en) Predictive insertion of graphical objects in a development environment
US9904461B1 (en) Method and system for remote text selection using a touchscreen device
WO2023231268A1 (zh) 快捷批注方法、装置、交互平板及存储介质
CN117234368A (zh) 触摸事件的响应方法、电子设备及可读存储介质
CN116048370A (zh) 显示设备及操作切换方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19915662

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19915662

Country of ref document: EP

Kind code of ref document: A1
