CN117472220A - Operation identification method and device

Info

Publication number
CN117472220A
CN117472220A (application CN202311199825.6A)
Authority
CN
China
Prior art keywords
window
interface
terminal device
control
terminal equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311199825.6A
Other languages
Chinese (zh)
Inventor
王自强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202311199825.6A priority Critical patent/CN117472220A/en
Publication of CN117472220A publication Critical patent/CN117472220A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F 3/0418: Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present application provides an operation identification method and device, relating to the field of terminal technologies. A terminal device displays a first interface, where the first interface includes a first control; the terminal device receives a first operation on the first control; in response to the first operation, the terminal device displays a first window over the first interface, where the first window is a non-full-screen window. When the terminal device receives a second operation of a stylus on the first window, the terminal device can determine whether to close the first window based on the sliding distance and the duration detected in the second operation, reducing cases in which a sliding operation is recognized as a click operation so that the first window cannot be closed.

Description

Operation identification method and device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an operation identification method and apparatus.
Background
With the popularization and development of the internet, the functional demands that people place on terminal devices are becoming more diverse. For example, to simplify the way electronic devices are used, the stylus has become an important tool for inputting data into, or controlling, a terminal device.
In general, the terminal device may recognize a sliding operation or a clicking operation of the stylus with respect to the display screen, and execute corresponding steps according to the recognized operation.
However, during a sliding operation of the stylus on the display screen, the stylus reports many points and those points may jitter, so the terminal device may recognize a plurality of events reported by the stylus during the sliding operation as click events, causing the user's intended operation to fail.
Disclosure of Invention
The embodiments of the present application provide an operation identification method and device. A terminal device can pop up a non-full-screen first window from the bottom of an interface and determine, according to the sliding distance and the duration of a stylus operation on the first window, whether to close the first window, so as to reduce cases in which a sliding operation is recognized as a click operation and the first window therefore cannot be closed.
In a first aspect, an embodiment of the present application provides an operation identification method. The method includes: a terminal device displays a first interface, where the first interface includes a first control; the terminal device receives a first operation on the first control; in response to the first operation, the terminal device displays a first window over the first interface, where the first window is a non-full-screen window; when the terminal device receives a second operation of a stylus on the first window, if the sliding distance in the second operation is less than a first threshold and the duration of the second operation is greater than a second threshold, the terminal device continues to display the first window; when the terminal device receives a third operation of the stylus on the first window, if the sliding distance in the third operation is greater than the first threshold and the duration of the third operation is greater than the second threshold, the terminal device closes the first window.
The first window may be a target window described in embodiments of the present application.
In this way, when the terminal device displays the first window, the terminal device can determine whether to close the first window based on the sliding distance and the duration detected in the second operation, reducing cases in which a sliding operation is recognized as a click operation so that the first window cannot be closed.
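As an illustration of this decision rule, the following is a minimal sketch under assumed threshold values, not the product implementation; WindowGestureRule, its constants, and the Decision type are hypothetical names introduced here:

    // Minimal sketch of the first-aspect decision rule. The threshold
    // values and all names here are assumptions for illustration.
    public final class WindowGestureRule {
        static final float FIRST_THRESHOLD_PX = 12f;   // assumed sliding-distance threshold
        static final long SECOND_THRESHOLD_MS = 100L;  // assumed duration threshold

        enum Decision { KEEP_WINDOW, CLOSE_WINDOW, UNDECIDED }

        static Decision decide(float slidingDistancePx, long durationMs) {
            if (durationMs > SECOND_THRESHOLD_MS) {
                if (slidingDistancePx < FIRST_THRESHOLD_PX) {
                    return Decision.KEEP_WINDOW;   // second operation: treated as a click
                }
                if (slidingDistancePx > FIRST_THRESHOLD_PX) {
                    return Decision.CLOSE_WINDOW;  // third operation: treated as a slide
                }
            }
            return Decision.UNDECIDED; // too early to classify
        }
    }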
In one possible implementation, the first threshold value is positively correlated with the screen size of the terminal device, or the first threshold value is positively correlated with the screen resolution of the terminal device.
It can be understood that the larger the screen size, the more points the stylus reports during the same operation, and the higher the screen resolution, the more pronounced the jitter in the points the stylus reports during the same operation. Therefore, so that devices with different screen resolutions (or screen sizes) can all achieve a good recognition effect, the terminal device can match different first thresholds to devices with different screen resolutions (or screen sizes).
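One simple way to realize this positive correlation (an assumption about how the threshold could be chosen, not the patented scheme) is to scale a baseline dp value by the Android display density, which itself grows with screen resolution:

    import android.content.Context;
    import android.util.DisplayMetrics;

    // Illustrative only: a first threshold that grows with screen density,
    // so higher-resolution screens tolerate more report-point jitter.
    // BASE_THRESHOLD_DP is an assumed baseline value.
    final class ThresholdProvider {
        static final float BASE_THRESHOLD_DP = 8f;

        static float firstThresholdPx(Context context) {
            DisplayMetrics metrics = context.getResources().getDisplayMetrics();
            // metrics.density is 1.0 at the 160 dpi baseline and larger on
            // denser screens, so the pixel threshold scales with resolution.
            return BASE_THRESHOLD_DP * metrics.density;
        }
    }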
In a possible implementation, the first interface is an interface in a file management application and includes at least one file or folder, where the at least one file or folder may include a first file. The terminal device receiving the first operation on the first control includes: with the first file selected, the terminal device receives the first operation on the first control.
In this way, in a scenario in which the terminal device selects the first file and pulls up the first window, the terminal device may implement the function setting for the first file through the first window.
In one possible implementation, when the first control is a control for moving a file and/or folder, path information for moving the file or folder is displayed in the first window; when the first control is a control for copying a file and/or folder, path information for copying the file and/or folder is displayed in the first window; and when the first control is a control for compressing a file or folder, one or more of the following are displayed in the first window: a text box for setting the name of the compressed package, a control for setting the compression format, a control for setting the storage location of the compressed package, and the like.
It can be appreciated that moving (or copying, or compressing) a file or folder can pull up the first window, and when the terminal device displays the first window, recognizing the second operation in the first window reduces cases in which a sliding operation is recognized as a click operation so that the first window cannot be closed.
In one possible implementation, the terminal device displaying the first interface includes: the terminal device displays a second interface in which a card of the file management application is displayed; when the terminal device receives an operation on the card of the file management application, the terminal device displays a second window in the second interface, where the second window includes a control for editing the card; when the terminal device receives an operation on the control for editing the card, the terminal device displays the first interface, where the first control is a control for selecting folders, and at least one folder in the file management application is displayed in the first window.
It can be understood that, when the card of the file management application is displayed, the terminal device can pull up the first window while setting the folder displayed in the card, and when the terminal device displays the first window, recognizing the second operation in the first window reduces cases in which a sliding operation is recognized as a click operation so that the first window cannot be closed.
In one possible implementation, the terminal device displaying the first interface includes: the terminal device displays a third interface in which a card of a note application is displayed; when the terminal device receives an operation on the card of the note application, the terminal device displays a third window in the third interface, where the third window includes a control for editing the card; when the terminal device receives an operation on the control for editing the card, the terminal device displays the first interface, where the first control is a control for selecting notes, and at least one note in the note application is displayed in the first window.
It can be understood that, when the note card is displayed, the terminal device can pull up the first window while setting the note content displayed in the note card, and when the terminal device displays the first window, recognizing the second operation in the first window reduces cases in which a sliding operation is recognized as a click operation so that the first window cannot be closed.
In one possible implementation, the second operation includes a press of the stylus on the first window and a slide of the stylus on the first window; the sliding distance is the distance between the press operation and the slide operation, and the duration of the second operation is the difference between the time of the press operation and the time of the slide operation.
In this way, the terminal device can calculate the sliding distance and the duration of the second operation based on the press operation and the slide operation in the second operation, reducing the inaccuracy of the conventional manner, which recognizes a sliding operation using only the sliding distance.
In a possible implementation, a file management application is installed on the terminal device, and before the terminal device closes the first window, the method further includes: the file management application determines that the sliding operation is handled by a first parent view in the first window, where the first parent view is the largest view in the first window.
In one possible implementation manner, the first parent view includes at least one child view, the at least one child view includes a first child view, and before the terminal device continuously displays the first window, the method further includes: the file management application determines that the sliding operation is handled by the first sub-view.
It can be understood that the first parent view in the first window may intercept the second operation. When the first parent view intercepts the second operation, the first parent view may call onTouchEvent() to process it; when the first parent view does not intercept the second operation, it passes the event corresponding to the second operation down to its child views for processing.
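This interception behavior matches the standard Android ViewGroup touch-dispatch model; the following is a sketch of a parent view that intercepts slides and lets presses fall through to its child views (the class name and threshold value are assumptions):

    import android.content.Context;
    import android.view.MotionEvent;
    import android.widget.FrameLayout;

    // Sketch of a first parent view: intercept slides for itself,
    // let clicks reach the child views.
    public class FirstParentView extends FrameLayout {
        private static final float SLIDE_THRESHOLD_PX = 12f; // assumed
        private float downY;

        public FirstParentView(Context context) {
            super(context);
        }

        @Override
        public boolean onInterceptTouchEvent(MotionEvent ev) {
            switch (ev.getActionMasked()) {
                case MotionEvent.ACTION_DOWN:
                    downY = ev.getY();
                    return false; // let child views see the press
                case MotionEvent.ACTION_MOVE:
                    // Once the movement looks like a slide, intercept it; the
                    // event then goes to this parent's onTouchEvent().
                    return Math.abs(ev.getY() - downY) > SLIDE_THRESHOLD_PX;
                default:
                    return false;
            }
        }

        @Override
        public boolean onTouchEvent(MotionEvent ev) {
            // The intercepted slide is processed here (e.g., closing the window).
            return true;
        }
    }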
In one possible implementation, the terminal device is further provided with a user interface tool (UIkit) and an input module, and in response to the first operation the method further includes: the input module sends an event corresponding to the first operation to the file management application; the file management application calls a first instruction to pull up the first window based on the event corresponding to the first operation; the file management application calls a second instruction to obtain an HwBottomSheet object from the UIkit, where the HwBottomSheet object includes layout information of the first window, context information of the first window, and style attribute information of the first window; and the file management application sets the state of the HwBottomSheet object to visible.
In this way, the file management application can pull up the first window through the HwBottomSheet object obtained from the UIkit.
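Since HwBottomSheet belongs to the UIkit and is not a publicly documented API, the pull-up sequence can only be sketched with stand-in types; every name below is an assumption introduced for illustration:

    // Hypothetical sketch of the pull-up flow described above. HwBottomSheet
    // and the UIkit entry point are not public APIs, so minimal stand-in
    // types are declared here purely for illustration.
    final class BottomSheetPullUp {
        interface HwBottomSheet {
            int STATE_VISIBLE = 1;
            void setState(int state); // setting the state to visible shows the window
        }

        interface UiKit {
            // Assumed factory: bundles the window's layout, context,
            // and style-attribute information into one object.
            HwBottomSheet obtainBottomSheet(Object layoutInfo, Object contextInfo, Object styleAttrs);
        }

        static void pullUpFirstWindow(UiKit uiKit, Object layout, Object ctx, Object style) {
            HwBottomSheet sheet = uiKit.obtainBottomSheet(layout, ctx, style);
            sheet.setState(HwBottomSheet.STATE_VISIBLE);
        }
    }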
In one possible implementation manner, after the terminal device closes the first window, the method further includes: the terminal device continues to display the first interface.
In a second aspect, an embodiment of the present application provides an operation recognition device. The operation recognition device includes a display unit and a processing unit. The display unit is configured to display a first interface, where the first interface includes a first control; the processing unit is configured to receive a first operation on the first control; the display unit is further configured to display, in response to the first operation, a first window over the first interface, where the first window is a non-full-screen window; when the terminal device receives a second operation of a stylus on the first window, if the sliding distance in the second operation is less than a first threshold and the duration of the second operation is greater than a second threshold, the display unit is further configured to continue displaying the first window; when the terminal device receives a third operation of the stylus on the first window, if the sliding distance in the third operation is greater than the first threshold and the duration of the third operation is greater than the second threshold, the processing unit is further configured to close the first window.
In a third aspect, embodiments of the present application provide a terminal device, including a processor and a memory, where the memory is configured to store code instructions; the processor is configured to execute code instructions to cause the terminal device to perform a method as described in the first aspect or any implementation of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing instructions that, when executed, cause a computer to perform a method as described in the first aspect or any implementation of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product including a computer program that, when run, causes a computer to perform the method described in the first aspect or any implementation of the first aspect.
It should be understood that, the second aspect to the fifth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects obtained by each aspect and the corresponding possible embodiments are similar, and are not repeated.
Drawings
Fig. 1 is a schematic view of a scenario provided in an embodiment of the present application;
fig. 2 is a schematic hardware structure of a terminal device according to an embodiment of the present application;
Fig. 3 is a schematic software structure of a terminal device according to an embodiment of the present application;
fig. 4 is an interface schematic diagram of pulling up a target window according to an embodiment of the present application;
fig. 5 is an interface schematic diagram of another way of pulling up a target window according to an embodiment of the present application;
fig. 6 is an interface schematic diagram of yet another way of pulling up a target window according to an embodiment of the present application;
fig. 7 is a flow chart of an operation recognition method according to an embodiment of the present application;
fig. 8 is a schematic module interaction diagram of an operation recognition method according to an embodiment of the present application;
fig. 9 is an interface schematic diagram of a view according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an operation recognition device according to an embodiment of the present application;
fig. 11 is a schematic hardware structure of another terminal device according to an embodiment of the present application.
Detailed Description
In order to clearly describe the technical solutions of the embodiments of the present application, the words "first", "second", etc. are used in the embodiments of the present application to distinguish between identical or similar items having substantially the same function and effect. For example, a first value and a second value are merely used to distinguish different values, and no order is implied. Those skilled in the art will appreciate that the words "first", "second", and the like limit neither quantity nor order of execution, and do not mean that the items they modify are necessarily different.
In this application, the terms "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of" or similar expressions mean any combination of the listed items, including any combination of single items or plural items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be single or plural.
By way of example, fig. 1 is a schematic view of a scenario provided in an embodiment of the present application. In the embodiment corresponding to fig. 1, a tablet is taken as an example of the terminal device; this example does not constitute a limitation of the embodiments of the present application.
The terminal device displays an interface as shown in a of fig. 1, which may be an interface in a file management application, and the interface may display at least one file or folder, such as new folder 1, new folder 2, document 1, document 2, and the like.
When the terminal device receives a long-press operation of the user or the stylus at the position of document 1, the terminal device displays the interface shown as b in fig. 1, which may display: a control for sharing files or folders, a control for setting labels, a control 101 for moving files or folders (also referred to as the move control 101), a control for deleting files or folders, a control for opening more functions, and the like. In this interface, document 1 is in the selected state and the other files and/or folders are not.
When the terminal device receives a click operation of the user or the stylus on the move control 101, the terminal device may pop up a window (or card) 102 from the bottom of the display screen and display the interface shown as c in fig. 1. The window 102 may be a non-full-screen window, and the window 102 may be closed in response to a downward sliding operation of the stylus. The window 102 may include one or more of the following: controls for selecting a movement path (e.g., a control displayed with the word "preview", a control displayed with the words "my tablet", etc.), a control 103 for closing the window, a control for confirming the move, and the like, which are not limited in this embodiment of the present application.
When the terminal device receives a downward sliding operation of the stylus on the window 102, the terminal device may recognize the downward sliding operation as a click operation, so that the terminal device responds to the click operation and displays the corresponding content instead of closing the window 102. Or, when the terminal device receives a click operation of the stylus on the window 102, the terminal device may recognize the click operation as a sliding operation, so that the terminal device closes the window 102 or scrolls other content in the displayed interface.
For example, the terminal device may set the sliding distance threshold to a value greater than 0 dp, such as 1 dp or 2 dp; when the terminal device receives a downward sliding operation of the stylus on the window 102 and detects that the sliding distance is greater than the sliding distance threshold, it confirms a sliding operation. In this scenario, because the stylus reports many points, when the downward sliding operation of the stylus is slow, the terminal device may recognize the slow downward slide as a series of click operations, so that the window 102 cannot be closed. Here, 1 dp equals 1 pixel (px) at the baseline screen density of 160 dpi.
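The dp-to-px relationship used here is the standard Android definition and can be computed with the framework's own utility:

    import android.util.DisplayMetrics;
    import android.util.TypedValue;

    // 1 dp equals 1 px at the 160 dpi baseline density; at higher densities
    // the same dp value maps to proportionally more pixels.
    final class Dp {
        static float toPx(float dp, DisplayMetrics metrics) {
            return TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, dp, metrics);
        }
    }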
Alternatively, the terminal device may set the sliding distance threshold to 0 dp; when the terminal device receives a click operation of the stylus on the window 102 and detects that the sliding distance is not greater than the sliding distance threshold, it determines a click operation. In this scenario, because the points reported by the stylus jitter, the terminal device may determine, based on at least two events reported during the jitter, that the sliding distance is greater than the sliding distance threshold and recognize a sliding operation, so that the control cannot be selected.
In view of this, the embodiments of the present application provide an operation recognition method. When the terminal device pops up a non-full-screen first window from the bottom of an interface, the terminal device may determine, according to the sliding distance and the duration of a stylus operation on the first window, whether to close the first window, so as to reduce cases in which a sliding operation is recognized as a click operation and the first window cannot be closed, thereby improving the accuracy of operation recognition. The first window may be the target window described in the embodiments of the present application.
It is understood that the above terminal device may also be referred to as a terminal, user equipment (UE), mobile station (MS), mobile terminal (MT), etc. The terminal device may be a mobile phone with a touch screen, a smart TV, a wearable device, a tablet (Pad), a computer with a wireless transceiving function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, and so on. The embodiments of the present application do not limit the specific technology or specific device form adopted by the terminal device.
Therefore, in order to better understand the embodiments of the present application, the structure of the terminal device of the embodiments of the present application is described below. Fig. 2 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, an indicator 192, a camera 193, a display 194, and the like.
The sensor module 180 may include, among other things, one or more of the following: pressure sensor, gyroscope sensor, barometric sensor, magnetic sensor, acceleration sensor, distance sensor, proximity sensor, fingerprint sensor, temperature sensor, touch sensor, ambient light sensor, or bone conduction sensor, etc., which are not specifically limited in the embodiments of the present application.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the present application, the terminal device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. Wherein the different processing units may be separate devices or may be integrated in one or more processors. A memory may also be provided in the processor 110 for storing instructions and data.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge a terminal device, or may be used to transfer data between the terminal device and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. The power management module 141 is used for connecting the charge management module 140 and the processor 110.
The wireless communication function of the terminal device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Antennas in the terminal device may be used to cover single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G or the like applied on a terminal device. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation.
The wireless communication module 160 may provide solutions for wireless communication applied on the terminal device, including wireless local area networks (WLAN) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), etc.
The terminal device implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal device may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The camera 193 is used to capture still images or video. In some embodiments, the terminal device may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area.
The terminal device may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and also to convert an analog audio input into a digital audio signal. The speaker 170A, also referred to as a "horn", is used to convert an audio electrical signal into a sound signal. The terminal device can listen to music or take a hands-free call through the speaker 170A. The receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the terminal device answers a call or a voice message, the voice can be heard by placing the receiver 170B close to the ear. The earphone interface 170D is used to connect a wired earphone. The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. In the embodiments of the present application, the terminal device may have a microphone 170C.
The pressure sensor is used for sensing a pressure signal and can convert the pressure signal into an electric signal. In some embodiments, the pressure sensor may be provided on the display screen 194. The gyro sensor may be used to determine the motion gesture of the terminal device. The air pressure sensor is used for measuring air pressure. The magnetic sensor includes a hall sensor. The acceleration sensor may detect the magnitude of the acceleration of the terminal device in all directions (typically three axes). And a distance sensor for measuring the distance. The proximity light sensor may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The ambient light sensor is used for sensing ambient light brightness. The fingerprint sensor is used for collecting fingerprints. The temperature sensor is used for detecting temperature. Touch sensors, also known as "touch devices". The bone conduction sensor may acquire a vibration signal.
The touch sensor may be disposed on the display 194, and the touch sensor and the display 194 form a touch screen, also called a "touchscreen". In this embodiment of the present application, a grid of capacitive sensing nodes (hereinafter referred to as the capacitive sensor) may be disposed in the touch screen; when the terminal device determines that the capacitance value in at least one grid cell received by the capacitive sensor exceeds a capacitance threshold, it may determine that a touch operation has occurred. Further, the terminal device may determine the touch area corresponding to the touch operation based on the area occupied by the at least one grid cell exceeding the capacitance threshold.
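A sketch of the grid logic described above (the grid representation and the threshold value are assumptions; real capacitance sensing happens in the touch-controller firmware):

    // Illustrative sketch: a touch occurs when any grid cell exceeds the
    // capacitance threshold, and the touch area is the count of such cells.
    final class CapacitiveGrid {
        static final float CAPACITANCE_THRESHOLD = 0.5f; // assumed value/units

        // Returns the touch area in grid cells; 0 means no touch occurred.
        static int touchArea(float[][] capacitance) {
            int cellsAboveThreshold = 0;
            for (float[] row : capacitance) {
                for (float value : row) {
                    if (value > CAPACITANCE_THRESHOLD) {
                        cellsAboveThreshold++;
                    }
                }
            }
            return cellsAboveThreshold;
        }
    }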
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The terminal device may receive key inputs, generating key signal inputs related to user settings of the terminal device and function control. The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The software system of the terminal device may adopt a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, a cloud architecture, or the like, which will not be described herein.
Fig. 3 is a schematic software structure of a terminal device according to an embodiment of the present application. As shown in fig. 3, the layered architecture divides the software into several layers, each with distinct roles and branches. The layers communicate with each other through a software interface. In some embodiments, the android system is divided into multiple layers, from top to bottom, an Application (APP) layer, an application framework (frame) layer, a hardware abstraction layer (hardware abstraction layer, HAL), a kernel layer (kernel), and the like, which are not limited in the embodiments of the present application.
The application layer may include a series of application packages. The application layer may include one or more of the following: a file management application, user interface tools (user interface kit, UIkit), Bluetooth, telephony, and other applications, which are not limited in this embodiment of the present application.
The UIkit may define a set of graphical interface elements for an application. For example, the elements that a user operates on the screen and through which the user interacts with an application may all be defined by classes in the UIkit.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes some predefined interfaces. The application framework layer may include one or more of the following: an input module, a display composition system (surface flinger), a window manager, a content provider, a resource manager, a view system, or a notification manager, etc.
The input module is used for transmitting the event information generated by the operation to the file management application and other modules.
The window manager is used for managing window programs. For example, the window manager may be used to apply for a foreground window, obtain a display screen size, determine if there is a status bar, lock a screen, touch a screen, drag a screen, intercept a screen, and the like.
The display synthesis system is used for realizing the switching of the frame rate and the synthesis of the layers.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc. The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture. The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The purpose of the hardware abstraction layer is to abstract the hardware, and can provide a unified interface for querying the hardware device for the upper layer application, or can also provide a data storage service for the upper layer application. The hardware abstraction layer may include: hardware synthesis processor (hardware composer, HWC) and the like.
The kernel layer is a layer between hardware and software. The kernel layer is used for driving the hardware to enable the hardware to work. The kernel layer may include one or more of the following: display drive, camera drive, or sensor drive, etc.
The software layers involved in the software architecture, the modules included in the layers, and the roles of the modules in the embodiments of the present application are not specifically limited.
The following describes the technical solutions of the present application and how the technical solutions of the present application solve the above technical problems in detail with specific embodiments. The following embodiments may be implemented independently or combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments.
In order to implement the operation recognition method provided by the embodiment of the present application, the embodiment of the present application illustrates a scene of a target window in the operation recognition method (see embodiments corresponding to fig. 4 to 6), where the target window may be a non-full-screen window, and the target window may be closed when the terminal device determines that a downward sliding operation is detected.
The target window may be generated by the terminal device in response to a trigger operation of the stylus on a note card on the desktop (see the embodiment corresponding to fig. 4), or in response to a trigger operation of the stylus on a file management card on the desktop (see the embodiment corresponding to fig. 5), or in response to a trigger operation of the stylus on a move control (or a copy control or a compression control) in file management (see the embodiment corresponding to fig. 6); the specific way of pulling up the target window is not limited in the embodiments of the present application.
In one implementation, fig. 4 is an interface schematic diagram of a pull-up target window according to an embodiment of the present application.
The terminal device displays an interface shown as a in fig. 4, which may be a desktop of the terminal device, where one or more of the following may be displayed: note card 401, file management card, icon of setting application, icon of file management application, or application icon in a fixed column, and the like.
When the terminal device receives a long press operation of the user or the stylus pen on the note card 401, the terminal device displays an interface as shown in b in fig. 4, in which a card 402 may be displayed, and in which the card 402 may be displayed: controls for removing cards, controls for viewing more notes cards, and controls for editing cards.
When the terminal device receives a click operation of a user or a stylus on a control for editing a card, the terminal device displays an interface shown as c in fig. 4, in which a window 403 is displayed, the window 403 is used for editing the content displayed by the note card, and a control 404 for viewing all notes can be displayed in the window 403.
When the terminal device receives a clicking operation of the user or the stylus on the control 404, the terminal device may pop up a target window 405 from the bottom of the interface, where the target window 405 may display: controls for closing the window (located in the top middle of the target window 405), at least one note that is allowed to be displayed in the note card, such as unclassified notes, travel notes, personal notes, etc., which are not limited in this embodiment of the present application.
In the case where the terminal device displays the target window 405, the terminal device may close the target window 405 in response to a downward sliding operation of the stylus in the target window 405, or may, in response to a click operation of the stylus on any control in the target window 405, select the corresponding control.
In another implementation, fig. 5 is an interface schematic diagram of another pull-up target window according to an embodiment of the present application.
The terminal device displays an interface shown in a in fig. 5, which may be a desktop of the terminal device, where the file management card 501 may be displayed in the interface, and other contents displayed in the interface may be referred to the interface shown in a in fig. 4, which is not described herein.
When the terminal device receives a long-press operation of the user or the stylus on the file management card 501, the terminal device displays the interface shown as b in fig. 5, in which a card 502 may be displayed. The card 502 may display: a control for removing the card, a control for viewing more cards, and a control for editing the card.
When the terminal device receives a click operation of the user or the stylus on the control for editing the card, the terminal device displays the interface shown as c in fig. 5, in which a window 503 is displayed; the window 503 is used for editing the content displayed in the file management card. The window 503 may display a control 504 for selecting a folder, for example a control displayed with the words "download manager", where the download manager may be one of the folders in the file management application, and a control displayed with the word "share", where share may also be one of the folders in the file management application.
When the terminal device receives a click operation of the user or the stylus on the control 504, the terminal device may pop up a target window 505 from the bottom of the interface. The target window 505 may display: a control for closing the window (located in the top middle of the target window 505), and at least one control corresponding to a folder allowed to be displayed in the file management card, such as a my folder control, a my tablet folder control, a favorites folder control, a download manager folder control, a share folder control, a recorder folder control, a browser folder control, a WLAN direct folder control, a print assistant folder control, and the like, which are not limited in this embodiment of the present application. In the interface shown as d in fig. 5, the download manager folder control is in the selected state.
In the case where the terminal device displays the target window 505, the terminal device may close the target window 505 in response to a downward sliding operation of the stylus in the target window 505, or may, in response to a click operation of the stylus on any control in the target window 505, select the corresponding control.
In yet another implementation, fig. 6 is a schematic diagram of an interface for pulling up a target window according to an embodiment of the present application.
The terminal device displays an interface shown in a in fig. 6, which may be a desktop of the terminal device, where a file management card 601 may be displayed in the interface, and other contents displayed in the interface may be referred to the interface shown in a in fig. 4, which is not described herein.
When the terminal device receives the clicking operation of the user or the stylus on the file management card 601, the terminal device displays an interface shown in b in fig. 6, where the interface may be an interface in the file management application, and contents displayed in the interface may refer to an interface shown in a in fig. 1, which is not described herein.
When the terminal device receives a long-press operation of the user or the handwriting pen on the position of the document 1, the terminal device displays an interface shown as c in fig. 6, in which a movement control 602 may be displayed, and contents displayed in the interface shown as c in fig. 6 may refer to an interface shown as b in fig. 1, which is not described herein.
When the terminal device receives a click operation of the user or the stylus on the move control 602, the terminal device may pop up the target window 604 from the bottom of the display screen and display the interface shown as d in fig. 6; for the content displayed in the interface shown as d in fig. 6, reference may be made to the interface shown as c in fig. 1, which is not described herein again.
In a possible implementation, when document 1 in the interface shown as c in fig. 6 is in the selected state and the terminal device receives a click operation of the user or the stylus on the control 603 for opening more functions, the terminal device displays the interface shown as e in fig. 6. The interface shown as e in fig. 6 may include at least one functional control for processing a file or folder, for example: a control 605 for copying files or folders, a control 606 for compressing files or folders, a favorites control, a rename control, and the like.
When the terminal device receives a click operation of the control 605 by the user, the terminal device may pop up the target window 607 from the bottom of the display screen, and display an interface as shown by f in fig. 6, where the target window 607 is used to determine a path for copying a file or folder.
Or when the terminal device receives the click operation of the user on the control 606, the terminal device may pop up a target window for compressing the file or the folder from the bottom of the display screen, where one or more of the following may be displayed in the target window for compressing the file or the folder: a text box for setting the name of the compressed package, a control for setting the compressed format, a control for setting the storage location of the compressed package, or the like, which are not shown in fig. 6.
In the case where the terminal device displays the target window 604 (or the target window 607, or the target window for compressing a file or folder), the terminal device may close the target window 604 in response to a downward sliding operation of the stylus in the target window 604, or may, in response to a click operation of the stylus on any control in the target window 604, select the corresponding control.
It can be understood that the method for pulling up the target window in the operation recognition method provided in the embodiment of the present application is not limited to the embodiments corresponding to fig. 4 to 6, and the embodiment of the present application is not limited thereto specifically.
In a possible implementation, the operation identification method provided in the embodiments of the present application can also be applied to a scenario in which the terminal device switches pen types in the note application, so that the terminal device can accurately recognize the trigger operation of the stylus in the pen-type switching scenario.
In the embodiments corresponding to fig. 4 to fig. 6, on the basis that the terminal device displays the target window, the terminal device may determine whether to close the target window based on the embodiment corresponding to fig. 7 when receiving the triggering operation of the stylus for the target window.
Fig. 7 is a schematic flow chart of an operation recognition method according to an embodiment of the present application. As shown in fig. 7, the operation recognition method may include the steps of:
s701, the terminal equipment receives a first trigger operation aiming at a target window.
The first trigger operation may be any operation of the stylus with respect to the target window, such as a press (down) operation of the stylus with respect to the target window.
S702, the terminal equipment judges whether the first trigger operation is a pressing operation or not.
The terminal device may determine whether the first trigger operation is a press operation using the operation information of the first trigger operation. For example, the operation information of the first trigger operation may include: the trigger time of the first trigger operation, the coordinates of the first trigger operation, and the operation type of the first trigger operation; the operation type may include: a press (down) type, a slide (move) type, and a lift (up) type.
When the terminal device determines that the first trigger operation is a down type based on the operation information of the first trigger operation, the terminal device may determine that the first trigger operation is a press operation. When the first trigger operation is a press operation, the terminal device performs the step shown in S703, or when the first trigger operation is not a press operation, the terminal device does not perform the operation recognition method provided in the embodiment of the present application.
S703, the terminal device initializes the operation start position and the operation start time.
The operation start position described in S703 may be the ordinate in the coordinates of the first trigger operation; for example, the terminal device may record the operation start position lastYPos from the y coordinate of the event (e.g., via event.getY()). The operation start time described in S703 may be the trigger time of the first trigger operation; for example, the terminal device may record the operation start time lastTime from the event timestamp (e.g., via event.getEventTime()).
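In Android terms, S702-S703 amount to recording the MotionEvent's y coordinate and timestamp on a down event (a sketch; the class name is assumed, and the variable names follow the description above):

    import android.view.MotionEvent;

    // Sketch of S702/S703: on a press (down) event, initialize the
    // operation start position and the operation start time.
    final class SlideRecognizerInit {
        float lastYPos;
        long lastTime;

        void onFirstTrigger(MotionEvent event) {
            if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
                lastYPos = event.getY();         // operation start position (ordinate)
                lastTime = event.getEventTime(); // operation start time (ms since boot)
            }
        }
    }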
S704, the terminal equipment receives a second triggering operation aiming at the target window.
The second trigger operation may be any operation of the stylus with respect to the target window, such as a sliding (move) operation of the stylus with respect to the target window.
S705, the terminal equipment judges whether the second triggering operation is a sliding operation or not.
The operation information of the second triggering operation may include: the trigger time of the second trigger operation, the coordinates of the second trigger operation, and the operation type of the second trigger operation.
When the terminal device determines that the second trigger operation is a move type based on the operation information of the second trigger operation, the terminal device may determine that the second trigger operation is a sliding operation. When the second trigger operation is a slide operation, the terminal device performs the step shown in S706, or when the second trigger operation is not a slide operation, the terminal device performs the step shown in S713.
S706, the terminal equipment judges whether the first sliding distance is larger than a third threshold value.
The third threshold may be determined based on the screen size of the terminal device, and the third threshold is positively correlated with the screen size, i.e. the larger the screen size, the larger the value of the third threshold.
In a possible implementation, the third threshold value may also be determined based on the screen resolution of the terminal device, and the third threshold value is positively correlated with the screen resolution.
The first sliding distance may be determined based on lastYPos and the ordinate in the coordinates of the second trigger operation; that is, the first sliding distance may be the value obtained by subtracting lastYPos from the ordinate in the coordinates of the second trigger operation. The terminal device may take the upper-left corner of the display screen as the coordinate origin (0, 0); the ordinate gradually increases during a downward slide and gradually decreases during an upward slide.
When the first sliding distance is greater than the third threshold, the terminal device performs the step shown in S710; alternatively, when the first sliding distance is less than or equal to the third threshold value, the terminal device performs the step shown in S707.
S707, the terminal equipment judges whether the first duration is greater than a second threshold.
The first duration may be determined based on lastTime and the trigger time of the second trigger operation, i.e., the first duration may be a value obtained by subtracting lastTime from the trigger time of the second trigger operation and taking an absolute value. When the first duration is greater than the second threshold, the terminal device performs the step shown in S708; alternatively, when the first duration is less than or equal to the second threshold, the terminal device performs the step shown in S709.
The second threshold may be determined for the terminal device based on the operation duration measured when a sliding operation is simulated on a mobile phone.
It is understood that when the sliding distance of the operation is short and the operation duration is short, the operation is more likely to be a click operation.
S708, the terminal equipment judges whether the second sliding distance is larger than a first threshold value.
The first threshold may be determined based on the screen resolution of the terminal device, i.e. the higher the screen resolution, the larger the value of the first threshold, and the first threshold may be smaller than the third threshold.
In a possible implementation manner, in combination with the steps shown in S706 and S708, in the case that the first threshold is smaller than the third threshold, the value of the third threshold may be positively correlated with the screen size (or the screen resolution), and the value of the first threshold may also be positively correlated with the screen size (or the screen resolution), which is not limited in the embodiment of the present application.
It can be understood that the higher the screen resolution is, the more obvious the jitter in the touch points reported by the stylus is for the same operation. Therefore, in order to enable devices with different screen resolutions to achieve better recognition effects, the terminal device may match different first thresholds to devices with different screen resolutions.
The second sliding distance may be determined based on lastYPos and the ordinate in the coordinates of the second trigger operation, i.e. the second sliding distance may be a value obtained by subtracting lastYPos from the ordinate in the coordinates of the second trigger operation. When the second sliding distance is greater than the first threshold, the terminal device performs the step shown in S710; alternatively, when the second sliding distance is less than or equal to the first threshold value, the terminal device performs the step shown in S709.
It is understood that when the sliding distance of the operation is long and the operation duration is long, the operation is more likely to be a sliding operation.
S709, the terminal device determines that the operation is a click event.
After the terminal device determines a click event, the terminal device may respond to the control corresponding to the click event after detecting the lift operation; at this time, the terminal device may continue to display the target window.
S710, the terminal device determines that the operation is a sliding event.
After the terminal device determines a sliding event, the terminal device may close the target window.
S711, the terminal device updates the operation start position and the operation start time.
The updated operation start position described in S711 may be the ordinate in the coordinates of the second trigger operation, and the terminal device may use the updated operation start position as lastYPos in the next slide-operation judgment; the updated operation start time described in S711 may be the trigger time of the second trigger operation, and the terminal device may use the updated operation start time as lastTime in the next slide-operation judgment.
Further, when the terminal device detects a new move operation, the terminal device may determine whether the new move operation is a sliding operation based on the updated operation start position and the updated operation start time described in S711 and the operation information in the new move operation, and the process is similar to the steps shown in S705-S711, which are not repeated herein.
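For readability, the judgment chain of S705-S711 can be summarized as a further method of the illustrative SlideRecognizer class sketched after S703. The structure mirrors the steps above; the method itself is only a sketch, not the literal code of the embodiment.

    // Continuation of the illustrative SlideRecognizer sketch (see S703):
    // one move event is judged against the first/second/third thresholds (S705-S709),
    // and the start position/time are updated when a slide is determined (S710-S711).
    public boolean onMove(MotionEvent event,
                          float firstThreshold, long secondThreshold, float thirdThreshold) {
        float slideDistance = event.getY() - lastYPos;              // S706/S708: ordinate delta
        long duration = Math.abs(event.getEventTime() - lastTime);  // S707: first duration

        boolean isSlide =
                slideDistance > thirdThreshold                      // S706 -> S710
                || (duration > secondThreshold
                    && slideDistance > firstThreshold);             // S707/S708 -> S710

        if (isSlide) {
            lastYPos = event.getY();          // S711: new operation start position
            lastTime = event.getEventTime();  // S711: new operation start time
        }
        return isSlide; // true: sliding event (S710); false: click event (S709)
    }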
S712, the terminal equipment receives a third triggering operation aiming at the target window.
The third trigger operation may be any operation of the stylus with respect to the target window, such as a lift (up) operation of the stylus with respect to the target window.
S713, the terminal equipment judges whether the third trigger operation is a lifting operation.
When the terminal device determines, based on the operation information of the third trigger operation, that the third trigger operation is of the up type, the terminal device may determine that the third trigger operation is a lift operation. When the third trigger operation is a lift operation, the terminal device does not continue to execute the operation identification method provided in the embodiment of the present application; or, when the third trigger operation is not a lift operation, the terminal device executes the step shown in S705.
It can be understood that, in combination with the embodiments corresponding to fig. 4 to fig. 7, in the case that the terminal device pops up a first window that is not full screen from the bottom of the interface, the terminal device may determine, according to the sliding distance and the duration of the stylus operation on the first window, whether to close the first window. This reduces cases in which a sliding operation is identified as a click operation so that the first window cannot be closed, and improves the accuracy of operation identification.
On the basis of the embodiment corresponding to fig. 7, the terminal device may include: a file management application, a UIkit, and an input module. Based on the above modules in the terminal device, fig. 8 is a schematic module interaction diagram of an operation identification method provided in an embodiment of the present application.
As shown in fig. 8, the operation recognition method may include the steps of:
S801, in response to the operation of opening the target window by the user, the input module sends a fourth trigger event to the file management application.
For the operation of opening the target window by the user and for the target window itself, reference may be made to the descriptions in the embodiments corresponding to fig. 4 to fig. 6, which are not repeated herein.
The fourth trigger event may include: a down event generated in the operation of opening the target window, and an up event generated in the operation of opening the target window.
S802, the file management application calls startActivity() to pull up the target window.
startActivity() is a function for starting an activity. In the step shown in S802, the file management application may pull up the activity of the target window, which may be a loadpath activity, by calling startActivity().
S803, the file management application calls onCreate() to initialize the target window.
onCreate() is a lifecycle method of an activity; its main function is to instantiate the objects required by the activity and perform necessary initialization operations, such as loading a layout file, starting a corresponding service, and the like.
S804, the UIkit returns the HwBottomSheet object to the file management application.
The HwBottomSheet object may include display information of the target window; for example, the HwBottomSheet object may include: context information of the window, layout information of the window, style attributes of the window, and the like. The context information may include content related to accessing system resources and starting an activity; the layout information may describe how the contents of the window are laid out; and the style attributes are used to indicate whether the window is displayed as a rectangle, a rounded rectangle, or another shape.
S805, the file management application sets the HwBottomSheet object to visible.
After the file management application sets the state of the HwBottomSheet object to visible, the user can see the target window on the terminal device.
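For ease of understanding, a highly simplified sketch of S802-S805 follows. Only startActivity() and onCreate() are standard Android APIs; the activity name, the helper standing in for the UIkit call, and the use of a plain view in place of the HwBottomSheet object are all assumptions of this description.

    // Illustrative sketch of S802-S805 (class and helper names other than the
    // standard Android APIs are assumptions; the HwBottomSheet object is treated
    // as an opaque view supplied by the UIkit).
    import android.app.Activity;
    import android.content.Intent;
    import android.os.Bundle;
    import android.view.View;

    public class TargetWindowActivity extends Activity {
        private View bottomSheet; // stands in for the HwBottomSheet object (assumption)

        // S802: the file management application pulls up the target window.
        public static void pullUp(Activity caller) {
            caller.startActivity(new Intent(caller, TargetWindowActivity.class));
        }

        // S803-S805: initialize the window, obtain the bottom-sheet object, make it visible.
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            bottomSheet = createBottomSheetFromUikit(); // S804: returned by the UIkit (hypothetical helper)
            bottomSheet.setVisibility(View.VISIBLE);    // S805: the user can now see the target window
        }

        private View createBottomSheetFromUikit() {
            // Placeholder for the UIkit call that returns the HwBottomSheet object,
            // carrying the window's context, layout, and style attributes.
            return new View(this);
        }
    }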
S806, in response to a first trigger operation of the user for the target window, the input module sends a first trigger event to the file management application.
The first trigger event may be a down event for the target window.
S807, the file management application initializes the operation start position and the operation start time.
S808, in response to a second trigger operation of the user for the target window, the input module sends a second trigger event to the file management application.
The second trigger event may be a move event for the target window.
S809, the file management application calls onInterceptTouchEvent() to judge whether to intercept the second trigger event.
onInterceptTouchEvent() is a method for intercepting touch events. For example, when a touch event occurs, it is first passed to dispatchTouchEvent() of the first parent view (view), and the first parent view then decides whether to intercept the event.
If the onInterceptTouchEvent() of the first parent view returns true, it indicates that the first parent view is to intercept the event and hand the event to its own onTouchEvent() for processing. If onInterceptTouchEvent() returns false, it indicates that the first parent view does not intercept the event, and the event will continue to be passed to the child views.
If onInterceptTouchEvent() returns true, the first parent view receives all subsequent touch events, including move events and up events. If false is returned, the first parent view only receives the down event, and the subsequent events are directly transferred to the child views.
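Combining S807-S811 with the interception mechanism just described, a minimal sketch of the first parent view is given below. It reuses the illustrative SlideRecognizer from S703-S711, and the threshold values passed to onMove() are assumptions for the example.

    // Illustrative first parent view: intercepts move events once the slide is
    // recognized (S809), so that S810 can be performed; otherwise events continue
    // to be distributed to the child views (S811).
    import android.content.Context;
    import android.view.MotionEvent;
    import android.widget.FrameLayout;

    public class BottomSheetRootView extends FrameLayout {
        // Illustrative helper from the S703/S705-S711 sketches above.
        private final SlideRecognizer recognizer = new SlideRecognizer();

        public BottomSheetRootView(Context context) {
            super(context);
        }

        @Override
        public boolean onInterceptTouchEvent(MotionEvent ev) {
            switch (ev.getActionMasked()) {
                case MotionEvent.ACTION_DOWN:
                    recognizer.onDown(ev); // S807: initialize start position and time
                    return false;          // let the down event reach the child views
                case MotionEvent.ACTION_MOVE:
                    // true: this view's onTouchEvent() handles the event (S810);
                    // false: the event is distributed to the child views (S811).
                    // The three threshold values are assumptions for the example.
                    return recognizer.onMove(ev, 24f, 100L, 144f);
                default:
                    return false;
            }
        }
    }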
For example, the first parent view and the child views described in S809 are illustrated in connection with the embodiment corresponding to fig. 9; fig. 9 is a schematic interface diagram of views provided in an embodiment of the present application.
The interface shown in a in fig. 9 may include a target window. Any element in the target window may correspond to one view, and the views in the window may constitute the view tree shown in the interface of b in fig. 9. The view tree can be seen as a tree structure corresponding to the content displayed in the interface. One view in the view tree represents a block of space that can be drawn by a user interface component; each view occupies a rectangular area of the display screen, and within this rectangular area the view object is responsible for graphics rendering and event processing.
As shown in interface a in fig. 9, the target window may correspond to the largest view, that is, view901; view901 may be the first parent view described in S809, and the rectangular area where view901 is located may include all display contents in the target window.
The sub-views of view901 may be view902 and view903. view903 may be the view corresponding to a control for closing the window, and the rectangular area corresponding to view902 may be the rectangular area other than view903 within the rectangular area where view901 is located.
The sub-views of view902 may be view904, view905, view906, and view907. view905 may be the view corresponding to a control for closing the current window, view906 may be the view corresponding to a control for creating a folder, view907 may be the view corresponding to a control for completing the move (or a control displayed as "/"), and the rectangular area corresponding to view904 may be the rectangular area other than view905, view906, and view907 within the rectangular area corresponding to view902.
Other sub-views may also be included in the view904, which is not described herein, and the tree structure formed by the views may be shown in b in fig. 9.
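To make the tree structure concrete, the sketch below builds the hierarchy of fig. 9 with plain Android views; every node is a generic view standing in for the control named above, so the types and names are illustrative only.

    // Illustrative construction of the fig. 9 view tree; each node is a plain
    // view standing in for the corresponding control of the target window.
    import android.content.Context;
    import android.view.View;
    import android.widget.FrameLayout;

    final class Fig9Tree {
        static FrameLayout build(Context context) {
            FrameLayout view901 = new FrameLayout(context); // first parent view: whole target window
            FrameLayout view902 = new FrameLayout(context); // area other than the window-close control
            View view903 = new View(context);               // control for closing the window
            view901.addView(view902);
            view901.addView(view903);

            View view904 = new View(context);               // remaining content area
            View view905 = new View(context);               // control for closing the current window
            View view906 = new View(context);               // control for creating a folder
            View view907 = new View(context);               // control for completing the move
            view902.addView(view904);
            view902.addView(view905);
            view902.addView(view906);
            view902.addView(view907);
            return view901;
        }
    }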
For example, for the specific steps when the file management application calls onInterceptTouchEvent() to determine whether to intercept the second trigger event, reference may be made to the steps shown in S705-S711, which are not repeated herein. For example, the file management application calling onInterceptTouchEvent() and determining to intercept the second trigger event may be understood as: the file management application determines, based on the second trigger event and the first trigger event, that the current operation is a sliding operation, so that onInterceptTouchEvent() of the first parent view returns true, indicating that the first parent view is to intercept the event, and the step shown in S810 is performed.
When the file management application determines to intercept the second trigger event, it may perform the step shown in S810; when it determines not to intercept the second trigger event, it may perform the step shown in S811.
S810, the file management application determines that the first parent view calls onTouchEvent() to process the second trigger event.
The file management application may close the target window through smoothSlide(), and smoothSlide() may include one or more of the following information, for example: a sliding position, a sliding speed, and an animation type. The sliding position may be the coordinates of the second trigger operation; the sliding speed may be calculated based on the coordinates of the first trigger operation, the trigger time of the first trigger operation, the coordinates of the second trigger operation, and the trigger time of the second trigger operation; and the animation type may be understood as the specific animation called when the target window is closed.
The smoothSlide() may be set in the onTouchEvent() described in S810, or may be called separately, which is not limited in the embodiments of the present application.
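As an illustration of S810, the sketch below continues the BottomSheetRootView example: once interception occurs, onTouchEvent() consumes the move event and closes the window with a sliding animation. smoothSlide() here is modeled with standard view animation APIs; the embodiment's actual smoothSlide() is not specified, so the duration and translation values are assumptions.

    // Continuation of the BottomSheetRootView sketch: S810, processing the
    // intercepted move event and closing the target window with an animation.
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_MOVE) {
            smoothSlide(); // stand-in for the embodiment's smoothSlide()
            return true;   // the first parent view consumes this and subsequent events
        }
        return super.onTouchEvent(event);
    }

    // Slides the window content off the bottom edge, then hides it; the duration
    // stands in for the "sliding speed" and the translation for the "animation type".
    private void smoothSlide() {
        animate().translationY(getHeight())
                 .setDuration(200L) // assumed value
                 .withEndAction(() -> setVisibility(GONE)) // stands in for closing the target window
                 .start();
    }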
S811, the file management application determines to distribute the second trigger event to the child view of the first parent view for processing.
Correspondingly, the child views of the first parent view may further determine whether to intercept the second trigger event; for example, when the coordinates of the second trigger event fall within the area of a child view of the first parent view, that child view determines to intercept the second trigger event and calls onTouchEvent() to process it.
When the child view of the first parent view is a list view (listview) in the target window, the terminal device may slide the list in the target window after the child view calls onTouchEvent() to process the second trigger event. Alternatively, when the child view of the first parent view is the view corresponding to a control in the target window, the child view calls onTouchEvent() to process the second trigger event; at this moment the child view temporarily does not respond to the trigger for the control, and responds to the control after receiving the up operation.
S812, in a possible implementation, in response to a third trigger operation by the user for the target window (e.g., an up operation for the target window), the input module may send a third trigger event to the file management application.
S813, the file management application determines that the child view of the first parent view processes the third trigger event.
When the file management application determines, in the steps shown in S809-S811, that a child view of the first parent view intercepts the second trigger event, that child view receives the other trigger events, such as the third trigger event. In this scenario, after responding to the third trigger event, the terminal device may perform the step corresponding to triggering the control.
It is to be understood that the interface provided by the embodiments of the present application is provided as an example only and is not intended to limit the embodiments of the present application.
In connection with the embodiment corresponding to fig. 8, the embodiment of the present application further provides an operation identification method, where the operation identification method may include the following steps:
S1, the terminal equipment displays a first interface, wherein the first interface comprises a first control.
When the first interface is the interface shown as c in fig. 4, the first control is control 404; when the first interface is the interface shown in c in fig. 5, the first control is control 504; when the first interface is the interface shown in c in fig. 6, the first control is control 602; when the first interface is the interface shown as e in fig. 6, the first control is control 605 or control 606.
It may be understood that the first interface may be an interface where the target window is not displayed, and the first interface may be an interface displayed in an application or a desktop interface, which is not limited in this embodiment of the present application.
S2, the terminal equipment receives a first operation aiming at the first control.
S3, in response to the first operation, the terminal equipment displays a first window on the upper layer of the first interface, wherein the first window is a non-full-screen window.
The first window may be window 405 in the interface shown as d in fig. 4, or window 505 in the interface shown as d in fig. 5, or window 604 in the interface shown as d in fig. 6, or window 607 in the interface shown as f in fig. 6.
S4, when the terminal equipment receives a second operation of the stylus for the first window, if the sliding distance in the second operation is smaller than a first threshold and the duration of the second operation is greater than a second threshold, the terminal equipment continues to display the first window.
The second operation may be a click operation of the stylus with respect to the first window, or an upward, leftward, or rightward sliding operation with respect to the first window; in these cases, the terminal device may continue to display the first window.
S5, when the terminal equipment receives a third operation of the stylus for the first window, if the sliding distance in the third operation is greater than the first threshold and the duration of the third operation is greater than the second threshold, the terminal equipment closes the first window and displays the first interface.
The third operation may be a downward sliding operation of the stylus with respect to the first window. In this case, the terminal device may close the first window upon recognizing the downward sliding operation on the first window, and after closing the first window, the terminal device may continue to display the first interface.
In one possible implementation, the first threshold value is positively correlated with the screen size of the terminal device, or the first threshold value is positively correlated with the screen resolution of the terminal device.
In one possible implementation manner, the first interface is an interface in the file management application, at least one file or folder is included in the first interface, the at least one file or folder may include a first file, and the terminal device receives a first operation for the first control, including: under the condition that the first file is selected, the terminal equipment receives the first operation aiming at the first control.
The first interface may be an interface shown as c in fig. 6 or an interface shown as e in fig. 6.
In one possible implementation, when the first control is a control for moving a file or/and a folder, path information of the moving file or folder is displayed in the first window; or when the first control is a control for copying the file or/and the folder, displaying path information of the copied file or folder in the first window; when the first control is a control for compressing a file or folder, one or more of the following are displayed in the first window: a text box for setting the name of the compressed package, a control for setting the compressed format, a control for setting the storage location of the compressed package, or the like.
In one possible implementation manner, the terminal device displays a first interface, including: the terminal equipment displays a second interface, and the second interface displays a card of the file management application; when the terminal equipment receives the operation of the card aiming at the file management application, the terminal equipment displays a second window in a second interface, wherein the second window comprises a control for editing the card; when the terminal equipment receives an operation for a control for editing the card, the terminal equipment displays a first interface, the first control is a control for selecting folders, and at least one folder in the file management application is displayed in a first window.
The second interface may be the interface shown as a in fig. 5, the second window may be the window in the interface shown as b in fig. 5, and the first control may be control 504 in this scenario.
In one possible implementation manner, the terminal device displays a first interface, including: the terminal equipment displays a third interface, and cards of note applications are displayed in the third interface; when the terminal equipment receives the operation of the card aiming at the note application, the terminal equipment displays a third window in a third interface, wherein the third window comprises a control for editing the card; when the terminal equipment receives an operation for a control for editing the card, the terminal equipment displays a first interface, the first control is a control for selecting notes, and at least one note in a note application is displayed in a first window.
The third interface may be the interface shown as a in fig. 4, the third window may be the window in the interface shown as b in fig. 4, and the first control may be control 404 in this scenario.
In one possible implementation, when the second operation includes a pressing operation of the stylus on the first window and a sliding operation of the stylus on the first window, the sliding distance is the distance between the position of the pressing operation and the position of the sliding operation, and the duration of the second operation is the difference between the time of the pressing operation and the time of the sliding operation.
The pressing operation may be a first triggering operation described in the embodiments of the present application, and the sliding operation may be a second triggering operation described in the embodiments of the present application.
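As a concrete illustration of these definitions: if the pressing operation lands at ordinate 1200 at time 0 ms and the sliding operation is reported at ordinate 1500 at time 150 ms, the sliding distance is 300 pixels and the duration of the second operation is 150 ms; the window is then closed only if 300 pixels exceeds the first threshold and 150 ms exceeds the second threshold (the numbers are chosen for illustration only).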
In one possible implementation manner, the file management application is set in the terminal device, and before the terminal device closes the first window, the method further includes: the file management application determines that the sliding operation is handled by a first parent view in the first window, the first parent view being the largest view in the first window.
In one possible implementation, the first parent view includes at least one child view, the at least one child view includes the first child view, and before the terminal device continuously displays the first window, the method further includes: the file management application determines that the sliding operation is handled by the first sub-view.
In one possible implementation manner, the terminal device is further provided with a UIkit and an input module. After responding to the first operation, the method further includes: the input module sends an event corresponding to the first operation to the file management application; the file management application calls a first instruction to pull up the first window based on the event corresponding to the first operation; the file management application calls a second instruction to acquire an HwBottomSheet object from the UIkit, wherein the HwBottomSheet object includes layout information of the first window, context information of the first window, and style attribute information of the first window; and the file management application sets the state of the HwBottomSheet object to visible.
The method provided by the embodiment of the present application is described above with reference to fig. 4 to 9, and the device for performing the method provided by the embodiment of the present application is described below. As shown in fig. 10, fig. 10 is a schematic structural diagram of an operation recognition device provided in an embodiment of the present application, where the operation recognition device may be a terminal device in the embodiment of the present application, or may be a chip or a chip system in the terminal device.
As shown in fig. 10, the operation recognition apparatus 1000 may be used in a communication device, a circuit, a hardware component, or a chip, and the operation recognition apparatus 1000 includes: a display unit 1001 and a processing unit 1002. The display unit 1001 is configured to support the displaying steps performed in the operation recognition method, and the processing unit 1002 is configured to support the operation recognition apparatus 1000 in performing the information processing steps.
In a possible implementation manner, the operation recognition device 1000 may further include a communication unit 1003, where the communication unit 1003 is configured to support the operation recognition device 1000 to perform steps such as receiving or transmitting a message.
The operation recognition apparatuses described in the embodiments of the present application may each include the units described in the embodiment corresponding to fig. 10.
In particular, the processing unit 1002 may be integrated with the display unit 1001, and communication may occur between the processing unit 1002 and the display unit 1001.
In one possible implementation, the operation recognition apparatus 1000 may further include: a storage unit 1004. The storage unit 1004 may include one or more memories, and the memories may be components in one or more devices or circuits that are used to store programs or data.
The storage unit 1004 may exist separately and be connected to the processing unit 1002 through a communication bus. The storage unit 1004 may also be integrated with the processing unit 1002.
Taking the example that the operation recognition apparatus 1000 may be a chip or a chip system of the terminal device in the embodiment of the present application, the storage unit 1004 may store computer-executed instructions of a method of the terminal device, so that the processing unit 1002 performs the method of the terminal device in the embodiment described above. The storage unit 1004 may be a register, a cache or random access memory (random access memory, RAM), etc., and the storage unit 1004 may be integrated with the processing unit 1002. The storage unit 1004 may be a read-only memory (ROM) or other type of static storage device that may store static information and instructions, and the storage unit 1004 may be independent of the processing unit 1002.
In one possible implementation, the operation recognition apparatus 1000 may further include: a communication unit 1003. Wherein the communication unit 1003 is used to support interaction of the operation recognition apparatus 1000 with other devices. For example, when the operation recognition apparatus 1000 is a terminal device, the communication unit 1003 may be a communication interface or an interface circuit. When the operation recognition apparatus 1000 is a chip or a chip system in a terminal device, the communication unit 1003 may be a communication interface. For example, the communication interface may be an input/output interface, pins or circuitry, etc.
The apparatus of this embodiment may be correspondingly configured to perform the steps performed in the foregoing method embodiments, and the implementation principle and technical effects are similar, which are not described herein again.
Fig. 11 is a schematic hardware structure of another terminal device according to an embodiment of the present application.
The terminal device comprises a processor 1101, a communication line 1104 and at least one communication interface (illustrated in fig. 11 by way of example as communication interface 1103).
The processor 1101 may be a general-purpose central processing unit (central processing unit, CPU), a microprocessor, an application-specific integrated circuit (application-specific integrated circuit, ASIC), or one or more integrated circuits for controlling the execution of the programs of the present application.
Communication line 1104 may include circuitry for communicating information between the components described above.
Communication interface 1103 uses any transceiver-like device for communicating with other devices or communication networks, such as ethernet, wireless local area network (wireless local area networks, WLAN), etc.
Possibly, the terminal device may also comprise a memory 1102.
The memory 1102 may be, but is not limited to, a read-only memory (read-only memory, ROM) or other type of static storage device that can store static information and instructions, a random access memory (random access memory, RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (electrically erasable programmable read-only memory, EEPROM), a compact disc read-only memory (compact disc read-only memory, CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, and the like), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be separate and coupled to the processor via the communication line 1104. The memory may also be integrated with the processor.
The memory 1102 is used for storing computer-executable instructions for executing the embodiments of the present application, and the processor 1101 controls the execution. The processor 1101 is configured to execute computer-executable instructions stored in the memory 1102, thereby implementing the methods provided by the embodiments of the present application.
Possibly, the computer-executed instructions in the embodiments of the present application may also be referred to as application program code, which is not specifically limited in the embodiments of the present application.
In a particular implementation, as an embodiment, the processor 1101 may include one or more CPUs, such as CPU0 and CPU1 in fig. 11.
In a specific implementation, as an embodiment, the terminal device may include multiple processors, such as processor 1101 and processor 1105 in fig. 11. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, or digital subscriber line (digital subscriber line, DSL)) or a wireless manner (e.g., infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible by a computer, for example a magnetic medium, an optical medium, or a semiconductor medium (e.g., a solid state disk (solid state disk, SSD)), or the like.
Embodiments of the present application also provide a computer-readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
As one possible design, the computer-readable medium may include a compact disc read-only memory (compact disc read-only memory, CD-ROM), a RAM, a ROM, an EEPROM, or other optical disc storage; the computer-readable medium may also include magnetic disk storage or other magnetic storage devices. Moreover, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (digital versatile disc, DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media.
The foregoing is merely an illustrative embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or substitution that a person skilled in the art could readily conceive of within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards, and provide corresponding operation entries for the user to select authorization or rejection.

Claims (12)

1. A method of operation identification, the method comprising:
the terminal equipment displays a first interface, wherein the first interface comprises a first control;
the terminal equipment receives a first operation aiming at the first control;
responding to the first operation, the terminal equipment displays a first window on the upper layer of the first interface, wherein the first window is a non-full-screen window;
when the terminal equipment receives a second operation of the stylus aiming at the first window, if the sliding distance in the second operation is smaller than a first threshold value and the duration of the second operation is larger than a second threshold value, the terminal equipment continues to display the first window;
when the terminal equipment receives a third operation of the stylus aiming at the first window, if the sliding distance in the third operation is larger than the first threshold value and the duration of the third operation is larger than the second threshold value, the terminal equipment closes the first window.
2. The method of claim 1, wherein the first threshold is positively correlated with a screen size of the terminal device or with a screen resolution of the terminal device.
3. The method according to claim 1 or 2, wherein the first interface is an interface in a file management application, at least one file or folder is included in the first interface, the at least one file or folder may include a first file,
the terminal device receives a first operation for the first control, including: under the condition that the first file is selected, the terminal equipment receives the first operation aiming at the first control.
4. A method according to claim 3, wherein when the first control is a control for moving a file or/and a folder, path information for moving the file or folder is displayed in the first window; or when the first control is a control for copying the file or/and the folder, displaying path information for copying the file or/and the folder in the first window; when the first control is a control for compressing a file or a folder, one or more of the following are displayed in the first window: a text box for setting the name of the compressed package, a control for setting the compressed format, or a control for setting the storage location of the compressed package.
5. The method according to claim 1 or 2, wherein the terminal device displays the first interface, comprising:
the terminal equipment displays a second interface, and a card of the file management application is displayed in the second interface;
when the terminal equipment receives the operation of the card aiming at the file management application, the terminal equipment displays a second window in the second interface, wherein the second window comprises a control for editing the card;
when the terminal equipment receives the operation of the control for editing the card, the terminal equipment displays the first interface, the first control is a control for selecting folders, and at least one folder in the file management application is displayed in the first window.
6. The method according to claim 1 or 2, wherein the terminal device displays the first interface, comprising:
the terminal equipment displays a third interface, and cards of note applications are displayed in the third interface;
when the terminal equipment receives the operation of the card aiming at the note application, the terminal equipment displays a third window in the third interface, wherein the third window comprises a control for editing the card;
when the terminal equipment receives the operation of the control for editing the card, the terminal equipment displays the first interface, the first control is a control for selecting notes, and at least one note in the note application is displayed in the first window.
8. The method of any one of claims 1-6, wherein when the second operation comprises a pressing operation of the stylus on the first window and a sliding operation of the stylus on the first window, the sliding distance is the distance between the pressing operation and the sliding operation, and the duration of the second operation is the difference between the time of the pressing operation and the time of the sliding operation.
8. The method of claim 7, wherein a file management application is provided in the terminal device, the method further comprising, prior to closing the first window by the terminal device:
the file management application determines that the sliding operation is handled by a first parent view in the first window, the first parent view being a largest view in the first window.
9. The method of claim 8, wherein the first parent view comprises at least one child view, the at least one child view comprising a first child view, the method further comprising, before the terminal device continues to display the first window:
the file management application determines that the sliding operation is handled by the first sub-view.
10. The method according to claim 8 or 9, wherein the terminal device is further provided with: a user interface tool UIkit and an input module; after responding to the first operation, the method further comprises:
the input module sends an event corresponding to the first operation to the file management application;
the file management application calls a first instruction to pull up the first window based on an event corresponding to the first operation;
the file management application calls a second instruction to acquire an HwBottomSheet object from the UIkit, wherein the HwBottomSheet object comprises layout information of the first window, context information of the first window, and style attribute information of the first window;
the file management application sets the state of the HwBottomSheet object to visible.
11. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, causes the terminal device to perform the method according to any of claims 1-10.
12. A computer readable storage medium storing a computer program, which when executed by a processor causes a computer to perform the method of any one of claims 1-10.
CN202311199825.6A 2023-09-15 2023-09-15 Operation identification method and device Pending CN117472220A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311199825.6A CN117472220A (en) 2023-09-15 2023-09-15 Operation identification method and device

Publications (1)

Publication Number Publication Date
CN117472220A true CN117472220A (en) 2024-01-30

Family

ID=89630120

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311199825.6A Pending CN117472220A (en) 2023-09-15 2023-09-15 Operation identification method and device

Country Status (1)

Country Link
CN (1) CN117472220A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102298503A (en) * 2011-09-27 2011-12-28 汉王科技股份有限公司 Method and device for displaying contents on mobile terminal list interface
US20120304131A1 (en) * 2011-05-27 2012-11-29 Jennifer Nan Edge gesture
CN103677576A (en) * 2012-09-14 2014-03-26 腾讯科技(深圳)有限公司 Method and device for closing window
CN105843535A (en) * 2016-03-18 2016-08-10 深圳市万普拉斯科技有限公司 Control method and control panel, and terminal
WO2018082269A1 (en) * 2016-11-04 2018-05-11 华为技术有限公司 Menu display method and terminal
US20180173414A1 (en) * 2016-07-25 2018-06-21 Beijing Luckey Technology Co., Ltd. Method and device for gesture control and interaction based on touch-sensitive surface to display
CN109656416A (en) * 2018-12-28 2019-04-19 腾讯音乐娱乐科技(深圳)有限公司 A kind of control method based on multi-medium data, device and relevant device
US20210333948A1 (en) * 2020-04-24 2021-10-28 Beijing Xiaomi Mobile Software Co., Ltd. Method, device, and storage medium for controlling display of floating window
WO2022000507A1 (en) * 2020-07-03 2022-01-06 深圳传音控股股份有限公司 Display interface processing method and apparatus, and storage medium
CN116088716A (en) * 2022-06-13 2023-05-09 荣耀终端有限公司 Window management method and terminal equipment
CN116610243A (en) * 2023-05-05 2023-08-18 维沃移动通信有限公司 Display control method, display control device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination