WO2021004352A1 - Interface interaction method and terminal - Google Patents

Interface interaction method and terminal

Info

Publication number
WO2021004352A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch operation
response
hidden
target object
target
Application number
PCT/CN2020/099719
Other languages
English (en)
Chinese (zh)
Inventor
解晓鹏
Original Assignee
维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Publication of WO2021004352A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present disclosure relates to the field of terminal technology, and in particular to an interface interaction method and terminal.
  • the operation method of hiding a note in the related art is shown in Figure 1: 1. Open the note software; 2. Long-press a note to select the note to be hidden; 3. Tap the selected note to set it as private; 4. The selected note disappears from the current interface (it is hidden).
  • the operation method of restoring a hidden note is shown in Figure 3: 1. Open the hidden-notes interface; 2. Long-press a note to select the note to be restored; 3. Tap the selected note to cancel its private setting; 4. The selected note disappears from the current interface (it returns to the interface it occupied before being hidden), while the current interface remains the hidden-notes interface.
  • in the related art, hiding, displaying, and restoring notes is complicated: the user must tap multiple times, and a lock icon is shown when notes are displayed, so other users can easily tell that hidden notes exist, which provides insufficient privacy.
  • moreover, hiding and restoring can only be applied to one note at a time; when multiple notes need to be hidden or restored, they must be handled one by one, which makes for a poor user experience.
  • the purpose of the present disclosure is to provide an interface interaction method and terminal, so as to solve the problem of complicated operation and insufficient privacy in the hiding scheme of the sticky note in the related art.
  • embodiments of the present disclosure provide an interface interaction method applied to a terminal, and the method includes: displaying a first object, a second object, and a first target object located between the first object and the second object; receiving a first touch operation for the first object and the second object; and, in response to the first touch operation, hiding the first target object.
  • the hiding the first target object in response to the first touch operation specifically includes: controlling the first object and the second object to move toward each other in response to the first touch operation; and hiding the first target object when a first distance between the first object and the second object is less than or equal to a first threshold.
  • hiding the first target object specifically includes: hiding the first target object when the first distance between the first object and the second object is less than or equal to the first threshold and at least one of the first object and the second object is no longer touched.
  • the method further includes: outputting first prompt information when, in the process of controlling the first object and the second object to move toward each other in response to the first touch operation, the first distance between the first object and the second object is less than or equal to the first threshold.
  • the method further includes: displaying a third object and a fourth object; receiving a second touch operation for the third object and the fourth object; and, in response to the second touch operation, displaying a hidden second target object between the third object and the fourth object; wherein
  • the second target object is located between the third object and the fourth object before being hidden.
  • the displaying the hidden second target object between the third object and the fourth object in response to the second touch operation includes: controlling the third object and the fourth object to move away from each other in response to the second touch operation; and displaying the second target object between the third object and the fourth object when a second distance between the third object and the fourth object is greater than or equal to a second threshold.
  • displaying the second target object between the third object and the fourth object specifically includes: displaying the second target object between the third object and the fourth object when the second distance between the third object and the fourth object is greater than or equal to the second threshold and at least one of the third object and the fourth object is no longer touched.
  • the method further includes: outputting second prompt information when, in the process of controlling the third object and the fourth object to move away from each other in response to the second touch operation, the second distance between the third object and the fourth object is less than or equal to the second threshold.
  • the method further includes: after the first target object is hidden in response to the first touch operation, outputting third prompt information corresponding to the first target object when a preset operation instruction for the first object and/or the second object is detected;
  • the preset operation instruction includes at least one of a deletion instruction and a position movement instruction.
  • the embodiments of the present disclosure also provide a terminal, the terminal including:
  • a first display module configured to display a first object, a second object, and a first target object located between the first object and the second object;
  • a first receiving module configured to receive a first touch operation for the first object and the second object
  • the first processing module is configured to hide the first target object in response to the first touch operation.
  • the first processing module specifically includes:
  • the first control sub-module is configured to control the first object and the second object to move toward each other in response to the first touch operation
  • the first processing submodule is configured to hide the first target object when the first distance between the first object and the second object is less than or equal to a first threshold.
  • the first processing submodule specifically includes:
  • the first processing unit is configured to hide the first target object when it is determined that the first distance between the first object and the second object is less than or equal to a first threshold and at least one of the first object and the second object is no longer touched.
  • the terminal further includes:
  • the first output module is configured to output first prompt information when, in the process of controlling the first object and the second object to move toward each other in response to the first touch operation, the first distance between the first object and the second object is less than or equal to the first threshold.
  • the terminal further includes:
  • the second display module is used to display the third object and the fourth object
  • a second receiving module configured to receive a second touch operation for the third object and the fourth object
  • a second processing module configured to display the hidden second target object between the third object and the fourth object in response to the second touch operation
  • the second target object is located between the third object and the fourth object before being hidden.
  • the second processing module includes:
  • the second control sub-module is configured to control the third object and the fourth object to move away from each other in response to the second touch operation;
  • a second processing sub-module, configured to display the second target object between the third object and the fourth object when the second distance between the third object and the fourth object is greater than or equal to a second threshold.
  • the second processing submodule specifically includes:
  • the second processing unit is configured to display the second target object between the third object and the fourth object when it is determined that the second distance between the third object and the fourth object is greater than or equal to a second threshold and at least one of the third object and the fourth object is no longer touched.
  • the terminal further includes:
  • the second output module is configured to output second prompt information when, in the process of controlling the third object and the fourth object to move away from each other in response to the second touch operation, the second distance between the third object and the fourth object is less than or equal to the second threshold.
  • the terminal further includes:
  • the third output module is configured to, after the first target object is hidden in response to the first touch operation, output third prompt information corresponding to the first target object when a preset operation instruction for the first object and/or the second object is detected;
  • the preset operation instruction includes at least one of a deletion instruction and a position movement instruction.
  • the embodiments of the present disclosure also provide a terminal, including a processor, a memory, and a computer program stored on the memory and capable of running on the processor, where the computer program, when executed by the processor, implements the steps of the interface interaction method described above.
  • embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, the steps of the interface interaction method described above are implemented.
  • by displaying a first object, a second object, and a first target object located between them, receiving a first touch operation for the first object and the second object, and hiding the first target object in response to that operation, it is no longer necessary to long-press a note and tap a button when hiding it: the configured two-finger gesture hides notes (easy to operate) and handles them in a more discreet and friendly way, giving users a safer and more reliable experience. The solution can also hide multiple notes at once, which is convenient, fast, and more engaging; in addition, the hiding process is completed on a single interface, with no need to switch back and forth between multiple interfaces, which can save costs.
  • Figure 1 is a schematic diagram of a sticky note hiding process in related technologies
  • Figure 2 is a schematic diagram of a hidden note viewing process in related technologies
  • Figure 3 is a schematic diagram of the process of restoring hidden notes in related technologies
  • FIG. 4 is a schematic flowchart of an interface interaction method according to an embodiment of the disclosure.
  • FIG. 5 is a schematic diagram 1 of hiding a note based on a two-finger operation according to an embodiment of the present disclosure
  • FIG. 6 is a second schematic diagram of hiding notes based on two-finger operations according to an embodiment of the disclosure.
  • FIG. 7 is a schematic diagram of a process of hiding a note based on a two-finger operation according to an embodiment of the disclosure.
  • FIG. 8 is a schematic diagram of viewing hidden notes based on a two-finger operation according to an embodiment of the disclosure.
  • FIG. 9 is a schematic diagram of a flow of viewing hidden notes based on two-finger operations according to an embodiment of the disclosure.
  • FIG. 10 is a schematic diagram of restoring hidden notes based on two-finger operations according to an embodiment of the disclosure.
  • FIG. 11 is a schematic diagram of the process of restoring hidden notes based on two-finger operations according to an embodiment of the disclosure.
  • FIG. 12 is a first schematic diagram of a terminal structure according to an embodiment of the disclosure.
  • FIG. 13 is a second structural diagram of a terminal according to an embodiment of the disclosure.
  • the present disclosure provides an interface interaction method applied to a terminal, as shown in FIG. 4, the method includes:
  • Step 41: Display a first object, a second object, and a first target object located between the first object and the second object.
  • the object can be a sticky note, picture, text, folder, video, etc., but it is not limited to this.
  • it may be displaying an object list including the first object, the second object, and the first target object on the first interface.
  • Step 42: Receive a first touch operation for the first object and the second object.
  • the first touch operation may be formed based on a two-finger operation: specifically, it may be formed based on a selection operation of selecting the first object and the second object, such as a sliding operation following a two-finger tap (i.e., a selection operation), but it is not limited to this.
  • Step 43: In response to the first touch operation, hide the first target object.
  • the first target object can be encrypted.
  • the encryption may be performed after responding to the first touch operation and before hiding the first target object, by displaying an encryption interface on which the user chooses an encryption method and enters the corresponding password (such as a digital password or fingerprint information); it may also use a default encryption method with a preset password, encrypting the first target object at the same time as it is hidden; or, in the process of responding to the first touch operation, the identity information of the operator may be obtained from the input of the first touch operation (for example, the fingerprint of the finger that performed it) and used for encryption. This is not limited here.
  • the interface interaction method displays a first object, a second object, and a first target object located between the first object and the second object; receives a first touch operation for the first object and the second object; and, in response to the first touch operation, hides the first target object. It is no longer necessary to long-press a note and tap a button when hiding it: the configured two-finger gesture hides notes (easy to operate) and handles them in a more discreet and friendly way, giving users a safer and more reliable experience. The solution can hide multiple notes at once, which is convenient, fast, and more engaging; in addition, the hiding process is completed on a single interface, with no need to switch back and forth between multiple interfaces, which can save costs.
  • hiding the first target object specifically includes: in response to the first touch operation, controlling the first object and the second object to move toward each other (in directions approaching each other); and hiding the first target object when the first distance between the first object and the second object is less than or equal to a first threshold.
  • the first distance may specifically be: the first distance between the first real-time position after the movement of the first object and the second real-time position after the movement of the second object;
  • the distance between the real-time positions of the two objects can be the distance between the centers of the objects, or the distance between the corresponding (adjacent) edges of the two objects, for example, label 1 in Figure 5.
  • the first threshold may be 0 cm.
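As a minimal sketch (invented for illustration; the function names, data shapes, and the adjacent-edge distance choice are assumptions, not part of the patent text), the hide condition above can be expressed as: hide every object lying between the two selected objects once their adjacent-edge distance falls to or below the first threshold.

```python
def edge_distance(a, b):
    """Distance between the adjacent edges of two 1-D extents (start, end)."""
    first, second = sorted([a, b])
    return max(0.0, second[0] - first[1])

def hide_between(objects, i, j, pos, first_threshold=0.0):
    """Hide every object strictly between selected indices i and j once the
    dragged extents in `pos` come within first_threshold of each other.
    `objects` is a list of dicts carrying a 'hidden' flag."""
    if edge_distance(pos[i], pos[j]) > first_threshold:
        return False  # objects not close enough yet; nothing is hidden
    for k in range(min(i, j) + 1, max(i, j)):
        objects[k]["hidden"] = True
    return True
```

With the first threshold of 0 cm mentioned above, the two selected objects must actually touch before anything between them is hidden.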
  • hiding the first target object specifically includes: hiding the first target object when the first distance between the first object and the second object is less than or equal to a first threshold and at least one of the first object and the second object is no longer touched (no longer selected).
  • the method further includes: outputting first prompt information when the first distance between the first object and the second object is less than or equal to the first threshold.
  • the prompt tells the user that the first target object will be hidden if the first object and the second object are no longer selected.
  • the prompt can be semi-transparent text, but it is not limited to this; it can also take a form such as voice.
  • the method further includes: displaying a fifth object and a sixth object; receiving a third touch operation for the fifth object and the sixth object (a preset display instruction); and, in response to the third touch operation, displaying a hidden third target object on a hidden-display interface (different from the interface displaying the first object and the second object); wherein the third target object is located between the fifth object and the sixth object before being hidden (that is, before being hidden, the display position of the third target object is between the display positions of the fifth object and the sixth object, and these objects can be displayed on one interface).
  • the third touch operation can be formed based on a two-finger operation: specifically, based on the selection operation of the fifth object and the sixth object. For example, where the selection operation is a two-finger tap, the third touch operation is formed by the two-finger tap followed by a continued slide toward the right or left of the terminal screen, but it is not limited to this.
  • the decryption can be performed by displaying a decryption interface before displaying the third target object, on which the user enters a password (such as a digital password or fingerprint information) to decrypt; or the identity information of the operator is obtained from the input of the third touch operation (for example, the fingerprint of the finger that performed it) and authenticated; if authentication passes, the third target object is displayed, otherwise it is not. This is not limited here.
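A hedged sketch of the authentication gate described above (everything here is invented for illustration; the patent does not prescribe a data model): the hidden object is displayed only if the fingerprint captured during the touch operation matches an enrolled one.

```python
def reveal_if_authenticated(target, captured_fingerprint, enrolled):
    """Display the hidden target only when the fingerprint captured from the
    touch input matches one of the user's enrolled fingerprints."""
    if captured_fingerprint in enrolled:
        target["visible"] = True
        return True
    return False  # authentication failed: the target stays hidden
```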
  • the method further includes: on the hidden-display interface, obtaining a fourth touch operation (a hiding instruction); and, in response to the fourth touch operation, hiding the hidden-display interface. The fourth touch operation can be formed by a two-finger tap in a blank area of the hidden-display interface followed by a continued slide toward the left or right of the terminal screen. Where a sliding operation forms the third touch operation, this sliding operation may be in the opposite direction to make it easy for the user to remember, but it is not limited to this.
  • the encryption operation can also be performed on the objects in the hidden-display interface.
  • the method further includes: displaying a third object and a fourth object; receiving a second touch operation for the third object and the fourth object; and, in response to the second touch operation, displaying the hidden second target object between the third object and the fourth object; wherein the second target object is located between the third object and the fourth object before being hidden.
  • the second touch operation may be formed based on a two-finger operation: specifically, it may be formed based on a selection operation of selecting the third object and the fourth object, such as a sliding operation following a two-finger tap (i.e., a selection operation), but it is not limited to this.
  • This can facilitate the restoration of the hidden target object.
  • the decryption can be performed after responding to the second touch operation and before restoring the hidden second target object, by displaying a decryption interface on which the user enters a password (such as a digital password or fingerprint information); or, in the process of responding to the second touch operation, the identity information of the operator is obtained from the input of the second touch operation (for example, the fingerprint of the finger that performed it) and authenticated; if authentication passes, the second target object is restored, otherwise it is not. This is not limited here.
  • displaying the hidden second target object between the third object and the fourth object includes: in response to the second touch operation, controlling the third object and the fourth object to move apart (in directions away from each other); and displaying the second target object between the third object and the fourth object when the second distance between the third object and the fourth object is greater than or equal to a second threshold.
  • the second distance may specifically be: the second distance between the third real-time position after the movement of the third object and the fourth real-time position after the movement of the fourth object; the second threshold may be 5 cm.
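Mirroring the hide sketch, the restore condition can be modeled as follows (an illustrative sketch under the same invented data shapes; the 5 cm value is the example second threshold given above):

```python
def restore_between(objects, i, j, pos, second_threshold=5.0):
    """Un-hide the objects between selected indices i and j once the dragged
    extents in `pos` are at least second_threshold apart. Returns the list
    of restored indices (empty if the spread is still too small)."""
    first, second = sorted([pos[i], pos[j]])
    gap = second[0] - first[1]  # distance between adjacent edges
    if gap < second_threshold:
        return []
    restored = [k for k in range(min(i, j) + 1, max(i, j))
                if objects[k]["hidden"]]
    for k in restored:
        objects[k]["hidden"] = False
    return restored
```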
  • displaying the second target object between the third object and the fourth object specifically includes: displaying the second target object between the third object and the fourth object when the second distance between the third object and the fourth object is greater than or equal to a second threshold and at least one of the third object and the fourth object is no longer touched (no longer selected).
  • the method further includes: outputting second prompt information when the second distance between the third object and the fourth object is less than or equal to the second threshold.
  • the prompt tells the user that if the third object and the fourth object are no longer selected, the second target object will be restored and displayed (specifically, at the position it occupied before being hidden).
  • the prompt can be semi-transparent text, but it is not limited to this; it can also take a form such as voice.
  • the method further includes: outputting third prompt information corresponding to the first target object when a preset operation instruction for the first object and/or the second object is detected;
  • the preset operation instruction includes at least one of a deletion instruction and a position movement instruction.
  • the third prompt information is used to prompt the user that the first target object is hidden.
  • the interface interaction method provided by the embodiments of the present disclosure will be further explained below.
  • below, the object is a sticky note and the related touch operations are two-finger operations, as an example.
  • the embodiments of the present disclosure provide an interface interaction method, which can specifically be a method of hiding, displaying, and restoring notes based on two-finger operations; it handles notes in a more discreet and friendly way, giving users a more secure and reliable experience.
  • this solution no longer requires long-pressing a note and tapping a button to hide or restore it; instead, notes are hidden and restored through the configured two-finger gesture operations together with related text prompts. Multiple notes can be operated on at once, enhancing the privacy of the notes and the user's experience, and thereby the user's perceived value.
  • the method of hiding, displaying, and restoring notes based on two-finger operations removes the need for users to long-press notes and tap the set-private and cancel-private icons, bringing a friendlier visual interaction experience; multiple notes can be hidden at once, which is convenient and fast.
  • the process of hiding notes based on two-finger operations can be as shown in Figure 7, including:
  • Step 71: Start;
  • Step 72: Open the note software;
  • Step 73: Identify that the user's two fingers are located on two sticky notes respectively;
  • Step 74: Determine whether the two fingers slide toward each other at the same time until the two notes touch; if yes, go to step 76; if not, go to step 75;
  • Step 75: Determine whether the user's two fingers have returned to their original positions; if yes, go to step 79; if not, go back to step 74;
  • Step 76: Provide a text prompt;
  • Step 77: Determine whether the user's two fingers are released (leave the notes); if yes, go to step 78; if not, go back to step 75;
  • Step 78: Hide all the notes between the two notes;
  • Step 79: End.
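The steps above can be sketched as a small event loop (a hypothetical model; the event names and data structures are invented, not part of the patent):

```python
def hide_flow(events, notes, selected):
    """Walk the Figure-7 hide flow over a stream of gesture events.
    events: sequence of "pinch_closed", "fingers_restored", or "released".
    Notes between the selected pair are hidden only if the two notes are
    pinched together (step 74) and the fingers are then lifted (steps 77-78)."""
    notes_touching = False
    for event in events:
        if event == "pinch_closed":
            notes_touching = True          # step 74 -> text prompt (step 76)
        elif event == "fingers_restored":
            return False                   # step 75 -> end (step 79)
        elif event == "released" and notes_touching:
            for k in range(min(selected) + 1, max(selected)):
                notes[k]["hidden"] = True  # step 78: hide all notes between
            return True
    return False
```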
  • the process of viewing hidden notes based on two-finger operations is shown in Figure 8, and mainly includes: with two fingers on the two notes at the same time, both fingers slide to the right simultaneously, and it is determined whether there is a hidden note between the two notes. If there is, the hidden note gradually appears as the fingers slide right (the page shows the two interfaces pushing each other); if not, there is no response. To return, simply slide two fingers to the left on the hidden-notes interface.
  • the flow of viewing hidden notes based on two-finger operations may be as shown in Figure 9, including:
  • Step 91: Start;
  • Step 92: Open the note software;
  • Step 93: Identify that the user's two fingers are located on two sticky notes respectively;
  • Step 94: Determine whether the two fingers slide to the right at the same time; if yes, go to step 96; if not, go to step 95;
  • Step 95: Display the current memo interface, and go to step 911;
  • Step 96: Determine whether there is a hidden note (that is, the aforementioned third target object) between the two notes selected by the two fingers; if yes, go to step 97; if not, go to step 911;
  • Step 97: The hidden notes are gradually displayed;
  • Step 98: Determine whether the two fingers slide to the left at the same time; if yes, go to step 99; if not, go to step 910;
  • Step 99: The hidden notes gradually disappear, and go to step 911;
  • Step 910: Display the hidden-notes interface;
  • Step 911: End.
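The peek-and-dismiss behaviour of this flow can be sketched as follows (illustrative only; function and event names are invented):

```python
def view_flow(swipes, has_hidden_between):
    """Figure-9 viewing flow: a simultaneous two-finger right swipe reveals
    hidden notes between the selected pair (steps 94-97); a left swipe makes
    them disappear again (steps 98-99). Returns the final interface shown."""
    showing_hidden = False
    for direction in swipes:
        if direction == "right" and has_hidden_between:
            showing_hidden = True   # hidden notes gradually appear
        elif direction == "left":
            showing_hidden = False  # hidden notes gradually disappear
    return "hidden" if showing_hidden else "current"
```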
  • the process of restoring hidden notes based on two-finger operations is shown in Figure 10.
  • the flow of restoring hidden notes based on two-finger operations can be as shown in Figure 11, including:
  • Step 111: Start;
  • Step 112: Open the note software;
  • Step 113: Identify that the user's two fingers are located on two sticky notes respectively;
  • Step 114: Determine whether the two fingers slide apart to a certain distance at the same time; if yes, go to step 116; if not, go to step 115;
  • Step 115: Determine whether the user's two fingers have returned to their original positions; if yes, go to step 1110; if not, go back to step 114;
  • Step 116: Determine whether there is a hidden note (that is, the aforementioned second target object) between the two selected notes; if yes, go to step 117; if not, go to step 1110;
  • Step 117: Provide a text prompt;
  • Step 118: Determine whether the user's two fingers are released (leave the notes); if yes, go to step 119; if not, go back to step 115;
  • Step 119: The hidden notes are restored;
  • Step 1110: End.
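Combining the spread gesture, the hidden-note check, and the release step, the restore flow can be sketched as a hypothetical model (event tuples and thresholds are invented for illustration):

```python
def restore_flow(events, notes, selected, second_threshold=5.0):
    """Walk the Figure-11 restore flow. events: sequence of (name, value)
    tuples, e.g. ("spread", 6.0), ("fingers_restored", 0), ("released", 0).
    Hidden notes between the pair are restored only if a hidden note exists
    (step 116), the spread passes the threshold (step 114), and the fingers
    are then released (steps 118-119)."""
    between = range(min(selected) + 1, max(selected))
    if not any(notes[k]["hidden"] for k in between):
        return False                        # step 116: nothing to restore
    spread_enough = False
    for name, value in events:
        if name == "spread" and value >= second_threshold:
            spread_enough = True            # step 114 -> text prompt (117)
        elif name == "fingers_restored":
            return False                    # step 115 -> end (step 1110)
        elif name == "released" and spread_enough:
            for k in between:
                notes[k]["hidden"] = False  # step 119: restore hidden notes
            return True
    return False
```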
  • the solution provided by the embodiments of the present disclosure realizes hiding, displaying, and restoring sticky notes in a more private way; combined with the corresponding (semi-transparent) text prompts, it gives users a friendlier interactive experience and visual impact, feels more comfortable and convenient to operate, and is more engaging. The original long-press-and-tap approach inevitably feels rigid and procedural, offering users little novelty or engagement. Further, in this solution, hiding or restoring notes is done on one interface, with no need to switch back and forth between multiple interfaces, which can save costs.
  • hiding notes or restoring notes can operate on multiple notes at the same time, which is more convenient and faster than only operating on a single note at a time.
  • the embodiment of the present disclosure also provides a terminal, as shown in FIG. 12, including:
  • the first display module 121 is configured to display a first object, a second object, and a first target object located between the first object and the second object;
  • the first receiving module 122 is configured to receive a first touch operation for the first object and the second object;
  • the first processing module 123 is configured to hide the first target object in response to the first touch operation.
  • the terminal displays a first object, a second object, and a first target object located between them through the first display module 121; the first receiving module 122 receives a first touch operation for the first object and the second object; and the first processing module 123 hides the first target object in response to the first touch operation. It is no longer necessary to long-press a note and tap a button to hide it: the configured two-finger gesture hides the note (easy to operate) and handles it in a more discreet and friendly way, giving users a safer and more reliable experience;
  • this solution can operate multiple notes at once, which is convenient and fast, and has higher playability; in addition, the process of hiding notes in this solution is completed on the same interface, and multiple interfaces are not required to switch back and forth, which can save costs.
  • the first processing module specifically includes: a first control sub-module, configured to control the first object and the second object to move toward each other in response to the first touch operation; and a first processing sub-module, configured to hide the first target object when the first distance between the first object and the second object is less than or equal to a first threshold.
  • the first processing sub-module specifically includes: a first processing unit, configured to hide the first target object when the first distance between the first object and the second object is less than or equal to a first threshold and at least one of the first object and the second object is no longer touched.
  • the terminal further includes: a first output module, configured to output first prompt information when, while the first object and the second object are controlled to move toward each other in response to the first touch operation, the first distance between the first object and the second object is less than or equal to the first threshold.
  • the terminal further includes: a second display module, configured to display a third object and a fourth object; a second receiving module, configured to receive a second touch operation for the third object and the fourth object; and a second processing module, configured to display a hidden second target object between the third object and the fourth object in response to the second touch operation; where the second target object was located between the third object and the fourth object before being hidden.
  • the second processing module includes: a second control sub-module, configured to control the third object and the fourth object to move away from each other in response to the second touch operation; and a second processing sub-module, configured to display the second target object between the third object and the fourth object when a second distance between the third object and the fourth object is greater than or equal to a second threshold.
  • the second processing sub-module specifically includes: a second processing unit, configured to display the second target object between the third object and the fourth object when the second distance between the third object and the fourth object is greater than or equal to the second threshold and at least one of the third object and the fourth object is no longer touched.
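The reveal condition is the mirror image of the hide condition: the flanking objects are spread apart past a threshold and a finger lifts. A minimal sketch, again with assumed names and an illustrative threshold not taken from the disclosure:

```python
import math

SECOND_THRESHOLD = 160.0  # pixels; illustrative value, not specified in the disclosure

def should_reveal(third_pos, fourth_pos, still_touched):
    """Reveal the hidden target between the third and fourth objects when
    they have been spread apart (second distance >= second threshold) and
    at least one of the two objects is no longer being touched."""
    gap = math.hypot(third_pos[0] - fourth_pos[0], third_pos[1] - fourth_pos[1])
    far_enough = gap >= SECOND_THRESHOLD
    finger_lifted = not all(still_touched)
    return far_enough and finger_lifted
```

Requiring the finger lift as the final trigger lets the user abort mid-gesture: keeping both fingers down and moving the objects back never commits the reveal.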
  • the terminal further includes: a second output module, configured to output second prompt information when, while the third object and the fourth object are controlled to move away from each other in response to the second touch operation, the second distance between the third object and the fourth object is greater than or equal to the second threshold.
  • the terminal further includes: a third output module, configured to, after the first target object is hidden in response to the first touch operation, output third prompt information corresponding to the first target object when a preset operation instruction for the first object and/or the second object is detected; where the preset operation instruction includes at least one of a deletion instruction and a position movement instruction.
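The third output module's guard can be sketched as a check performed before a delete or move is applied to an object that anchors a hidden note. The registry structure, function name, and prompt text below are illustrative assumptions, not part of the disclosure:

```python
# Assumed registry: maps a visible object's id to the hidden target anchored to it.
PRESET_INSTRUCTIONS = {"delete", "move"}

def guard_operation(instruction, object_id, hidden_registry):
    """Return prompt text if a preset instruction (delete or position move)
    targets an object with a hidden note anchored to it; otherwise None."""
    if instruction in PRESET_INSTRUCTIONS and object_id in hidden_registry:
        return f"A hidden note is anchored to object {object_id}; proceed?"
    return None
```

Prompting only for deletion and position-movement instructions matches the two preset operations named above; any other instruction passes through without a warning.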
  • the terminal provided by the embodiment of the present disclosure can implement the various processes implemented by the terminal in the method embodiments of FIG. 1 to FIG. 11. In order to avoid repetition, details are not described herein again.
  • the terminal 130 includes but is not limited to: a radio frequency unit 131, a network module 132, an audio output unit 133, an input unit 134, a sensor 135, a display unit 136, a user input unit 137, an interface unit 138, a memory 139, a processor 1310, a power supply 1311, and other components.
  • the terminal structure shown in FIG. 13 does not constitute a limitation on the terminal; the terminal may include more or fewer components than shown in the figure, combine certain components, or use a different arrangement of components.
  • terminals include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, and pedometers.
  • the display unit 136 is used to display the first object, the second object, and the first target object located between the first object and the second object;
  • the user input unit 137 is used to receive a first touch operation for the first object and the second object;
  • the processor 1310 is configured to hide the first target object in response to the first touch operation.
  • the first object, the second object, and the first target object located between the first object and the second object are displayed through the display unit 136; the user input unit 137 receives a first touch operation for the first object and the second object; and the processor 1310 hides the first target object in response to the first touch operation. It is no longer necessary to press and hold the note and tap a button to hide it: the note can be hidden through a preset two-finger gesture that is easy to perform, and can be operated in a more discreet and user-friendly way, giving users a safer and more reliable experience. This solution can also operate on multiple notes at once, which is convenient and fast and offers a richer interaction; in addition, the process of hiding notes is completed on a single interface, without switching back and forth between multiple interfaces, which saves overhead.
  • the radio frequency unit 131 can be used to receive and send signals while sending and receiving information or during a call. Specifically, downlink data from the base station is received and then handed to the processor 1310 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 131 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 131 can also communicate with the network and other devices through a wireless communication system.
  • the terminal provides users with wireless broadband Internet access through the network module 132, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 133 may convert the audio data received by the radio frequency unit 131 or the network module 132 or stored in the memory 139 into an audio signal and output it as sound. Moreover, the audio output unit 133 may also provide audio output related to a specific function performed by the terminal 130 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 133 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 134 is used to receive audio or video signals.
  • the input unit 134 may include a graphics processing unit (GPU) 1341 and a microphone 1342. The graphics processor 1341 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame may be displayed on the display unit 136.
  • the image frame processed by the graphics processor 1341 may be stored in the memory 139 (or other storage medium) or sent via the radio frequency unit 131 or the network module 132.
  • the microphone 1342 can receive sound, and can process such sound into audio data.
  • in a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 131 for output.
  • the terminal 130 also includes at least one sensor 135, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1361 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 1361 and/or the backlight when the terminal 130 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify terminal posture (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and vibration-recognition-related functions (such as a pedometer or percussion detection). The sensor 135 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which will not be repeated here.
  • the display unit 136 is used to display information input by the user or information provided to the user.
  • the display unit 136 may include a display panel 1361, and the display panel 1361 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 137 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the terminal.
  • the user input unit 137 includes a touch panel 1371 and other input devices 1372.
  • the touch panel 1371, also known as a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1371 using a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1371 may include two parts, a touch detection device and a touch controller.
  • the touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 1310, and receives and executes the commands sent by the processor 1310.
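The touch-controller pipeline described above (raw detection signal → contact coordinates → dispatch to the processor) can be sketched minimally. The class name, scale model, and dispatch callback below are illustrative assumptions standing in for the hardware path, not an implementation from the disclosure:

```python
class TouchController:
    """Illustrative model of the touch controller: converts raw signals
    from the touch detection device into contact coordinates and forwards
    them to a dispatch callback standing in for the processor 1310."""

    def __init__(self, dispatch):
        self.dispatch = dispatch  # callable receiving contact coordinates

    def on_raw_signal(self, raw_x, raw_y, scale=1.0):
        # Convert detector units to screen coordinates (assumed linear scaling).
        coords = (raw_x * scale, raw_y * scale)
        return self.dispatch(coords)
```

With `dispatch = lambda c: c`, a raw signal of `(10, 20)` at a scale of `2.0` yields the contact coordinates `(20.0, 40.0)`.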
  • the touch panel 1371 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 137 may also include other input devices 1372.
  • other input devices 1372 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 1371 can cover the display panel 1361.
  • when the touch panel 1371 detects a touch operation on or near it, it transmits the operation to the processor 1310 to determine the type of the touch event, and the processor 1310 then provides corresponding visual output on the display panel 1361 according to the type of the touch event.
  • the touch panel 1371 and the display panel 1361 are used as two independent components to implement the input and output functions of the terminal, but in some embodiments the touch panel 1371 and the display panel 1361 can be integrated to implement the input and output functions of the terminal; this is not limited here.
  • the interface unit 138 is an interface for connecting an external device and the terminal 130.
  • the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
  • the interface unit 138 may be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements in the terminal 130, or may be used to transfer data between the terminal 130 and the external device.
  • the memory 139 can be used to store software programs and various data.
  • the memory 139 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data or a phone book).
  • the memory 139 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 1310 is the control center of the terminal. It connects the various parts of the entire terminal using various interfaces and lines, and performs the various functions of the terminal and processes data by running or executing the software programs and/or modules stored in the memory 139 and calling the data stored in the memory 139, thereby monitoring the terminal as a whole.
  • the processor 1310 may include one or more processing units; optionally, the processor 1310 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may also not be integrated into the processor 1310.
  • the terminal 130 may also include a power source 1311 (such as a battery) for supplying power to various components.
  • the power source 1311 may be logically connected to the processor 1310 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
  • the terminal 130 includes some functional modules not shown, which will not be repeated here.
  • an embodiment of the present disclosure further provides a terminal, including a processor 1310, a memory 139, and a computer program stored in the memory 139 and runnable on the processor 1310; when the computer program is executed by the processor 1310, each process of the foregoing interface interaction method embodiments is implemented, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the foregoing interface interaction method embodiments is implemented, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the computer-readable storage medium is, for example, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disk.
  • the technical solution of the present disclosure, in essence or in the part contributing to the related technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and causes a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present disclosure.

Abstract

The present disclosure relates to an interface interaction method and a terminal. The interface interaction method includes: displaying a first object, a second object, and a first target object located between the first object and the second object (41); receiving a first touch operation for the first object and the second object (42); and hiding the first target object in response to the first touch operation (43).
PCT/CN2020/099719 2019-07-10 2020-07-01 Procédé d'interaction d'interface et terminal WO2021004352A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910619914.9 2019-07-10
CN201910619914.9A CN110377219B (zh) 2019-07-10 2019-07-10 一种界面交互方法及终端

Publications (1)

Publication Number Publication Date
WO2021004352A1 true WO2021004352A1 (fr) 2021-01-14

Family

ID=68250886

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/099719 WO2021004352A1 (fr) 2019-07-10 2020-07-01 Procédé d'interaction d'interface et terminal

Country Status (2)

Country Link
CN (1) CN110377219B (fr)
WO (1) WO2021004352A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110377219B (zh) * 2019-07-10 2021-04-20 维沃移动通信有限公司 一种界面交互方法及终端

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102455844A (zh) * 2010-10-21 2012-05-16 英华达(上海)电子有限公司 电子书及其控制方法
CN102880837A (zh) * 2012-08-24 2013-01-16 腾讯科技(深圳)有限公司 提高移动终端安全性的方法和移动终端
CN104463004A (zh) * 2013-09-24 2015-03-25 北京三星通信技术研究有限公司 一种保护界面内容的方法和装置
CN110377219A (zh) * 2019-07-10 2019-10-25 维沃移动通信有限公司 一种界面交互方法及终端

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8065603B2 (en) * 2007-04-30 2011-11-22 Google Inc. Hiding portions of display content
KR20130143160A (ko) * 2012-06-20 2013-12-31 삼성전자주식회사 휴대단말기의 스크롤 제어장치 및 방법
US8904304B2 (en) * 2012-06-25 2014-12-02 Barnesandnoble.Com Llc Creation and exposure of embedded secondary content data relevant to a primary content page of an electronic book
CN111984165B (zh) * 2013-09-29 2022-07-08 小米科技有限责任公司 一种显示消息的方法、装置及终端设备
CN103927495B (zh) * 2014-04-16 2016-05-25 努比亚技术有限公司 隐藏对象的方法和装置
CN105607732A (zh) * 2014-11-20 2016-05-25 阿里巴巴集团控股有限公司 展现对象信息的方法和装置
CN104808919A (zh) * 2015-04-29 2015-07-29 努比亚技术有限公司 界面显示控制方法和装置

Also Published As

Publication number Publication date
CN110377219A (zh) 2019-10-25
CN110377219B (zh) 2021-04-20

Similar Documents

Publication Publication Date Title
WO2021098678A1 (fr) Procédé de commande de vidéocapture d'écran et dispositif électronique
WO2020134813A1 (fr) Procédé de commande d'exploitation et terminal
WO2019228294A1 (fr) Procédé de partage d'objet et terminal mobile
WO2019174611A1 (fr) Procédé de configuration d'application et terminal mobile
WO2020156111A1 (fr) Terminal et procédé d'affichage d'interface
WO2020151516A1 (fr) Procédé d'envoi de message et terminal mobile
WO2020011074A1 (fr) Procédé de verrouillage d'écran et dispositif électronique
WO2020020126A1 (fr) Procédé de traitement d'informations et terminal
CN108646958B (zh) 一种应用程序启动方法及终端
WO2020238449A1 (fr) Procédé de traitement de messages de notification et terminal
WO2020057257A1 (fr) Procédé de basculement d'interface d'application et terminal mobile
WO2020238463A1 (fr) Procédé de traitement de messages et terminal
WO2020259091A1 (fr) Procédé d'affichage de contenu d'écran et terminal
WO2020042892A1 (fr) Procédé de changement de mode d'appel et dispositif terminal
WO2021004327A1 (fr) Procédé de définition d'autorisation d'application, et dispositif terminal
WO2020199987A1 (fr) Procédé d'affichage de message et terminal mobile
WO2020238408A1 (fr) Procédé d'affichage d'icône d'application et terminal
WO2020233218A1 (fr) Procédé de chiffrement d'informations, procédé de déchiffrement d'informations et terminal
WO2020199783A1 (fr) Procédé d'affichage d'interface et dispositif terminal
WO2019223492A1 (fr) Procédé d'affichage d'informations, terminal mobile, et support de stockage lisible par ordinateur
CN109815676B (zh) 一种隐私空间操作方法及终端设备
WO2021004306A1 (fr) Procédé et terminal de commande d'opérations
WO2019114522A1 (fr) Procédé de commande d'écran, appareil de commande d'écran et terminal mobile
WO2020038166A1 (fr) Procédé d'exploitation d'une application de bureau, et terminal
WO2020078234A1 (fr) Procédé de commande d'affichage et terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20837511

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 20837511

Country of ref document: EP

Kind code of ref document: A1
