CN110377219B - Interface interaction method and terminal - Google Patents

Interface interaction method and terminal

Info

Publication number
CN110377219B
CN110377219B (application CN201910619914.9A)
Authority
CN
China
Prior art keywords
touch operation
hidden
target object
terminal
displaying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910619914.9A
Other languages
Chinese (zh)
Other versions
CN110377219A (en)
Inventor
解晓鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201910619914.9A priority Critical patent/CN110377219B/en
Publication of CN110377219A publication Critical patent/CN110377219A/en
Priority to PCT/CN2020/099719 priority patent/WO2021004352A1/en
Application granted granted Critical
Publication of CN110377219B publication Critical patent/CN110377219B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The invention provides an interface interaction method and a terminal. The interface interaction method comprises the following steps: displaying a first object, a second object, and a first target object located between the first object and the second object; receiving a first touch operation directed at the first object and the second object; and hiding the first target object in response to the first touch operation. With this scheme, hiding a note requires neither long-pressing the note nor clicking a button: the note can be hidden through a preset two-finger gesture, which is convenient and fast. In addition, the hiding process is completed on a single interface, with no need to switch back and forth among multiple interfaces, which saves operation overhead.

Description

Interface interaction method and terminal
Technical Field
The invention relates to the technical field of terminals, in particular to an interface interaction method and a terminal.
Background
The main function of a note is to act as a memo, that is, to record and remind. Both at work and in daily life, people inevitably deal with many trivial matters, and to avoid forgetting them they record them in notes. With the spread of mobile internet services, note-taking software on mobile terminals is widely used to record important matters and prevent forgetting.
Some notes are private, and users do not want others to see them, so these notes need to be hidden. The operation for hiding a note in the prior art is shown in fig. 1: 1. open the note software; 2. long-press a note to select the note to be hidden; 3. click the selected note to set it as private; 4. the selected note disappears (is hidden) from the current interface.
Correspondingly, the operation for viewing hidden notes is shown in fig. 2: 1. open the note software; 2. slide a finger downwards to call out an interface with a lock icon; 3. click the lock icon; 4. the interface showing the hidden notes opens.
The operation for restoring a hidden note is shown in fig. 3: 1. open the interface of hidden notes; 2. long-press a note to select the note to be restored; 3. click the selected note to cancel its private status; 4. the selected note disappears from the current interface (it returns to the interface it occupied before being hidden), while the current interface remains the interface of hidden notes.
However, the existing method for hiding notes has the following defects:
1. Insufficient secrecy: since the goal is hiding, the triggering operation should be covert. The "long press" gesture is common, and most users habitually long-press apps or files, so if another person picks up the user's phone, the interface of hidden notes can easily be called up (if the hidden notes are not encrypted). For the user, this design is unfriendly: the purpose of hiding the notes is defeated, which hurts the user's mood and experience.
2. Hiding, viewing and restoring notes is cumbersome: the user must click many times, and a lock icon is displayed alongside the notes, so other users can easily tell that hidden notes exist, which again compromises secrecy. In addition, hiding and restoring can only be applied to one note at a time; when several notes need to be hidden or restored, they must be handled one by one, resulting in a poor user experience.
Disclosure of Invention
The invention aims to provide an interface interaction method and a terminal, so as to solve the problems in the prior art that the note hiding scheme is cumbersome to operate and not covert enough.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an interface interaction method, which is applied to a terminal, and the method includes:
displaying a first object, a second object, and a first target object located between the first object and the second object;
receiving a first touch operation for the first object and the second object;
and hiding the first target object in response to the first touch operation.
Optionally, the hiding the first target object in response to the first touch operation specifically includes:
in response to the first touch operation, controlling the first object and the second object to move toward each other;
hiding the first target object if a first distance between the first object and the second object is less than or equal to a first threshold.
Optionally, hiding the first target object when the first distance between the first object and the second object is less than or equal to a first threshold specifically includes:
hiding the first target object if a first distance between the first object and the second object is less than or equal to a first threshold and at least one of the first object and the second object is no longer being touched.
Optionally, in a process of controlling the first object and the second object to move toward each other in response to the first touch operation, the method further includes:
when a first distance between the first object and the second object is smaller than or equal to the first threshold, outputting first prompt information.
Optionally, the method further includes:
displaying the third object and the fourth object;
receiving a second touch operation for the third object and the fourth object;
displaying a second target object which is hidden between the third object and the fourth object in response to the second touch operation;
wherein the second target object is located between the third object and the fourth object before being hidden.
Optionally, the displaying, in response to the second touch operation, the second target object that is hidden between the third object and the fourth object includes:
in response to the second touch operation, controlling the third object and the fourth object to move away from each other;
displaying the second target object between the third object and the fourth object if a second distance between the third object and the fourth object is greater than or equal to a second threshold.
Optionally, the displaying the second target object between the third object and the fourth object when the second distance between the third object and the fourth object is greater than or equal to a second threshold specifically includes:
displaying the second target object between the third object and the fourth object if a second distance between the third object and the fourth object is greater than or equal to a second threshold and at least one of the third object and the fourth object is no longer being touched.
Optionally, in a process of controlling the third object and the fourth object to move away from each other in response to the second touch operation, the method further includes:
and outputting second prompt information when a second distance between the third object and the fourth object is greater than or equal to the second threshold.
Optionally, after hiding the first target object in response to the first touch operation, the method further includes:
outputting third prompt information corresponding to the first target object when a preset operation instruction aiming at the first object and/or the second object is detected;
the preset operation instruction comprises at least one of a deletion instruction and a position moving instruction.
In a second aspect, an embodiment of the present invention further provides a terminal, where the terminal includes:
a first display module for displaying a first object, a second object, and a first target object located between the first object and the second object;
a first receiving module, configured to receive a first touch operation for the first object and the second object;
and the first processing module is used for hiding the first target object in response to the first touch operation.
Optionally, the first processing module specifically includes:
the first control submodule is used for controlling, in response to the first touch operation, the first object and the second object to move toward each other;
a first processing sub-module, configured to hide the first target object when a first distance between the first object and the second object is smaller than or equal to a first threshold.
Optionally, the first processing sub-module specifically includes:
a first processing unit, configured to hide the first target object when a first distance between the first object and the second object is smaller than or equal to a first threshold and at least one of the first object and the second object is no longer touched.
Optionally, the terminal further includes:
the first output module is used for outputting first prompt information when a first distance between the first object and the second object is smaller than or equal to the first threshold in the process of controlling the first object and the second object to move oppositely in response to the first touch operation.
Optionally, the terminal further includes:
the second display module is used for displaying the third object and the fourth object;
a second receiving module, configured to receive a second touch operation for the third object and the fourth object;
the second processing module is used for responding to the second touch operation and displaying a hidden second target object between the third object and the fourth object;
wherein the second target object is located between the third object and the fourth object before being hidden.
Optionally, the second processing module includes:
the second control submodule is used for controlling, in response to the second touch operation, the third object and the fourth object to move away from each other;
a second processing sub-module for displaying the second target object between the third object and the fourth object if a second distance between the third object and the fourth object is greater than or equal to a second threshold.
Optionally, the second processing sub-module specifically includes:
a second processing unit, configured to display the second target object between the third object and the fourth object if a second distance between the third object and the fourth object is greater than or equal to a second threshold and at least one of the third object and the fourth object is no longer touched.
Optionally, the terminal further includes:
and the second output module is used for outputting second prompt information when, in the process of controlling the third object and the fourth object to move away from each other in response to the second touch operation, a second distance between the third object and the fourth object is greater than or equal to the second threshold.
Optionally, the terminal further includes:
a third output module, configured to output third prompt information corresponding to the first target object when a preset operation instruction for the first object and/or the second object is detected after the first target object is hidden in response to the first touch operation;
the preset operation instruction comprises at least one of a deletion instruction and a position moving instruction.
In a third aspect, an embodiment of the present invention further provides a terminal, including a processor, a memory, and a computer program stored on the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the interface interaction method described above.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the interface interaction method described above are implemented.
In the embodiment of the invention, a first object, a second object and a first target object located between them are displayed; a first touch operation directed at the first object and the second object is received; and the first target object is hidden in response to the first touch operation. Hiding a note therefore requires neither long-pressing the note nor clicking a button: it is done through a preset two-finger gesture, which is simple and convenient, operates on notes in a more covert and friendly manner, and gives the user a safer and more reliable experience. In addition, the scheme can operate on multiple notes at once, which is fast, convenient and more engaging. Moreover, the hiding process is completed on a single interface, with no need to switch back and forth among multiple interfaces, which saves operation overhead.
Drawings
FIG. 1 is a schematic diagram illustrating a note hiding process in the prior art;
FIG. 2 is a schematic diagram illustrating a hidden note viewing process in the prior art;
FIG. 3 is a schematic diagram illustrating a hidden note recovery process in the prior art;
FIG. 4 is a flowchart illustrating an interface interaction method according to an embodiment of the present invention;
FIG. 5 is a first diagram illustrating a hidden note based on two-finger operation according to an embodiment of the present invention;
FIG. 6 is a second diagram of a hidden note based on two-finger operation according to an embodiment of the present invention;
FIG. 7 is a schematic diagram illustrating a process of hiding notes based on two-finger operation according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a two-finger operation-based hidden note viewing in accordance with an embodiment of the present invention;
FIG. 9 is a flow chart illustrating a process for viewing a hidden note based on two-finger operation according to an embodiment of the present invention;
FIG. 10 is a schematic diagram illustrating recovery of a hidden note based on two-finger operation according to an embodiment of the present invention;
FIG. 11 is a flowchart illustrating a process of restoring a hidden note based on two-finger operation according to an embodiment of the present invention;
fig. 12 is a first schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 13 is a schematic diagram of a terminal structure according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
To address the problems in the prior art that the note hiding scheme is cumbersome to operate and not covert enough, the invention provides an interface interaction method applied to a terminal. As shown in fig. 4, the method comprises the following steps:
step 41: displaying a first object, a second object, and a first target object located between the first object and the second object.
The object may be a display icon of a note, a picture, text, a folder, a video, or the like, but is not limited thereto.
Specifically, an object list including a first object, a second object, and a first target object may be displayed on the first interface.
Step 42: receiving a first touch operation for the first object and the second object.
The first touch operation may be formed based on a two-finger operation: specifically, it may be formed based on a selection operation that selects the first object and the second object, for example, a sliding operation following a two-finger tap (the tap being the selection operation), but the invention is not limited thereto.
Step 43: and hiding the first target object in response to the first touch operation.
To further protect the user's privacy, the first target object may be encrypted. Specifically, after the first touch operation is received and before the first target object is hidden, an encryption interface may be displayed so that the user selects an encryption mode and inputs a corresponding password (such as a numeric password or fingerprint information). Alternatively, a default encryption mode and a preset password may be used, with encryption performed while the first target object is being hidden. Alternatively, while responding to the first touch operation, identity information of the operating subject may be collected from the input itself (for example, the fingerprint of the finger performing the first touch operation) and used for encryption. The invention is not limited in this respect.
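As an illustration only (the patent does not specify an implementation), the password-based variant above can be modeled as follows. The class name `HiddenNoteVault`, the method names, and the use of PBKDF2 are all assumptions for the sketch; only the idea of requiring a credential check before a hidden note is revealed comes from the description.

```python
import hashlib
import hmac
import os


class HiddenNoteVault:
    """Toy model: when a note is hidden, a password verifier is stored with it;
    revealing the note succeeds only if the supplied password matches."""

    def __init__(self):
        self._hidden = {}  # note_id -> (salt, verifier, content)

    @staticmethod
    def _digest(password, salt):
        # PBKDF2 so the stored verifier is not a plain hash of the password
        return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

    def hide(self, note_id, content, password):
        salt = os.urandom(16)
        self._hidden[note_id] = (salt, self._digest(password, salt), content)

    def reveal(self, note_id, password):
        salt, verifier, content = self._hidden[note_id]
        # constant-time comparison to avoid leaking match length
        if hmac.compare_digest(self._digest(password, salt), verifier):
            return content
        return None
```

A fingerprint-based variant would replace the password digest with a call to the platform's biometric API; the control flow (verify, then reveal) stays the same.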
The interface interaction method provided by the embodiment of the invention displays a first object, a second object and a first target object located between them; receives a first touch operation directed at the first object and the second object; and hides the first target object in response to the first touch operation. Hiding a note therefore requires neither long-pressing the note nor clicking a button: it is done through a preset two-finger gesture, which is simple and convenient, operates on notes in a more covert and friendly manner, and gives the user a safer and more reliable experience. In addition, the scheme can operate on multiple notes at once, which is fast, convenient and more engaging; and the hiding process is completed on a single interface, with no need to switch back and forth among multiple interfaces, which saves operation overhead.
Wherein hiding the first target object in response to the first touch operation specifically includes: in response to the first touch operation, controlling the first object and the second object to move towards each other (in a direction of approaching each other); hiding the first target object if a first distance between the first object and the second object is less than or equal to a first threshold.
Specifically, the two fingers may press the first object and the second object and then slide toward each other. The first distance may specifically be the distance between the first real-time position of the first object after the movement and the second real-time position of the second object after the movement.
The distance between the real-time positions of two objects may specifically be the distance between the centers of the objects, or the distance between the corresponding (adjacent) edges of the two objects, for example, the distance between the lower edge of note 1 and the upper edge of note 4 in fig. 5. The first threshold may be 0 cm.
Specifically, hiding the first target object when the first distance between the first object and the second object is less than or equal to a first threshold specifically includes: hiding the first target object when a first distance between the first object and the second object is less than or equal to a first threshold and at least one of the first object and the second object is no longer touched (no longer selected).
This can prevent the target object from being hidden by a user's misoperation.
Further, in the process of controlling the first object and the second object to move towards each other in response to the first touch operation, the method further includes: when a first distance between the first object and the second object is smaller than or equal to the first threshold, outputting first prompt information.
Specifically: when the first distance is less than or equal to the first threshold, the user is prompted that the first target object will be hidden if the first object and the second object are no longer selected.
This provides the user with friendlier interactive feedback and visual impact, and further prevents the target object from being hidden by misoperation. To further improve the user experience, the prompt may be semi-transparent text, but is not limited thereto; voice prompts may also be used.
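The pinch-to-hide flow described so far (move the flanking objects together, prompt once the gap closes, hide only on release) can be sketched as a small state machine. This is a minimal model, not the patented implementation: the class and method names are invented, and the 0 cm threshold is taken from the example value given above.

```python
FIRST_THRESHOLD_CM = 0.0  # example first threshold from the embodiment above


class PinchToHide:
    """Tracks a two-finger pinch over two objects that flank a target object."""

    def __init__(self, gap_cm, threshold_cm=FIRST_THRESHOLD_CM):
        self.gap_cm = gap_cm              # current edge-to-edge distance
        self.threshold_cm = threshold_cm
        self.prompt_shown = False
        self.target_hidden = False

    def on_move(self, new_gap_cm):
        # Fingers drag the two objects toward each other; once the gap is
        # within the first threshold, output the first prompt information.
        self.gap_cm = new_gap_cm
        if self.gap_cm <= self.threshold_cm and not self.prompt_shown:
            self.prompt_shown = True
            return "release to hide the note"
        return None

    def on_release(self):
        # At least one object is no longer touched: hide the target only if
        # the gap is still within the threshold, guarding against misoperation.
        if self.gap_cm <= self.threshold_cm:
            self.target_hidden = True
        return self.target_hidden
```

Releasing before the gap closes leaves the target visible, which is the anti-misoperation behavior the description calls for.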
In this embodiment of the present invention, after the first target object is hidden, the method further comprises: displaying a fifth object and a sixth object; receiving a third touch operation (a preset display instruction) directed at the fifth object and the sixth object; and, in response to the third touch operation, displaying the hidden third target object on a hidden-display interface (an interface different from the one displaying the first object and the second object). Before being hidden, the third target object was located between the fifth object and the sixth object, that is, its display position lay between the display positions of the fifth object and the sixth object, and these objects may all be displayed on one interface.
This facilitates viewing hidden objects. The third touch operation may be formed based on a two-finger operation: specifically, it may be formed based on a selection operation that selects the fifth object and the sixth object; for example, the selection operation is a two-finger tap, and the third touch operation is a continued slide toward the right or left side of the terminal screen following the tap, but the invention is not limited thereto.
The encryption operation described above also requires corresponding decryption. Specifically, a decryption interface may be displayed before the third target object is displayed, so that the user inputs a password (such as a numeric password or fingerprint information) to decrypt. Alternatively, while responding to the third touch operation, identity information of the operating subject may be collected from the input itself (for example, the fingerprint of the finger performing the third touch operation) and verified; if the verification passes, the third target object is displayed, otherwise it is not. The invention is not limited in this respect.
Further, after the third target object is displayed on the hidden-display interface in response to the third touch operation, the method further comprises: acquiring a fourth touch operation (a hiding instruction) on the hidden-display interface; and hiding the hidden-display interface in response to the fourth touch operation.
This facilitates re-hiding the third target object after it has been viewed. The fourth touch operation may be a continued slide toward the left or right side of the terminal screen following a two-finger tap in a blank area of the hidden-display interface; for memorability, this slide may be opposite in direction to the slide that forms the third touch operation, but the invention is not limited thereto.
When the hidden-display interface is hidden, an encryption operation may also be performed on the objects in it; for details, refer to the encryption of the first target object described above, which is not repeated here.
In the embodiment of the present invention, the method further includes: displaying the third object and the fourth object; receiving a second touch operation for the third object and the fourth object; displaying a second target object which is hidden between the third object and the fourth object in response to the second touch operation; wherein the second target object is located between the third object and the fourth object before being hidden.
The second touch operation may be formed based on a two-finger operation: specifically, it may be formed based on a selection operation that selects the third object and the fourth object, for example, a sliding operation following a two-finger tap (the tap being the selection operation), but the invention is not limited thereto.
This can facilitate the restoration of the hidden target object.
The encryption operation also requires corresponding decryption here. Specifically, after the second touch operation is received and before the hidden second target object is restored, a decryption interface may be displayed so that the user inputs a password (such as a numeric password or fingerprint information) to decrypt. Alternatively, while responding to the second touch operation, identity information of the operating subject may be collected from the input itself (for example, the fingerprint of the finger performing the second touch operation) and verified; if the verification passes, the second target object is restored, otherwise it is not. The invention is not limited in this respect.
Wherein the displaying the hidden second target object between the third object and the fourth object in response to the second touch operation comprises: in response to the second touch operation, controlling the third object and the fourth object to move away from each other; and displaying the second target object between the third object and the fourth object if a second distance between the third object and the fourth object is greater than or equal to a second threshold.
Specifically, the two fingers may press the third object and the fourth object and then slide away from each other. The second distance may specifically be the distance between the third real-time position of the third object after the movement and the fourth real-time position of the fourth object after the movement. The second threshold may be 5 cm.
This prevents restoration of the hidden object due to a false touch.
Specifically, the displaying the second target object between the third object and the fourth object when the second distance between the third object and the fourth object is greater than or equal to a second threshold specifically includes: displaying the second target object between the third object and the fourth object if a second distance between the third object and the fourth object is greater than or equal to a second threshold and at least one of the third object and the fourth object is no longer touched (no longer selected).
This can prevent the target object hidden by the user from being restored and displayed by the misoperation.
Further, in the process of controlling the third object and the fourth object to move away from each other in response to the second touch operation, the method further includes: outputting second prompt information when a second distance between the third object and the fourth object is greater than or equal to the second threshold.
The method specifically comprises the following steps: and when the second distance is greater than or equal to a second threshold and the second target object exists, prompting a user to restore and display the second target object (specifically, the second target object may be displayed at a position before being hidden) if the third object and the fourth object are no longer selected.
Therefore, more friendly interactive experience and visual impact can be provided for the user, and the hidden target object is further prevented from being restored and displayed due to misoperation of the user. In order to further improve the user experience, the prompt may be prompt text information with transparency, but not limited to this, and may be a prompt method such as voice.
Further, after hiding the first target object in response to the first touch operation, the method further includes: outputting third prompt information corresponding to the first target object when a preset operation instruction aiming at the first object and/or the second object is detected;
the preset operation instruction comprises at least one of a deletion instruction and a position moving instruction.
Specifically, the third prompt message is used to prompt the user that a hidden first target object exists.
In this way, a situation in which the hidden target object can no longer be restored and displayed because the first object and/or the second object has been moved, deleted, or the like can be avoided.
In the following, the interface interaction method provided by the embodiment of the present invention is further described with an example in which the objects are notes and the related touch operations are two-finger operations.
In view of the above technical problems, embodiments of the present invention provide an interface interaction method; in particular, it may be a method for hiding, displaying, and restoring notes based on two-finger operations, which handles notes in a more private and friendly manner, thereby providing a safer and more reliable experience for users.
According to the scheme, the user no longer needs to long-press a note and click a button to hide or restore it; notes are hidden and restored through the configured two-finger gesture operations together with the display of related text prompts, and multiple notes can be operated at one time, which improves the privacy of the notes and the user experience, and further raises the user's perceived value.
Specifically, the method for hiding, displaying and restoring notes based on two-finger operations provided by the embodiment of the invention removes the flow in which the user long-presses a note and clicks the "make private"/"cancel private" icon, bringing a friendlier visual interaction experience; meanwhile, a plurality of notes can be hidden at one time, which is convenient and fast.
The flow of hiding notes based on a two-finger operation is shown in fig. 5 and fig. 6, and mainly includes:
(1) opening the note software;
(2) two fingers, one on each of two notes, slide toward each other, and a text prompt is given at the same time: "after releasing the fingers, all notes between the two notes will be hidden";
(3) the notes are hidden, and the hidden notes cannot be seen on the current interface.
It should be noted that: (1) when the two fingers slide toward each other at the same time and the two notes collide (the first distance is 0), the text prompt is given, achieving the purpose of privacy; (2) if, after the two fingers have begun sliding toward each other (before the notes collide), the user no longer wants to hide the notes, the user only needs to return the two fingers to their original positions; (3) as shown in fig. 5, when operating on notes 1 and 4, only notes 2 and 3 are hidden, and notes 1 and 4 are not hidden; as shown in fig. 6, when operating on notes 2 and 4, only note 3 is hidden, and notes 1, 2, 4, and 5 are not hidden.
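As the fig. 5/6 examples above illustrate, only the notes strictly between the two pressed notes are hidden, while the pressed notes themselves remain visible. A minimal sketch of that selection, assuming notes are kept in a simple ordered list (the names are illustrative, not part of the embodiment):

```python
def hide_between(notes, pressed_a, pressed_b, hidden_store):
    """Hide every note strictly between the two pressed notes.

    notes: ordered list of note ids currently displayed
    pressed_a, pressed_b: the two note ids under the user's fingers
    hidden_store: list receiving the hidden notes, in display order
    Returns the new visible list.
    """
    i, j = sorted((notes.index(pressed_a), notes.index(pressed_b)))
    hidden_store.extend(notes[i + 1:j])  # e.g. pressing notes 1 and 4 hides 2 and 3
    return notes[:i + 1] + notes[j:]     # the pressed notes themselves stay visible
```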
Specifically, the process for hiding the note based on the two-finger operation may be as shown in fig. 7, and includes:
step 71: starting;
step 72: opening the note software;
step 73: identifying that two fingers of the user are respectively positioned on two notes;
step 74: judging whether the two fingers slide toward each other at the same time and the two notes collide; if so, go to step 76; if not, go to step 75;
step 75: judging whether the user's two fingers have returned to their original positions; if so, go to step 79; if not, return to step 74;
step 76: providing a text prompt;
step 77: judging whether the user's two fingers are released (lifted off the notes); if so, go to step 78; otherwise, return to step 75;
step 78: hiding all notes between the two notes;
step 79: end.
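The fig. 7 decision loop above can be sketched as a small event-driven handler; the event names and the stream-of-events structure are assumptions made for illustration:

```python
def hide_flow(events, notes, pressed_a, pressed_b):
    """Drive a fig.-7-style hide flow from a stream of gesture events.

    events: iterable of strings, e.g. "collide", "return", "release"
    Returns ("hidden", notes_between) if the gesture completes,
    or ("cancelled", []) if the fingers return to their positions first.
    """
    collided = False
    for event in events:
        if event == "collide":      # fingers slid together until the notes met
            collided = True         # (this is where the text prompt is shown)
        elif event == "return":     # fingers moved back to their original positions
            return ("cancelled", [])
        elif event == "release" and collided:  # fingers lifted after collision
            i, j = sorted((notes.index(pressed_a), notes.index(pressed_b)))
            return ("hidden", notes[i + 1:j])
    return ("cancelled", [])
```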
The process of viewing hidden notes based on two-finger operation is shown in fig. 8, and mainly includes:
(1) opening the note software;
(2) the two fingers are placed on the left sides of two notes at the same time and slide to the right at the same time; it is judged whether a hidden note exists between the two notes; if so, the hidden notes are gradually exposed as the fingers slide to the right (the page shows the two interfaces pushing each other); if not, nothing happens. If the user wants to return, the user only needs to slide two fingers to the left on the hidden-note interface.
It should be noted that: if, after the two fingers have begun to slide, the user no longer wants to display the hidden-note page, the user only needs to slide the two fingers to the left.
Specifically, the process of viewing hidden notes based on two-finger operation may be as shown in fig. 9, including:
step 91: starting;
step 92: opening the note software;
step 93: identifying that two fingers of the user are respectively positioned on two notes;
step 94: judging whether the two fingers slide to the right at the same time; if so, go to step 96; if not, go to step 95;
step 95: displaying the current note interface, and going to step 911;
step 96: judging whether a hidden note (namely the third target object) exists between the two notes selected by the two fingers; if so, go to step 97; if not, go to step 911;
step 97: gradually displaying the hidden notes;
step 98: judging whether the two fingers slide to the left at the same time; if so, go to step 99; if not, go to step 910;
step 99: the hidden notes gradually disappear, and go to step 911;
step 910: displaying the hidden-note interface;
step 911: end.
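The gradual reveal of steps 97 and 99 can be sketched as a simple interpolation of the hidden page's exposure against the horizontal slide distance; the full-reveal distance and the signed-displacement parameter are illustrative assumptions:

```python
def reveal_fraction(slide_dx, full_reveal_cm=4.0):
    """Map a horizontal two-finger slide to how much of the hidden page is exposed.

    slide_dx: signed horizontal displacement in cm (positive = rightward slide)
    Returns a value in [0, 1]: 0 = hidden page not shown, 1 = fully shown.
    Sliding back to the left (decreasing dx) makes the hidden page disappear again.
    """
    return max(0.0, min(1.0, slide_dx / full_reveal_cm))
```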
Fig. 10 shows a process of restoring a hidden note based on two-finger operation, which mainly includes:
(1) opening the note software;
(2) two fingers, one on each of two notes, slide in opposite directions at the same time (to a certain distance, for example 5 cm); if a hidden note exists between the two notes, a text prompt is given at the same time: "after releasing the fingers, the hidden notes between the two notes will be restored"; if there is no hidden note, no text prompt is given and no operation occurs.
(3) The notes are restored, and the restored notes are visible on the interface.
It should be noted that: (1) the corresponding judgment is made only when the two fingers slide in opposite directions at the same time and reach a certain distance (for example, 5 cm), which prevents accidental touches by the user.
(2) If, after the two fingers have begun sliding in opposite directions, the user no longer wants to restore the hidden notes, the user only needs to return the two fingers to their original positions.
Specifically, the process for restoring the hidden note based on the two-finger operation may be as shown in fig. 11, and includes:
step 111: starting;
step 112: opening the note software;
step 113: identifying that two fingers of the user are respectively positioned on two notes;
step 114: judging whether the two fingers slide away from each other at the same time to a certain distance; if so, go to step 116; if not, go to step 115;
step 115: judging whether the user's two fingers have returned to their original positions; if so, go to step 1110; if not, return to step 114;
step 116: judging whether a hidden note (namely the second target object) exists between the two selected notes; if so, go to step 117; if not, go to step 1110;
step 117: providing a text prompt;
step 118: judging whether the user's two fingers are released (lifted off the notes); if so, go to step 119; otherwise, return to step 115;
step 119: the hidden notes are restored;
step 1110: end.
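Mirroring the hide flow, the fig. 11 restore loop can be sketched as follows; the 5 cm threshold and the event encoding are illustrative assumptions:

```python
def restore_flow(events, hidden_notes, threshold_cm=5.0):
    """Drive a fig.-11-style restore flow from a stream of gesture events.

    events: iterable of ("moved", distance_cm), ("return",) or ("release",)
    hidden_notes: notes hidden between the two pressed notes (may be empty)
    Returns the list of restored notes, or [] if the gesture is cancelled
    or there is nothing hidden between the two selected notes.
    """
    far_enough = False
    for event in events:
        if event[0] == "moved":
            # the judgment is made only once the fingers are >= threshold apart
            far_enough = event[1] >= threshold_cm and bool(hidden_notes)
        elif event[0] == "return":
            return []                  # fingers back at the start: cancelled
        elif event[0] == "release" and far_enough:
            return list(hidden_notes)  # fingers lifted far enough apart: restore
    return []
```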
Therefore, the scheme provided by the embodiment of the invention can hide, display and restore notes with greater privacy, and, combined with the corresponding (semi-transparent) text prompts, gives the user a more comfortable feeling, is convenient to operate, and is more engaging, providing the user with a friendlier interactive experience and stronger visual impact. The original way of operating notes by long-press and click feels stiff and mechanical, gives the user little sense of novelty, and is not very engaging. Furthermore, in this scheme, hiding or restoring notes can be completed on a single interface without switching back and forth between multiple interfaces, which saves overhead. In addition, hiding or restoring notes in this scheme can operate on multiple notes simultaneously, which is more convenient and faster than operating on only a single note at a time.
It should be noted that the solution provided in the embodiment of the present invention is not limited to changing note-related states through two-finger operations; it may also be applied to operations such as deleting, moving, encrypting, decrypting, and modifying information of files, pictures, and videos, on which the two-finger operation can likewise perform the related operations very conveniently and quickly.
An embodiment of the present invention further provides a terminal, as shown in fig. 12, including:
a first display module 121 for displaying a first object, a second object, and a first target object located between the first object and the second object;
a first receiving module 122, configured to receive a first touch operation for the first object and the second object;
the first processing module 123 is configured to hide the first target object in response to the first touch operation.
The terminal provided by the embodiment of the present invention displays, through the first display module 121, a first object, a second object, and a first target object located between the first object and the second object; the first receiving module 122 receives a first touch operation for the first object and the second object; and the first processing module 123 hides the first target object in response to the first touch operation. Hiding a note no longer requires long-pressing it and clicking a button; it is hidden through the configured two-finger gesture operation (which is simple and convenient), so that notes are handled in a more private and friendly manner and the user is given a safer, more reliable experience. In addition, the scheme can operate on multiple notes at once, which is convenient, fast, and more engaging; moreover, the hiding process is completed on the same interface without switching back and forth between multiple interfaces, which saves overhead.
The first processing module specifically includes: a first control submodule, configured to control the first object and the second object to move toward each other in response to the first touch operation; and a first processing sub-module, configured to hide the first target object when a first distance between the first object and the second object is smaller than or equal to a first threshold.
Specifically, the first processing sub-module specifically includes: a first processing unit, configured to hide the first target object when a first distance between the first object and the second object is smaller than or equal to a first threshold and at least one of the first object and the second object is no longer touched.
Further, the terminal further includes: a first output module, configured to output first prompt information when the first distance between the first object and the second object is smaller than or equal to the first threshold in the process of controlling the first object and the second object to move toward each other in response to the first touch operation.
In the embodiment of the present invention, the terminal further includes: the second display module is used for displaying the third object and the fourth object; a second receiving module, configured to receive a second touch operation for the third object and the fourth object; the second processing module is used for responding to the second touch operation and displaying a hidden second target object between the third object and the fourth object; wherein the second target object is located between the third object and the fourth object before being hidden.
Wherein the second processing module comprises: a second control submodule, configured to control the third object and the fourth object to move apart in response to the second touch operation; and a second processing sub-module, configured to display the second target object between the third object and the fourth object if a second distance between the third object and the fourth object is greater than or equal to a second threshold.
Specifically, the second processing sub-module specifically includes: a second processing unit, configured to display the second target object between the third object and the fourth object if a second distance between the third object and the fourth object is greater than or equal to a second threshold and at least one of the third object and the fourth object is no longer touched.
Further, the terminal further includes: a second output module, configured to output second prompt information when the second distance between the third object and the fourth object is greater than or equal to the second threshold in the process of controlling the third object and the fourth object to move apart in response to the second touch operation.
Further, the terminal further includes: a third output module, configured to output third prompt information corresponding to the first target object when a preset operation instruction for the first object and/or the second object is detected after the first target object is hidden in response to the first touch operation; the preset operation instruction comprises at least one of a deletion instruction and a position moving instruction.
The terminal provided by the embodiment of the present invention can implement each process implemented by the terminal in the method embodiments of fig. 1 to fig. 11, and is not described herein again to avoid repetition.
Fig. 13 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention, where the terminal 130 includes, but is not limited to: radio frequency unit 131, network module 132, audio output unit 133, input unit 134, sensor 135, display unit 136, user input unit 137, interface unit 138, memory 139, processor 1310, and power supply 1311. Those skilled in the art will appreciate that the terminal configuration shown in fig. 13 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The display unit 136 is configured to display a first object, a second object, and a first target object located between the first object and the second object; a user input unit 137 for receiving a first touch operation for the first object and the second object; a processor 1310 configured to hide the first target object in response to the first touch operation.
In the embodiment of the present invention, the display unit 136 displays a first object, a second object, and a first target object located between the first object and the second object; the user input unit 137 receives a first touch operation for the first object and the second object; and the processor 1310 hides the first target object in response to the first touch operation. Hiding a note no longer requires long-pressing it and clicking a button; it is hidden through the configured two-finger gesture operation (which is simple and convenient), so that notes are handled in a more private and friendly manner and the user is given a safer, more reliable experience. In addition, the scheme can operate on multiple notes at once, which is convenient, fast, and more engaging; moreover, the hiding process is completed on the same interface without switching back and forth between multiple interfaces, which saves overhead.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 131 may be used for receiving and transmitting signals during message transmission or a call; specifically, it receives downlink data from a base station and forwards it to the processor 1310 for processing, and transmits uplink data to the base station. Generally, the radio frequency unit 131 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 131 can also communicate with a network and other devices through a wireless communication system.
The terminal provides the user with wireless broadband internet access via the network module 132, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 133 may convert audio data received by the radio frequency unit 131 or the network module 132 or stored in the memory 139 into an audio signal and output as sound. Also, the audio output unit 133 may also provide audio output related to a specific function performed by the terminal 130 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 133 includes a speaker, a buzzer, a receiver, and the like.
The input unit 134 is used to receive audio or video signals. The input unit 134 may include a Graphics Processing Unit (GPU) 1341 and a microphone 1342; the graphics processor 1341 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 136. The image frames processed by the graphics processor 1341 may be stored in the memory 139 (or other storage medium) or transmitted via the radio frequency unit 131 or the network module 132. The microphone 1342 may receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 131.
The terminal 130 also includes at least one sensor 135, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 1361 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 1361 and/or the backlight when the terminal 130 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 135 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 136 is used to display information input by a user or information provided to the user. The Display unit 136 may include a Display panel 1361, and the Display panel 1361 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 137 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 137 includes a touch panel 1371 and other input devices 1372. The touch panel 1371, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near the touch panel 1371 using a finger, a stylus, or any other suitable object or attachment). The touch panel 1371 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1310, and receives and executes commands sent from the processor 1310. In addition, the touch panel 1371 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1371, the user input unit 137 may include other input devices 1372. Specifically, the other input devices 1372 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 1371 can be overlaid on the display panel 1361; when the touch panel 1371 detects a touch operation on or near it, the touch operation is transmitted to the processor 1310 to determine the type of the touch event, and the processor 1310 then provides a corresponding visual output on the display panel 1361 according to the type of the touch event. Although in fig. 13 the touch panel 1371 and the display panel 1361 are implemented as two independent components to implement the input and output functions of the terminal, in some embodiments the touch panel 1371 and the display panel 1361 may be integrated to implement the input and output functions of the terminal, which is not limited herein.
The interface unit 138 is an interface through which an external device is connected to the terminal 130. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 138 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal 130 or may be used to transmit data between the terminal 130 and the external device.
The memory 139 may be used to store software programs as well as various data. The memory 139 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 139 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 1310 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 139 and calling data stored in the memory 139, thereby performing overall monitoring of the terminal. Processor 1310 may include one or more processing units; preferably, the processor 1310 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1310.
The terminal 130 may further include a power supply 1311 (e.g., a battery) for supplying power to various components, and preferably, the power supply 1311 may be logically connected to the processor 1310 through a power management system, so that functions of managing charging, discharging, and power consumption are implemented through the power management system.
In addition, the terminal 130 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a terminal, including a processor 1310, a memory 139, and a computer program stored in the memory 139 and capable of running on the processor 1310, where the computer program is executed by the processor 1310 to implement each process of the above-mentioned interface interaction method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the interface interaction method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (16)

1. An interface interaction method is applied to a terminal, and is characterized in that the method comprises the following steps:
displaying a first object, a second object, and a first target object located between the first object and the second object;
receiving a first touch operation for the first object and the second object;
hiding the first target object in response to the first touch operation;
after the hiding the first target object, the method further comprises:
displaying a fifth object and a sixth object; receiving a third touch operation for the fifth object and the sixth object;
responding to the third touch operation, and displaying a third hidden target object on a hidden display interface;
wherein the third target object is located between the fifth object and the sixth object before being hidden, the hidden display interface being different from an interface displaying the first object and the second object;
after the displaying, in response to the third touch operation, the third target object to be hidden on the hidden display interface, the method further includes:
acquiring a fourth touch operation on the hidden display interface;
and hiding the hidden display interface in response to the fourth touch operation.
2. The interface interaction method according to claim 1, wherein hiding the first target object in response to the first touch operation specifically includes:
in response to the first touch operation, controlling the first object and the second object to move toward each other;
hiding the first target object if a first distance between the first object and the second object is less than or equal to a first threshold.
3. The interface interaction method according to claim 2, wherein in controlling the first object and the second object to move toward each other in response to the first touch operation, the method further comprises:
when a first distance between the first object and the second object is smaller than or equal to the first threshold, outputting first prompt information.
4. The interface interaction method of claim 1, further comprising:
displaying a third object and a fourth object;
receiving a second touch operation for the third object and the fourth object;
displaying a second target object which is hidden between the third object and the fourth object in response to the second touch operation;
wherein the second target object is located between the third object and the fourth object before being hidden.
5. The interface interaction method of claim 4, wherein the displaying the hidden second target object between the third object and the fourth object in response to the second touch operation comprises:
in response to the second touch operation, controlling the third object and the fourth object to move away from each other;
displaying the second target object between the third object and the fourth object if a second distance between the third object and the fourth object is greater than or equal to a second threshold.
6. The interface interaction method according to claim 5, wherein, in the process of controlling the third object and the fourth object to move away from each other in response to the second touch operation, the method further comprises:
outputting second prompt information when a second distance between the third object and the fourth object is greater than or equal to the second threshold.
7. The interface interaction method of claim 1, wherein after hiding the first target object in response to the first touch operation, the method further comprises:
outputting third prompt information corresponding to the first target object when a preset operation instruction for the first object and/or the second object is detected;
the preset operation instruction comprises at least one of a deletion instruction and a position moving instruction.
8. A terminal, characterized in that the terminal comprises:
a first display module for displaying a first object, a second object, and a first target object located between the first object and the second object;
a first receiving module, configured to receive a first touch operation for the first object and the second object;
the first processing module is used for hiding the first target object in response to the first touch operation;
the terminal further comprises:
a third display module, configured to display a fifth object and a sixth object after the first target object is hidden;
a third receiving module, configured to receive a third touch operation for the fifth object and the sixth object;
the third processing module is used for responding to the third touch operation and displaying the hidden third target object on a hidden display interface;
wherein the third target object is located between the fifth object and the sixth object before being hidden, the hidden display interface being different from an interface displaying the first object and the second object;
the terminal further comprises:
an obtaining module, configured to obtain a fourth touch operation on the hidden display interface after the hidden third target object is displayed on the hidden display interface in response to the third touch operation;
an interface hiding module, configured to hide the hidden display interface in response to the fourth touch operation.
9. The terminal according to claim 8, wherein the first processing module specifically includes:
a first control submodule, configured to control, in response to the first touch operation, the first object and the second object to move toward each other;
a first processing sub-module, configured to hide the first target object when a first distance between the first object and the second object is smaller than or equal to a first threshold.
10. The terminal of claim 9, wherein the terminal further comprises:
the first output module is used for outputting first prompt information when a first distance between the first object and the second object is smaller than or equal to the first threshold in the process of controlling the first object and the second object to move oppositely in response to the first touch operation.
11. The terminal of claim 8, wherein the terminal further comprises:
the second display module is used for displaying the third object and the fourth object;
a second receiving module, configured to receive a second touch operation for the third object and the fourth object;
the second processing module is used for responding to the second touch operation and displaying a hidden second target object between the third object and the fourth object;
wherein the second target object is located between the third object and the fourth object before being hidden.
12. The terminal of claim 11, wherein the second processing module comprises:
a second control submodule, configured to control, in response to the second touch operation, the third object and the fourth object to move away from each other;
a second processing sub-module for displaying the second target object between the third object and the fourth object if a second distance between the third object and the fourth object is greater than or equal to a second threshold.
13. The terminal of claim 12, wherein the terminal further comprises:
a second output module, configured to output second prompt information when a second distance between the third object and the fourth object is greater than or equal to the second threshold in the process of controlling the third object and the fourth object to move away from each other in response to the second touch operation.
14. The terminal of claim 8, wherein the terminal further comprises:
a third output module, configured to output third prompt information corresponding to the first target object when a preset operation instruction for the first object and/or the second object is detected after the first target object is hidden in response to the first touch operation;
the preset operation instruction comprises at least one of a deletion instruction and a position moving instruction.
15. A terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the interface interaction method according to any one of claims 1 to 7.
16. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the interface interaction method according to any one of claims 1 to 7.
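The distance-threshold logic at the heart of claims 2–3 (pinch two flanking objects together to hide the target between them) and claims 5–6 (spread them apart to reveal it) can be sketched in code. This is a minimal illustration only, assuming a one-dimensional position model; the function names, return shape, and prompt strings are hypothetical and are not part of the claimed method.

```python
def pinch_to_hide(first_x: float, second_x: float, first_threshold: float) -> dict:
    """Claims 2-3: hide the first target object and output the first prompt
    information once the first distance between the first object and the
    second object is less than or equal to the first threshold."""
    if abs(first_x - second_x) <= first_threshold:
        return {"target_visible": False, "prompt": "first prompt information"}
    return {"target_visible": True, "prompt": None}


def spread_to_reveal(third_x: float, fourth_x: float, second_threshold: float) -> dict:
    """Claims 5-6: display the hidden second target object and output the
    second prompt information once the second distance between the third
    object and the fourth object is greater than or equal to the second
    threshold."""
    if abs(third_x - fourth_x) >= second_threshold:
        return {"target_visible": True, "prompt": "second prompt information"}
    return {"target_visible": False, "prompt": None}
```

Note the deliberate asymmetry: hiding triggers at or below the first threshold, revealing at or above the second, so the two gestures can use different sensitivities.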
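Claim 7's safeguard, warning the user before a flanking object is deleted or moved while something is hidden behind it, might look like the following sketch. The registry of hidden pairs, all identifiers, and the prompt text are illustrative assumptions, not drawn from the claims.

```python
# Hypothetical registry mapping a pair of visible objects to the target
# object hidden between them (populated when claim 1's hide step runs).
HIDDEN_PAIRS = {("first_object", "second_object"): "first_target_object"}


def on_preset_instruction(object_id: str, instruction: str):
    """Claim 7: return third prompt information when a deletion or
    position-moving instruction targets an object that conceals a hidden
    target; return None for any other instruction or object."""
    if instruction not in ("delete", "move"):
        return None
    for (left, right), hidden in HIDDEN_PAIRS.items():
        if object_id in (left, right):
            return f"'{hidden}' is hidden behind this object"
    return None
```

The check runs before the delete/move is executed, which is what lets the terminal warn the user that a hidden object would otherwise become unreachable.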
CN201910619914.9A 2019-07-10 2019-07-10 Interface interaction method and terminal Active CN110377219B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910619914.9A CN110377219B (en) 2019-07-10 2019-07-10 Interface interaction method and terminal
PCT/CN2020/099719 WO2021004352A1 (en) 2019-07-10 2020-07-01 Interface interaction method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910619914.9A CN110377219B (en) 2019-07-10 2019-07-10 Interface interaction method and terminal

Publications (2)

Publication Number Publication Date
CN110377219A CN110377219A (en) 2019-10-25
CN110377219B true CN110377219B (en) 2021-04-20

Family

ID=68250886

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910619914.9A Active CN110377219B (en) 2019-07-10 2019-07-10 Interface interaction method and terminal

Country Status (2)

Country Link
CN (1) CN110377219B (en)
WO (1) WO2021004352A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110377219B (en) * 2019-07-10 2021-04-20 维沃移动通信有限公司 Interface interaction method and terminal

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2677414A2 (en) * 2012-06-20 2013-12-25 Samsung Electronics Co., Ltd Information display apparatus and method of user device
CN103927495A (en) * 2014-04-16 2014-07-16 深圳市中兴移动通信有限公司 Method and device for hiding objects
CN104808919A (en) * 2015-04-29 2015-07-29 努比亚技术有限公司 Interface display control method and device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8065603B2 (en) * 2007-04-30 2011-11-22 Google Inc. Hiding portions of display content
CN102455844B (en) * 2010-10-21 2014-12-10 英华达(上海)电子有限公司 Electronic book and control method thereof
US8904304B2 (en) * 2012-06-25 2014-12-02 Barnesandnoble.Com Llc Creation and exposure of embedded secondary content data relevant to a primary content page of an electronic book
CN102880837B (en) * 2012-08-24 2016-05-04 腾讯科技(深圳)有限公司 Improve method and the mobile terminal of security of mobile terminal
CN104463004B (en) * 2013-09-24 2018-08-28 北京三星通信技术研究有限公司 A kind of method and apparatus of protection interface content
CN111984165B (en) * 2013-09-29 2022-07-08 小米科技有限责任公司 Method and device for displaying message and terminal equipment
CN105607732A (en) * 2014-11-20 2016-05-25 阿里巴巴集团控股有限公司 Method and device for showing object information
CN110377219B (en) * 2019-07-10 2021-04-20 维沃移动通信有限公司 Interface interaction method and terminal


Also Published As

Publication number Publication date
WO2021004352A1 (en) 2021-01-14
CN110377219A (en) 2019-10-25

Similar Documents

Publication Publication Date Title
CN111107222B (en) Interface sharing method and electronic equipment
CN108491133B (en) Application program control method and terminal
WO2019141174A1 (en) Unread message processing method and mobile terminal
WO2019174611A1 (en) Application configuration method and mobile terminal
CN109343755B (en) File processing method and terminal equipment
CN108646958B (en) Application program starting method and terminal
CN109032447B (en) Icon processing method and mobile terminal
CN111104029B (en) Shortcut identifier generation method, electronic device and medium
CN109815676B (en) Privacy space operation method and terminal equipment
CN110007822B (en) Interface display method and terminal equipment
CN110618969B (en) Icon display method and electronic equipment
CN107992342B (en) Application configuration changing method and mobile terminal
CN110308834B (en) Setting method of application icon display mode and terminal
CN108153460B (en) Icon hiding method and terminal
CN111459362A (en) Information display method, information display device, electronic apparatus, and storage medium
CN111027107B (en) Object display control method and electronic equipment
CN110941469B (en) Application splitting creation method and terminal equipment thereof
CN107358083B (en) Information processing method, terminal and computer readable storage medium
CN110210206B (en) Authority management method and terminal
CN109740312B (en) Application control method and terminal equipment
CN111522477A (en) Application starting method and electronic equipment
CN111079118A (en) Icon display control method, electronic device and medium
CN111143596A (en) Article searching method and electronic equipment
CN111008179A (en) File management method and electronic equipment
CN111694497B (en) Page combination method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant