CN108958854B - Window display method and device and terminal

Window display method and device and terminal

Info

Publication number
CN108958854B
Authority
CN
China
Prior art keywords
window
attention
windows
received
type
Prior art date
Legal status
Active
Application number
CN201710352997.0A
Other languages
Chinese (zh)
Other versions
CN108958854A (en)
Inventor
玄立永
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201710352997.0A priority Critical patent/CN108958854B/en
Publication of CN108958854A publication Critical patent/CN108958854A/en
Application granted granted Critical
Publication of CN108958854B publication Critical patent/CN108958854B/en

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present invention disclose a window display method, a window display apparatus and a terminal, and belong to the field of computer technology. The method comprises the following steps: obtaining attention information of each of the created windows, wherein the attention information at least comprises the attention time at which the window was last focused; sorting the windows according to the attention information; and displaying the sorted windows, wherein the display priority of a window is negatively correlated with the time interval between its attention time and the current time. In the embodiments of the present invention, windows whose attention time is closer to the current time are displayed preferentially, so that a user can quickly find a recently focused window among the preferentially displayed windows according to their display priority, without traversing every created window, which improves window-finding efficiency.

Description

Window display method and device and terminal
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to a window display method, a window display device and a terminal.
Background
When using a client that supports multiple windows, a user can create multiple windows as needed and switch freely among them. For example, while using a communication client, a user can create several session windows at the same time and switch among them to hold sessions with different users.
Because the client sorts and displays windows in the order in which they were created, a user looking for a recently focused window has to traverse the created windows one by one, matching against that window's name, until the window is found.
However, finding a recently focused window among many windows by traversal takes considerable time, which reduces window-finding efficiency.
Disclosure of Invention
In order to solve the problems in the related art, embodiments of the present invention provide a window display method, apparatus and terminal. The technical solutions are as follows:
according to a first aspect of embodiments of the present invention, there is provided a window display method, including:
obtaining attention information of each of the created windows, wherein the attention information at least comprises the attention time at which the window was last focused;
sorting the windows according to the attention information;
and displaying the sorted windows, wherein the display priority of a window is negatively correlated with the time interval between the attention time and the current time.
Optionally, before obtaining the attention information of each window in the created windows, the method further includes:
detecting whether a current window receives an attention operation, wherein the attention operation comprises setting the window as a focus window and/or performing an operation in the window;
and if the current window receives the attention operation, setting the attention time corresponding to the current window according to the time at which the attention operation is received.
Optionally, the attention information further includes an attention type, where the attention type is used to indicate the type of attention operation received by the window, and the attention type includes set focus attention and/or operation window attention;
after detecting whether the current window receives the attention operation, the method further comprises the following steps:
if the current window receives the attention operation, determining the type of the attention operation received by the current window;
and setting an attention type corresponding to the current window according to the type of the received attention operation.
Optionally, before obtaining the attention information of each window in the created windows, the method further includes:
receiving a window sorting signal, wherein the window sorting signal is used for indicating that windows belonging to a target attention type are sorted;
sorting the windows according to the attention information, comprising:
determining a target attention type corresponding to the window sorting signal;
determining a target window according to the target attention type and the attention type contained in the attention information, wherein the attention type corresponding to the target window is matched with the target attention type;
and sequencing the target windows according to the attention time corresponding to the target windows.
Optionally, the receiving a window sorting signal includes:
determining that the window sorting signal is received when a pressing signal of a predetermined physical key is received;
and/or,
determining that the window sorting signal is received when a predetermined gesture instruction is received;
and/or,
determining that the window sorting signal is received when a predetermined voice instruction is received;
and/or,
determining that the window sorting signal is received when a touch signal of a predetermined virtual key is received.
Optionally, the method further includes:
and when the window is detected to be closed, deleting the information corresponding to the window.
According to a second aspect of embodiments of the present invention, there is provided a window display apparatus including:
the information acquisition module is used for acquiring the attention information of each of the created windows, wherein the attention information at least comprises the attention time at which the window was last focused;
the window sorting module is used for sorting the windows according to the attention information;
and the window display module is used for displaying the sorted windows, wherein the display priority of a window is negatively correlated with the time interval between the attention time and the current time.
Optionally, the apparatus further includes:
the detection module is used for detecting whether the current window receives an attention operation, wherein the attention operation comprises setting the window as a focus window and/or executing operation in the window;
and the first setting module is used for setting the attention time corresponding to the current window according to the time when the attention operation is received when the current window receives the attention operation.
Optionally, the attention information further includes an attention type, where the attention type is used to indicate the type of attention operation received by the window, and the attention type includes set focus attention and/or operation window attention;
the device also comprises:
the determining module is used for determining the type of the attention operation received by the current window when the current window receives the attention operation;
and the second setting module is used for setting the attention type corresponding to the current window according to the type of the received attention operation.
Optionally, the apparatus further includes:
the signal receiving module is used for receiving a window sorting signal, and the window sorting signal is used for indicating that windows belonging to a target attention type are sorted;
a window ordering module comprising:
the first determining unit is used for determining a target attention type corresponding to the window sorting signal;
the second determining unit is used for determining a target window according to the target attention type and the attention type contained in the attention information, and the attention type corresponding to the target window is matched with the target attention type;
and the sequencing unit is used for sequencing the target windows according to the attention time corresponding to the target windows.
Optionally, the signal receiving module includes:
the first signal receiving unit is used for determining a received window sorting signal when a pressing signal of a preset entity key is received;
and/or the presence of a gas in the gas,
the second signal receiving unit is used for determining a received window sorting signal when a preset gesture instruction is received;
and/or the presence of a gas in the gas,
a third signal receiving unit for determining a received window sorting signal when a predetermined voice instruction is received;
and/or the presence of a gas in the gas,
and the fourth signal receiving unit is used for determining the received window sorting signal when receiving the touch signal of the preset virtual key.
Optionally, the apparatus further includes:
and the deleting module is used for deleting the concerned information corresponding to the window when the window is detected to be closed.
According to a third aspect of embodiments of the present invention, there is provided a terminal, including a processor and a memory, where at least one instruction is stored, and the instruction is loaded and executed by the processor to implement the window display method according to the first aspect.
According to a fourth aspect of embodiments of the present invention, there is provided a computer-readable storage medium having at least one instruction stored therein, the instruction being loaded and executed by a processor to implement the window display method according to the first aspect.
In the embodiments of the present invention, when a window sorting signal indicating that the windows are to be sorted is received, the attention information of each created window is obtained, and the windows are sorted according to the attention time, indicated by the attention information, at which each window was last focused; the sorted windows are then displayed, with windows whose attention time is closer to the current time displayed preferentially, so that a user can quickly find the most recently focused window among the preferentially displayed windows according to the display priority of the windows, which avoids the user having to traverse every created window and improves window-finding efficiency.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic diagram illustrating an implementation environment provided by one embodiment of the invention;
FIG. 2 is a diagram illustrating a multi-window display mode in the related art;
FIG. 3A is a flow chart illustrating a method for displaying a window provided by one embodiment of the present invention;
FIG. 3B is a schematic diagram of an implementation of the window display method shown in FIG. 3A;
FIG. 4A is a flow chart illustrating a window display method according to another embodiment of the present invention;
FIG. 4B is a schematic diagram of an interface when a focus setting operation is performed;
FIG. 4C is a diagram illustrating an exemplary focus operation setup procedure according to the window display method shown in FIG. 4A;
FIG. 4D is a diagram illustrating an implementation of a process for setting attention information according to the window display method shown in FIG. 4A;
FIG. 4E is a method flow diagram of a window sorting process in the window display method of FIG. 4A;
fig. 5 is a block diagram showing a configuration of a window display apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Reference herein to "a plurality" means two or more. "And/or" describes the association relationship of the associated objects and means that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
For convenience of understanding, terms referred to in the embodiments of the present invention are explained below.
Attention: in the embodiments of the present invention, attention refers to a process in which a user performs certain specific operations on a window and thereby forms an impression of the window. The specific operations include, but are not limited to: setting the window as the focus window (i.e., putting the window in an operable state), and performing an operation in the window. For example, when the user sets one of a plurality of session windows as the focus window (without performing any operation in that session window), it is determined that the user focuses on the session window; for another example, when the user sets one of the plurality of session windows as the focus window and inputs information in the session window (i.e., performs an operation in the session window), it is determined that the user focuses on the session window. In the embodiments of the present invention, the criterion for judging whether a window is focused is either the default of the client or set by the user.
Attention operation: an operation that indicates that the window receiving it is focused; it is set by default by the client or configured by the user.
Attention time: the time at which the window was focused. When a window being focused is triggered by a specified attention operation, the attention time is the time at which the attention operation was received. For example, when the user sets a session window as the focus window, or when the user inputs information in a session window, the attention time is the time at which the focus window was set or the time at which the information was input. In the embodiments of the present invention, the attention time is updated each time the window is focused.
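To make these terms concrete, the following is a minimal sketch of per-window attention information as a data structure; the class, field, and value names are illustrative assumptions rather than part of the disclosure. Later sketches in this description reuse this structure.

from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional, Set

@dataclass
class AttentionInfo:
    """Attention information kept for one created window (illustrative only)."""
    window_id: str                                    # window ID assigned when the window is created
    attention_time: Optional[datetime] = None         # time the window was last focused; None if never focused
    attention_types: Set[str] = field(default_factory=set)  # e.g. {"set_focus"} or {"set_focus", "operate_window"}

    def record_attention(self, received_at: datetime, attention_type: str) -> None:
        """Update the attention time and attention type when an attention operation is received."""
        self.attention_time = received_at
        self.attention_types.add(attention_type)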
Referring to fig. 1, a schematic diagram of an implementation environment provided by an embodiment of the present invention is shown, where the implementation environment includes a terminal 110 and a server 120.
The terminal 110 is an electronic device having a display function. The electronic device is a smart phone, a tablet computer, a portable personal computer or the like. In each embodiment of the present invention, a client supporting multiple windows is installed in the terminal 110, where the client is a communication client, a video client, a browser client, or a mail client, and correspondingly, multiple windows displayed in the client are a session window, a video playing window, a web page label window, or an email receiving window. The embodiment of the present invention does not limit the specific types of the client and the window.
Optionally, when a plurality of windows are created in the client, each window is displayed in the status bar in the form of a window thumbnail or a window tab. When a user selects a certain window thumbnail or window label in the status bar, the client displays the corresponding window so that the user can operate in the window. In the embodiment of the invention, the client detects whether the window sorting signal is received or not in the running process, and sorts each window according to the time of the last attention of the created window when the window sorting signal is received, and preferentially displays the window which is recently concerned by the user.
The terminal 110 and the server 120 are connected by a wired or wireless network.
The server 120 is a server, a server cluster composed of a plurality of servers, or a cloud computing center. In the embodiment of the present invention, the server 120 is a background server of a client in the terminal 110, and is configured to store various parameters set by the user in a process of using the client.
Optionally, in this embodiment of the present invention, the server 120 is configured to store the attention parameters, such as the attention operation, the attention type, and the window sorting signal triggering manner, set by the user on the client side. The focus operation is used to indicate an operation received when the window is focused, for example, the focus operation is to set the window as a focus window; the focus type is used to indicate a type of a focus operation received when the window is focused, for example, when the window is focused by setting the window as a focus window, the focus type corresponding to the window is set as focus; the window sorting signal triggering mode is an operation mode set by a user and used for triggering the client to sort the windows, for example, the window sorting signal triggering mode is gesture triggering or shortcut key triggering.
After receiving the attention parameter sent by the terminal 110, the server 120 stores the attention parameter in the attention parameter database 121 in association with a client identifier (such as an account used when logging in the client) corresponding to a client in the terminal 110. After the subsequent terminal 110 reinstalls the client, the previously set parameters of interest can be directly obtained from the parameter of interest database 121 of the server 120 without resetting.
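A minimal sketch of this parameter synchronization, assuming a simple in-memory key-value store on the server side; the function names, parameter layout, and example values are assumptions for illustration only.

# Hypothetical server-side attention parameter database: client identifier -> attention parameters.
attention_parameter_database: dict = {}

def store_attention_parameters(client_id: str, parameters: dict) -> None:
    """Store the attention parameters in association with the client identifier."""
    attention_parameter_database[client_id] = parameters

def fetch_attention_parameters(client_id: str) -> dict:
    """Return previously stored parameters so a reinstalled client need not be reconfigured."""
    return attention_parameter_database.get(client_id, {})

# Example: the terminal uploads the parameters the user configured in the client.
store_attention_parameters("account_of_client_in_terminal_110", {
    "attention_operations": ["set_focus", "operate_window"],
    "sort_signal_trigger": "gesture",
})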
Optionally, the wireless or wired networks described above use standard communication techniques and/or protocols. The Network is typically the Internet, but may be any Network including, but not limited to, a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a mobile, wireline or wireless Network, a private Network, or any combination of virtual private networks. In some embodiments, data exchanged over a network is represented using techniques and/or formats including Hypertext Mark-up Language (HTML), Extensible Markup Language (XML), and the like. All or some of the links may also be encrypted using conventional encryption techniques such as Secure Socket Layer (SSL), Transport Layer Security (TLS), Virtual Private Network (VPN), Internet Protocol Security (IPsec). In other embodiments, custom and/or dedicated data communication techniques may also be used in place of, or in addition to, the data communication techniques described above.
For convenience of description, in the embodiments of the present invention, a window display method is used for a client in the terminal 110 as an example, but the present invention is not limited thereto.
In the related art, when a user uses a client supporting multiple windows, multiple windows can be created according to the user's own needs. For example, when a user needs to perform a session with another user during using an instant messaging client, a session window corresponding to the session is created. As shown in fig. 2, session windows 21 to 26 are created in the communication client for conducting sessions with A, B, C, D, E, F, respectively.
In a usage scenario, a user knows that a task needs to be completed according to session content displayed in a target session window, and after the task is completed, finds the target session window according to an identifier of the target session window (for example, a user name of a session user), and reports a task completion condition through the target session window. Obviously, since the windows in the client are sorted according to the order of window creation (for example, the window creation time of the session window 21 to the session window 26 in fig. 2 is from early to late), and the order of window creation is irrelevant to the time when the window is concerned, when the user searches for the window that is concerned recently, the user needs to search in a traversal manner. In the case of a large number of windows, the search efficiency is low. For example, when the user needs to find the conversation window (conversation window 25) with the user E which has been focused recently, it is necessary to traverse the conversation windows 21 to 24 to finally locate the conversation window 25.
In the embodiments of the present invention, the client records the attention time at which each window was last focused, and when a user-triggered window sorting signal is received, it adjusts the display order of the windows according to the order of their attention times, so that the most recently focused windows are displayed preferentially and the user can quickly locate a recently focused window.
The window display method includes three stages: an attention operation setting stage, an attention information setting stage, and a window sorting stage. In the attention operation setting stage, the client receives the attention operation set by the user and the triggering mode of the window sorting signal set by the user; in the attention information setting stage, the client judges whether a window is focused based on the attention operation acquired in the attention operation setting stage, and sets or updates the attention information corresponding to the window when the window is focused; in the window sorting stage, the client sorts the windows in order of the attention time contained in the attention information corresponding to each window, and finally preferentially displays the most recently focused window based on the sorting. The following description uses exemplary embodiments.
Referring to fig. 3A, a flowchart of a window display method according to an embodiment of the present invention is shown, where the window display method is used for a terminal 110 for illustration, and the method includes:
step 301, obtaining the attention information of each window in the created windows, wherein the attention information at least comprises the attention time when the window is focused last time.
In one possible implementation, when the window sorting signal is received, the client acquires the attention information of the windows. The window sorting signal is triggered by a trigger operation preset by the user. Optionally, the user triggers the window sorting signal by pressing a predetermined physical key, making a predetermined gesture, issuing a predetermined voice command, or touching a predetermined virtual key. In other possible implementations, when it is detected that the number of created windows is greater than a threshold, the client acquires the attention information of the windows and automatically sorts and displays the windows based on the attention information, which is not limited in the embodiments of the present invention.
Optionally, the user presets the type of attention operation that is received when a window is focused; when it is detected that the current window receives such an attention operation, the client determines that the current window is focused and sets the attention information corresponding to the current window.
In one possible embodiment, a window is determined to be focused when the user sets the window as the focus window and/or performs an operation in the window. Accordingly, the client determines the time at which the focus window was set, or the time at which the operation was performed in the window, as the attention time of the current window. Illustratively, taking a communication client as an example, the session window is determined to be focused when the user inputs information in the session window, uploads a picture in the session window, or is mentioned by another user in the session window (e.g., via the @ function in Tencent's WeChat client).
Optionally, the attention information of the window further includes an attention type, where the attention type is used to indicate the attention operation received when the window is focused. For example, when a window is focused by being set as the focus window, the attention type corresponding to the window is set focus attention; when a user performs an operation in a window, the attention type corresponding to the window is operation window attention.
Schematically, as shown in fig. 3B, the attention time of each session window acquired by the client is shown in Table 1.
Table 1
Session window:  21     22     23     24     25     26
Attention time:  13:16  13:17  13:10  13:09  13:18  13:14
And step 302, sorting the windows according to the attention information.
After obtaining the attention information of each window, the client ranks the windows according to the attention sequence indicated by the attention time. Optionally, when the acquired attention time of each window is not empty (that is, each created window is concerned), the client sorts the created windows according to the attention time.
In one possible implementation, the client sorts the focused windows from the latest attention time to the earliest. Illustratively, after sorting the windows according to attention time, the client obtains a window sequence containing n windows (all of which are focused windows), where the attention time corresponding to the i-th window is later than the attention time corresponding to the (i+1)-th window, for 1 ≤ i ≤ n-1.
Illustratively, in combination with the attention time shown in table one, the window sequence obtained by the client sorting the windows is as follows: session window 25, session window 22, session window 21, session window 26, session window 23, session window 24.
Optionally, since not all created windows have been focused, for windows that have not been focused (for example, windows whose attention time in the corresponding attention information is empty), the client sorts them according to a preset sorting policy (for example, according to the window creation time) and sets their display priority lower than that of the focused windows (that is, the focused windows are displayed preferentially), which is not limited in the embodiments of the present invention.
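A minimal sketch of this sorting step, reusing the illustrative AttentionInfo structure defined earlier: focused windows are ordered from the latest attention time to the earliest, and never-focused windows are appended afterwards in creation order, which is one possible fallback policy.

from typing import List

def sort_windows(windows: List[AttentionInfo], creation_order: List[str]) -> List[AttentionInfo]:
    """Return the windows ordered so that the most recently focused window comes first."""
    focused = [w for w in windows if w.attention_time is not None]
    unfocused = [w for w in windows if w.attention_time is None]
    # Later attention time -> higher display priority.
    focused.sort(key=lambda w: w.attention_time, reverse=True)
    # Never-focused windows fall back to creation order and are displayed after the focused ones.
    unfocused.sort(key=lambda w: creation_order.index(w.window_id))
    return focused + unfocused

Applied to the attention times in Table 1 (where every window has been focused), this reproduces the window sequence given above.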
And step 303, displaying the sorted windows, wherein the display priority of a window is negatively correlated with the time interval between the attention time and the current time.
That is, the smaller the time interval between the attention time and the current time (the later the attention time), the higher the display priority of the window; the larger the time interval between the attention time and the current time (the earlier the attention time), the lower the display priority of the window.
Regarding the display of the sorted windows, in one possible implementation the client reorders the window thumbnails (or window tabs) in the status bar according to the sorting, so that the thumbnail (or tab) of the most recently focused window is displayed at the head of the thumbnail sequence. The user can then quickly find the most recently focused window according to the display order of the window thumbnails.
Illustratively, as shown in fig. 3B, the client adjusts the display priority of the sorted session windows and displays the window thumbnails corresponding to session window 25, session window 22, session window 21, session window 26, session window 23, and session window 24 in that order.
In summary, in this embodiment, when a window sorting signal indicating that the windows are to be sorted is received, the attention information of each created window is obtained, and the windows are sorted according to the attention time, indicated by the attention information, at which each window was last focused; the sorted windows are then displayed, with windows whose attention time is closer to the current time displayed preferentially, so that a user can quickly find the most recently focused window among the preferentially displayed windows according to the display priority of the windows, which avoids the user having to traverse every created window and improves window-finding efficiency.
In the attention operation setting stage, the user may set, according to his or her own needs, the type of attention operation that counts as focusing a window; accordingly, in the attention information setting stage, the client determines whether a window is focused according to the attention operation set by the user, so as to set or update the attention information of the focused window. This is described below with an exemplary embodiment.
Referring to fig. 4A, a flowchart of a window display method according to another embodiment of the present invention is shown, where the window display method is used for a terminal 110 for illustration, and the method includes:
step 401, obtaining the attention operation prestored in the terminal.
In practical applications, when a user focuses on a certain window, certain specific operations are usually performed on the window. For example, for a conversation window in a messaging client, when a user focuses on a certain conversation window, the conversation window is usually set as a focus window, or information is input in the conversation window. Therefore, in a possible implementation, when the client receives the operation signal for the current window, the client obtains a locally pre-stored attention operation, wherein the attention operation is provided for the user by the client to select, or is set by the user. Illustratively, as shown in fig. 4B, several attention operation options 400a are provided in an attention operation setting interface 400 displayed by the communication client for the user to select, and the attention operation set by the user is determined according to the received selection signal for the attention operation option 400 a.
Optionally, the focusing operation includes setting a window as a focus window, and/or performing an operation in the window. Optionally, the user may further subdivide the type of the operation concerned according to the type of the operation performed in the window (for example, sending text information, a picture, a video, or a voice in the conversation window), which is not limited in the embodiment of the present invention.
In other possible embodiments, in addition to the user actively triggering the attention window, the user's attention to the window may also be passively triggered. For example, when a user is instructed by other users in the session window to focus on the current window (e.g., the other users name the current user in the session window), the client determines that the session window is focused on. The embodiments of the present invention are not limited thereto.
As shown in fig. 4C, after the client running in the terminal 41 obtains the attention operation set by the user, the attention operation is stored in the storage device 42 of the terminal 41, so as to be called when subsequently determining whether the window is focused.
Optionally, after obtaining the attention operation set by the user, the client uploads the attention operation and the client identifier to the server 43 (a background server of the client), so that the server 43 performs associated storage on the attention operation and the client identifier. The subsequent terminal reinstalls the client, or the user can download the preset attention operation from the server 43 according to the client identifier when using the client on other terminals, so as to avoid repeated setting.
Step 402, detecting whether the current window receives attention operation.
In the process of using the client supporting multiple windows, for each created window, the client detects whether the window receives attention operation. In a possible implementation manner, the client sets a trigger for each created window, and when the window receives an operation signal, the trigger corresponding to the window is triggered, so as to trigger the client to detect whether the operation received by the window is an attention operation.
When detecting that the current window receives the attention operation, the client determines that the current window is concerned, and executes the following step 403 to set attention information corresponding to the current window; when the current window is detected not to receive the attention operation, the client determines that the current window is not concerned, and keeps the attention information of the current window.
In one possible embodiment, when the attention operation is to set the current window as the focus window, the client determines that the current window receives the attention operation when detecting that the window is in an operable state (i.e., set as the focus window), or when the user performs an operation in the window (since the window must be set as the focus window before performing an operation in the window).
In another possible implementation, when the attention operation is to perform an operation in a window, the client determines that the current window receives the attention operation only when detecting that a user performs an operation in the window.
Step 403, if the current window receives the attention operation, setting an attention time corresponding to the current window according to the time when the attention operation is received.
When the current window is determined to be concerned, the client sets or updates the concerned information corresponding to the current window.
In a possible implementation manner, when detecting that a current window receives an attention operation, a client further detects whether attention information corresponding to the window is stored, and when the attention information corresponding to the window is stored, the client updates attention time contained in the attention information; and when the concerned information corresponding to the window is not stored, the client creates the concerned information for the window.
Optionally, when each window is created in the client, the client allocates a window ID (identity) to the created window, and the window IDs of the windows are different from each other. The attention information created by the client for the window at least comprises a window ID and attention time; correspondingly, when the client updates the attention information corresponding to the window, at least the attention moment is updated.
Optionally, when the client detects that the current window receives the attention operation, the client sets the attention time corresponding to the current window according to the time when the attention operation is received. For example, when it is determined that the current window receives the attention operation and the time when the attention operation is received is 13:18, the client determines 13:18 as the attention time of the current window, and updates the historical attention time in the attention information.
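A minimal sketch of steps 402 and 403 under the same illustrative data model: when an operation on the current window matches one of the configured attention operations, the client creates the window's attention information if none is stored yet and then updates its attention time; the registry variable, configured-operation set, and helper names are assumptions.

from datetime import datetime

# Hypothetical registry mapping window IDs to their attention information.
attention_registry: dict = {}

# Attention operations configured during the attention operation setting stage (illustrative).
configured_attention_operations = {"set_focus", "operate_window"}

def on_window_operation(window_id: str, operation: str) -> None:
    """Called whenever the current window receives an operation signal (step 402)."""
    if operation not in configured_attention_operations:
        return  # not an attention operation: keep the existing attention information unchanged
    info = attention_registry.get(window_id)
    if info is None:
        # No stored attention information yet: create it for this window.
        info = AttentionInfo(window_id=window_id)
        attention_registry[window_id] = info
    # Step 403: set the attention time to the time the attention operation was received.
    info.record_attention(datetime.now(), operation)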
In step 404, if the current window receives the attention operation, the type of the attention operation received by the current window is determined.
In order to implement the ordering of the window with the specified attention type, in one possible implementation, for different types of attention operations, the attention information generated by the client includes, in addition to the attention time, an attention type corresponding to the window, where the attention type is used to indicate a type of the attention operation received by the window.
Correspondingly, when the client sets (or updates) the attention information corresponding to the current window, the type of the attention operation received by the current window is determined, and therefore the attention type corresponding to the current window is set (or updated) according to the type of the received attention operation.
Step 405, setting an attention type corresponding to the current window according to the type of the received attention operation.
Optionally, when the focus operation received by the window is to set the window as a focus window, the client determines that the focus type corresponding to the window is to set focus; when the attention operation received by the window is to execute the operation in the window, the client determines that the attention type corresponding to the window is the attention of the operation window. In other alternative implementations, the client may further subdivide other types of interest according to the interest operation received by the window, which is not limited in the embodiment of the present invention.
It should be noted that the same attention operation may correspond to multiple attention types at the same time. For example, since a window must be set as the focus window before an operation can be performed in it, the attention operation of performing an operation in a window corresponds to both set focus attention and operation window attention.
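One way to express this note as a sketch: each attention operation maps to the set of attention types it implies, so an in-window operation yields both types; the mapping keys and values are illustrative.

# Illustrative mapping from a received attention operation to the attention types it implies.
OPERATION_TO_ATTENTION_TYPES = {
    "set_focus": {"set_focus"},
    # Performing an operation in a window implies it was first set as the focus window,
    # so the operation counts as both attention types.
    "operate_window": {"set_focus", "operate_window"},
}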
In an actual implementation process, as shown in fig. 4D, after a user performs an operation on a certain window, the client in the terminal 41 detects whether the operation is a focus operation, and updates the focus information of the window in the storage device 42 when the operation is a focus operation.
Illustratively, the attention information stored in the storage device of the terminal is shown in Table 2.
Table 2
(Table 2 is reproduced as an image in the original publication; it lists, for each of session windows 21 to 26, the stored attention time and attention type.)
Because subsequent window sorting only covers windows that still exist, the client deletes the attention information corresponding to a window when it detects that the window is closed.
In step 406, a window sorting signal is received, where the window sorting signal is used to instruct to sort windows belonging to the target attention type.
In a possible implementation manner, a user presets a trigger condition for triggering a client to sort created windows, and when the client detects that the trigger condition is met in the operation process, the client determines that a window sorting signal is received, and then sorts and displays the windows through the following steps 407 to 409.
Wherein the trigger condition includes any one of:
1. when a pressing signal of a predetermined physical key is received, the received window sorting signal is determined.
In one possible implementation, the client receives a predetermined physical key set by the user in advance; the predetermined physical key is a single physical key or a combination of at least two physical keys. When a pressing signal of the predetermined physical key is received, the client determines that the window sorting signal is received.
In order to enable the client to sort windows of a designated attention type according to the user's needs, optionally, the user sets at least two groups of predetermined physical keys, each group corresponding to its own attention type. After a window sorting signal triggered by a group of predetermined physical keys is subsequently received, the client sorts the windows of the designated attention type.
Illustratively, the user sets a key combination of "CTRL + F" for triggering the sorting of windows of a first type of attention (such as setting focus attention) and "CTRL + L" for triggering the sorting of windows of a second type of attention (such as operating window attention).
In other possible embodiments, the user may set only one set of predetermined physical keys and trigger the client to sort windows of different types of interest by setting different numbers of presses. The embodiments of the present invention are not limited thereto.
2. When a preset gesture instruction is received, determining a received window sorting signal;
in a possible implementation manner, a predetermined gesture instruction set by a user is stored in the client, and when the received gesture instruction is matched with the predetermined gesture instruction, the client determines that the window sorting signal is received.
Similar to the first trigger condition, the user may preset different gesture instructions, so that the client is instructed to sequence the windows of different attention types by using the different gesture instructions.
3. When a preset voice command is received, determining a received window sorting signal;
in a possible implementation manner, a predetermined voice instruction set by a user is stored in the client, and when the received voice instruction is matched with the predetermined voice instruction, the client determines that the window sorting signal is received.
Similar to the first trigger condition, the user may preset different voice commands, so that the client is instructed to sequence the windows of different attention types by using different voice commands.
4. And when the touch signal of the preset virtual key is received, determining the received window sorting signal.
In a possible implementation manner, the display interface of the client includes a predetermined virtual key, and when the touch signal is received through the predetermined virtual key, the client determines the received window sorting signal.
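A minimal sketch of how the four trigger conditions above might be mapped to window sorting signals carrying a target attention type; the key combinations follow the examples in this description, while the gesture, voice, and virtual-key entries are assumptions.

from typing import Optional

# Illustrative mapping from a detected trigger to the target attention type of the sorting signal.
SORT_SIGNAL_TRIGGERS = {
    ("physical_key", "CTRL+F"): "set_focus",              # example key combination from this description
    ("physical_key", "CTRL+L"): "operate_window",         # example key combination from this description
    ("gesture", "two_finger_swipe_down"): "set_focus",    # assumed gesture instruction
    ("voice", "sort my windows"): "operate_window",       # assumed voice instruction
    ("virtual_key", "sort_windows_button"): "set_focus",  # assumed on-screen key
}

def on_trigger(kind: str, value: str) -> Optional[str]:
    """Return the target attention type if the trigger corresponds to a window sorting signal."""
    return SORT_SIGNAL_TRIGGERS.get((kind, value))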
Step 407, when receiving the window sorting signal, acquiring the attention information of each window in the created windows.
And step 408, sorting the windows according to the attention information.
As described in step 406 above, the user may trigger different window sorting signals so that the client sorts the windows of the specified type of interest according to the received window sorting signals. In one possible embodiment, as shown in fig. 4E, this step includes the following steps.
Step 408A, determining a target attention type corresponding to the window sorting signal.
Because the window sequencing signals triggered by different triggering modes are different, after the client receives the window sequencing signals, the client firstly determines the target attention type corresponding to the window sequencing signals.
Illustratively, when detecting that the window sorting signal is triggered by the key combination "CTRL + F", the client determines that the target attention type corresponding to the window sorting signal is "set focus attention", and when detecting that the window sorting signal is triggered by the key combination "CTRL + L", the client determines that the target attention type corresponding to the window sorting signal is "operation window attention".
And step 408B, determining a target window according to the target attention type and the attention type contained in the attention information, wherein the attention type corresponding to the target window is matched with the target attention type.
After the target attention type corresponding to the window sequencing signal is determined, the client further obtains the attention type corresponding to each window, and accordingly the window matched with the target attention type is determined to be the target window.
Illustratively, in combination with the contents shown in table two, when the target attention type is set focus attention, the client determines all the session windows 21 to 26 as target windows; when the target attention type is the operation window attention, the client determines the session windows 21, 22, 25, and 26 as target windows.
And step 408C, sequencing the target windows according to the attention time corresponding to the target windows.
Further, the client acquires the attention time corresponding to each target window, and sorts the target windows according to the attention time.
Illustratively, with reference to the contents shown in table two, when the determined target windows are the session windows 21, 22, 25, and 26, the client sorts the target windows to obtain a window sequence as follows: session window 25, session window 22, session window 21, session window 26. For the non-target windows (session windows 23 and 24), the client sets the display priority of the non-target window to be lower than that of the target window, so as to obtain a complete window sequence as follows: session window 25, session window 22, session window 21, session window 26, session window 23, session window 24.
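A minimal sketch of steps 408A to 408C under the illustrative data model used earlier: windows whose attention type matches the target attention type determined from the sorting signal are ordered by attention time, and the remaining windows are appended with lower display priority.

from typing import List

def sort_by_target_type(windows: List[AttentionInfo], target_type: str) -> List[AttentionInfo]:
    """Steps 408B and 408C: pick target windows by attention type, then order them by attention time."""
    targets = [w for w in windows
               if target_type in w.attention_types and w.attention_time is not None]
    others = [w for w in windows if w not in targets]
    targets.sort(key=lambda w: w.attention_time, reverse=True)  # most recently focused first
    return targets + others  # non-target windows get a lower display priority

With the attention information described for Table 2, a target attention type of operation window attention selects session windows 21, 22, 25 and 26 as targets and yields the complete window sequence given above.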
And step 409, displaying the sorted windows, wherein the display priority of a window is negatively correlated with the time interval between the attention time and the current time.
The implementation of step 409 is similar to that of step 303 described above, and the embodiment of the present invention is not described herein again.
In summary, in this embodiment, when a window sorting signal indicating that the windows are to be sorted is received, the attention information of each created window is obtained, and the windows are sorted according to the attention time, indicated by the attention information, at which each window was last focused; the sorted windows are then displayed, with windows whose attention time is closer to the current time displayed preferentially, so that a user can quickly find the most recently focused window among the preferentially displayed windows according to the display priority of the windows, which avoids the user having to traverse every created window and improves window-finding efficiency.
In this embodiment, the attention type corresponding to a window is set according to the type of attention operation received when the window is focused, so that when the windows are subsequently sorted according to a user-triggered window sorting signal, only windows of the designated attention type are sorted as instructed by the user, which makes the window sorting more targeted.
It should be noted that the foregoing embodiments are described only with the window display method applied at the client level (that is, only multiple windows within the same client are sorted). In other possible embodiments, if the window display method is applied at the operating system level, the terminal can sort windows across different clients, which is not limited in the embodiments of the present invention.
The following are embodiments of the apparatus of the present invention, and for details not described in detail in the embodiments of the apparatus, reference may be made to the above-mentioned one-to-one corresponding method embodiments.
Referring to fig. 5, a block diagram of a window display apparatus according to an embodiment of the invention is shown. The apparatus may be implemented as all or part of the terminal 110 in fig. 1 by hardware or a combination of hardware and software. The apparatus includes:
an information obtaining module 510, configured to obtain attention information of each window in the created windows, where the attention information at least includes an attention time when the window is paid attention last time;
a window sorting module 520, configured to sort windows according to the attention information;
and a window display module 530, configured to display the sorted windows, where a display priority of the window is in a negative correlation with a time interval from the attention time to the current time.
Optionally, the apparatus further includes:
the detection module is used for detecting whether the current window receives an attention operation, wherein the attention operation comprises setting the window as a focus window and/or executing operation in the window;
and the first setting module is used for setting the attention time corresponding to the current window according to the time when the attention operation is received when the current window receives the attention operation.
Optionally, the attention information further includes an attention type, where the attention type is used to indicate a type of an attention operation received by the window, and the attention type includes setting a focus attention and/or operating window attention;
the device, still include:
the determining module is used for determining the type of the attention operation received by the current window when the attention operation is received by the current window;
and the second setting module is used for setting the attention type corresponding to the current window according to the received attention operation type.
Optionally, the apparatus further includes:
the signal receiving module is used for receiving a window sorting signal, and the window sorting signal is used for indicating that windows belonging to a target attention type are sorted;
the window sorting module 520 includes:
a first determining unit, configured to determine the target attention type corresponding to the window sorting signal;
a second determining unit, configured to determine a target window according to the target attention type and an attention type included in the attention information, where an attention type corresponding to the target window is matched with the target attention type;
and the sequencing unit is used for sequencing the target windows according to the attention time corresponding to the target windows.
Optionally, the signal receiving module includes:
the first signal receiving unit is used for determining the received window sorting signal when receiving a pressing signal of a preset entity key;
and/or the presence of a gas in the gas,
the second signal receiving unit is used for determining the received window sorting signal when a preset gesture instruction is received;
and/or the presence of a gas in the gas,
the third signal receiving unit is used for determining the received window sorting signal when a preset voice instruction is received;
and/or the presence of a gas in the gas,
and the fourth signal receiving unit is used for determining the received window sorting signal when receiving the touch signal of the preset virtual key.
Optionally, the apparatus further includes:
and the deleting module is used for deleting the concerned information corresponding to the window when the window is detected to be closed.
Referring to fig. 6, a schematic structural diagram of a terminal according to an embodiment of the present invention is shown. The terminal 1000 is the terminal 110 of fig. 1. Specifically:
terminal 1000 can include RF (Radio Frequency) circuitry 1010, memory 1020 including one or more computer-readable storage media, input unit 1030, display unit 1040, sensors 1050, audio circuitry 1060, near field communication module 1070, processor 1080 including one or more processing cores, and power supply 1090. Those skilled in the art will appreciate that the terminal structure shown in fig. 6 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
RF circuit 1010 may be used for receiving and transmitting signals during a message transmission or communication process, and in particular, for receiving downlink information from a base station and then processing the received downlink information by one or more processors 1080; in addition, data relating to uplink is transmitted to the base station. In general, RF circuitry 1010 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the RF circuitry 1010 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), email, SMS (Short Messaging Service), and the like.
The memory 1020 may be used to store software programs and modules, and the processor 1080 executes various functional applications and data processing by operating the software programs and modules stored in the memory 1020. The memory 1020 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal 1000, and the like. Further, the memory 1020 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, memory 1020 may also include a memory controller to provide access to memory 1020 by processor 1080 and input unit 1030.
The input unit 1030 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. Specifically, the input unit 1030 may include an image input device 1031 and other input devices 1032. The image input device 1031 may be a camera or a photoelectric scanning device. The input unit 1030 may include other input devices 1032 in addition to the image input device 1031. In particular, other input devices 1032 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, a joystick, or the like.
Display unit 1040 can be used to display information entered by or provided to a user as well as various graphical user interfaces of terminal 1000, which can be comprised of graphics, text, icons, video, and any combination thereof. The Display unit 1040 may include a Display panel 1041, and optionally, the Display panel 1041 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
Terminal 1000 can also include at least one sensor 1050, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 1041 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 1041 and/or the backlight when the terminal 1000 is moved to the ear. As one type of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the terminal is stationary, and can be used in applications that recognize the terminal posture (such as switching between horizontal and vertical screens, related games, and magnetometer posture calibration), in vibration recognition related functions (such as a pedometer and tapping), and the like; other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor that can be configured for terminal 1000 are not described herein.
Audio circuitry 1060, speaker 1061, and microphone 1062 can provide an audio interface between a user and terminal 1000. The audio circuit 1060 can transmit an electrical signal converted from received audio data to the speaker 1061, where it is converted into a sound signal and output; on the other hand, the microphone 1062 converts a collected sound signal into an electrical signal, which is received by the audio circuit 1060 and converted into audio data; the audio data is then output to the processor 1080 for processing and transmitted, for example, to another terminal via the RF circuit 1010, or output to the memory 1020 for further processing. Audio circuitry 1060 may also include an earbud jack to provide communication between peripheral headphones and terminal 1000.
Terminal 1000 can establish a near field communication connection with an external device via near field communication module 1070 and can exchange data via the near field communication connection. In this embodiment, the near field communication module 1070 specifically includes a bluetooth module and/or a WiFi module.
Processor 1080 is the control center of terminal 1000; it connects the various parts of the entire terminal using various interfaces and lines, and performs the various functions of terminal 1000 and processes data by running or executing software programs and/or modules stored in memory 1020 and invoking data stored in memory 1020, thereby monitoring the terminal as a whole. Optionally, processor 1080 may include one or more processing cores; preferably, the processor 1080 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, and the like, and a modem processor, which mainly handles wireless communication. It is to be appreciated that the modem processor may alternatively not be integrated into processor 1080.
Terminal 1000 can also include a power supply 1090 (e.g., a battery) for powering the various components. Preferably, the power supply is logically coupled to processor 1080 via a power management system, so that charging, discharging, and power consumption are managed through the power management system. Power supply 1090 may also include any component such as one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, terminal 1000 can also include a Bluetooth module or the like, which is not described in detail herein.
Specifically, in this embodiment, terminal 1000 further includes a memory storing one or more programs, and the one or more programs include instructions for performing the window display method provided in the embodiment of the present invention. The instructions are loaded and executed by a processor in the terminal, so as to implement the functions of the information obtaining module 510, the window sorting module 520, the window display module 530, and the other functional modules or units in the window display device.
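By way of illustration only, the following is a minimal Kotlin sketch of how a window sorting module of this kind might order created windows by their attention information; the names WindowAttentionInfo, AttentionType, and sortWindowsForDisplay are hypothetical and are not taken from the embodiment. The sketch only reflects the idea that display priority is negatively correlated with the time interval between the attention time and the current time, and that a target attention type can restrict which windows are sorted.

import java.time.Instant

// Hypothetical attention types: "setting focus attention" and "operating window attention".
enum class AttentionType { FOCUS, OPERATION }

// Hypothetical record of the attention information kept for one created window.
data class WindowAttentionInfo(
    val windowId: String,
    val lastAttentionTime: Instant,          // attention time when the window was last paid attention to
    val attentionTypes: Set<AttentionType>   // types of attention operations the window has received
)

// Windows attended to more recently get a higher display priority, so sorting by
// lastAttentionTime in descending order realizes the negative correlation between
// display priority and the interval from the attention time to the current time.
fun sortWindowsForDisplay(
    windows: List<WindowAttentionInfo>,
    targetType: AttentionType? = null        // target attention type carried by the window sorting signal
): List<WindowAttentionInfo> =
    windows
        .filter { targetType == null || targetType in it.attentionTypes }
        .sortedByDescending { it.lastAttentionTime }

Under these assumptions, calling sortWindowsForDisplay(createdWindows, AttentionType.FOCUS) would return only the windows that were set as the focus window, ordered from the most recently attended to the least recently attended.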
It will be understood by those skilled in the art that all or part of the steps in the window display method of the above embodiments may be implemented by a program instructing associated hardware, where the program may be stored in a computer-readable storage medium, and the storage medium may include: a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
The above description is only of preferred embodiments of the present invention and is not intended to limit the present invention; any modifications, equivalent replacements, improvements and the like made within the spirit and principle of the present invention shall be included within the protection scope of the present invention.

Claims (14)

1. A method for displaying a window, the method comprising:
receiving the setting of attention parameters, wherein the attention parameters comprise an attention operation, an attention type and a window sorting signal triggering mode;
if the current window is detected to receive the attention operation, setting or updating attention information corresponding to the window;
receiving a window sorting signal, wherein the window sorting signal is used for indicating that windows belonging to a target attention type are sorted;
acquiring the attention information of each window in the created windows, wherein the attention information at least comprises the attention time at which the window was last paid attention to;
sorting the windows according to the attention information;
and displaying the sorted windows, wherein the display priority of each window is negatively correlated with the time interval from the attention time of the window to the current time.
2. The method of claim 1, wherein the attention operation comprises setting the window as a focus window and/or executing an operation in the window;
and detecting that the current window receives the attention operation further comprises:
setting the attention time corresponding to the current window according to the time when the attention operation is received.
3. The method according to claim 2, wherein the attention information further includes an attention type, the attention type is used for indicating a type of an attention operation received by the window, and the attention type includes setting focus attention and/or operating window attention;
after detecting whether the current window receives the attention operation, the method further includes:
if the current window receives the attention operation, determining the type of the attention operation received by the current window;
and setting the attention type corresponding to the current window according to the received attention operation type.
4. The method of claim 3, wherein the sorting the windows according to the attention information comprises:
determining the target attention type corresponding to the window sorting signal;
determining a target window according to the target attention type and the attention type contained in the attention information, wherein the attention type corresponding to the target window is matched with the target attention type;
and sequencing the target windows according to the attention time corresponding to the target windows.
5. The method of claim 4, wherein receiving the window sorting signal comprises:
when a pressing signal of a preset physical key is received, determining that the window sorting signal is received;
and/or,
when a preset gesture instruction is received, determining that the window sorting signal is received;
and/or,
when a preset voice instruction is received, determining that the window sorting signal is received;
and/or,
when a touch signal of a preset virtual key is received, determining that the window sorting signal is received.
6. The method of any of claims 1 to 5, further comprising:
when the window is detected to be closed, deleting the attention information corresponding to the window.
7. A window display apparatus, the apparatus comprising:
the parameter receiving module is used for receiving the setting of attention parameters, wherein the attention parameters comprise an attention operation, an attention type and a window sorting signal triggering mode;
the window detection module is used for setting or updating the attention information corresponding to the window if the current window is detected to receive the attention operation;
the signal receiving module is used for receiving a window sorting signal, and the window sorting signal is used for indicating that windows belonging to a target attention type are sorted;
the information acquisition module is used for acquiring the attention information of each window in the created windows, wherein the attention information at least comprises the attention time at which the window was last paid attention to;
the window sorting module is used for sorting the windows according to the attention information;
and the window display module is used for displaying the sorted windows, wherein the display priority of each window is negatively correlated with the time interval from the attention time of the window to the current time.
8. The apparatus of claim 7, wherein the window detection module further comprises:
the judging module is used for setting the window as a focus window and/or executing an operation in the window;
and the first setting module is used for setting the attention time corresponding to the current window according to the time when the attention operation is received when the current window receives the attention operation.
9. The apparatus according to claim 8, wherein the attention information further includes an attention type, the attention type is used to indicate a type of an attention operation received by the window, and the attention type includes setting a focus attention and/or operating a window attention;
the device, still include:
the determining module is used for determining the type of the attention operation received by the current window when the attention operation is received by the current window;
and the second setting module is used for setting the attention type corresponding to the current window according to the received attention operation type.
10. The apparatus of claim 9, wherein the window sorting module comprises:
a first determining unit, configured to determine the target attention type corresponding to the window sorting signal;
a second determining unit, configured to determine a target window according to the target attention type and an attention type included in the attention information, where an attention type corresponding to the target window is matched with the target attention type;
and the sequencing unit is used for sequencing the target windows according to the attention time corresponding to the target windows.
11. The apparatus of claim 10, wherein the signal receiving module comprises:
the first signal receiving unit is used for determining that the window sorting signal is received when a pressing signal of a preset physical key is received;
and/or,
the second signal receiving unit is used for determining that the window sorting signal is received when a preset gesture instruction is received;
and/or,
the third signal receiving unit is used for determining that the window sorting signal is received when a preset voice instruction is received;
and/or,
the fourth signal receiving unit is used for determining that the window sorting signal is received when a touch signal of a preset virtual key is received.
12. The apparatus of any of claims 7 to 11, further comprising:
the deleting module is used for deleting the attention information corresponding to the window when the window is detected to be closed.
13. A terminal, characterized in that it comprises a processor and a memory, in which at least one instruction is stored, which is loaded and executed by the processor to implement the window display method according to any one of claims 1 to 6.
14. A computer-readable storage medium having stored thereon at least one instruction, which is loaded and executed by a processor to implement the window display method of any one of claims 1 to 6.
CN201710352997.0A 2017-05-18 2017-05-18 Window display method and device and terminal Active CN108958854B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710352997.0A CN108958854B (en) 2017-05-18 2017-05-18 Window display method and device and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710352997.0A CN108958854B (en) 2017-05-18 2017-05-18 Window display method and device and terminal

Publications (2)

Publication Number Publication Date
CN108958854A CN108958854A (en) 2018-12-07
CN108958854B true CN108958854B (en) 2020-11-10

Family

ID=64461452

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710352997.0A Active CN108958854B (en) 2017-05-18 2017-05-18 Window display method and device and terminal

Country Status (1)

Country Link
CN (1) CN108958854B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112333551B (en) * 2020-06-18 2023-05-23 深圳Tcl新技术有限公司 Video window display method, device, equipment and computer readable storage medium
CN112256372B (en) * 2020-10-20 2023-12-26 北京字跳网络技术有限公司 Information processing method and device and electronic equipment
CN113577448A (en) * 2021-08-10 2021-11-02 南方医科大学珠江医院 Visual miniature intelligent transfusion port and control method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102148861A (en) * 2011-01-25 2011-08-10 中兴通讯股份有限公司 Widget sequencing method and device
CN103309673A (en) * 2013-06-24 2013-09-18 北京小米科技有限责任公司 Session processing method and device based on gesture, and terminal equipment
CN103777838A (en) * 2012-10-18 2014-05-07 华为技术有限公司 Switching method and device of a plurality of message response windows
CN104598097A (en) * 2013-11-07 2015-05-06 腾讯科技(深圳)有限公司 Ordering method and device of instant messaging (IM) windows
CN105656754A (en) * 2015-12-23 2016-06-08 别业辉 Method for increasing efficiency of searching contact person and viewing information, and mobile terminal
CN106100969A (en) * 2016-05-30 2016-11-09 北京三快在线科技有限公司 A kind of do not read the based reminding method of session, device and terminal unit
CN106856447A (en) * 2015-12-09 2017-06-16 北京三星通信技术研究有限公司 The processing method and relevant apparatus and terminal device of interactive contents information

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7917582B2 (en) * 2004-07-27 2011-03-29 Siemens Enterprise Communications, Inc. Method and apparatus for autocorrelation of instant messages

Also Published As

Publication number Publication date
CN108958854A (en) 2018-12-07

Similar Documents

Publication Publication Date Title
US10708649B2 (en) Method, apparatus and system for displaying bullet screen information
US10304461B2 (en) Remote electronic service requesting and processing method, server, and terminal
CN106878406B (en) Information sharing method, device and system
US20140365892A1 (en) Method, apparatus and computer readable storage medium for displaying video preview picture
WO2014194713A1 (en) Method,apparatus and computer readable storage medium for displaying video preview picture
CN106293738B (en) Expression image updating method and device
US10621259B2 (en) URL error-correcting method, server, terminal and system
CN108958854B (en) Window display method and device and terminal
US10298590B2 (en) Application-based service providing method, apparatus, and system
CN104252508A (en) Multimedia file search method, device and terminal equipment
JP6915074B2 (en) Message notification method and terminal
CN108809799B (en) Information sending method, information display method, device and system
CN106020945B (en) Shortcut item adding method and device
WO2015117554A1 (en) Data processing method, apparatus, and terminal device
CN105095161B (en) Method and device for displaying rich text information
CN105306244B (en) Router management method, system and equipment
CN107786423B (en) A kind of method and system of instant messaging
US20160283047A1 (en) Login interface displaying method and apparatus
CN111274463A (en) Information display method and device based on IM contact person grouping setting and storage medium
CN112333337A (en) Message checking method, device, equipment and storage medium
CN110196662B (en) Method, device, terminal and storage medium for displaying synchronization state
CN106657281B (en) File sharing method and device
CN105681723B (en) Audio and video call method and device
CN110381341B (en) Data processing method and related equipment
CN109688548B (en) VOLTE frequency band sharing method, server, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant