CN117032522A - Display method, device, terminal and storage medium

Info

Publication number
CN117032522A
Authority
CN
China
Prior art keywords
information
target
target scene
display area
application
Legal status
Pending
Application number
CN202311110398.XA
Other languages
Chinese (zh)
Inventor
陈帅
张永凤
杨明鑫
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Application filed by Lenovo Beijing Ltd
Priority to CN202311110398.XA
Publication of CN117032522A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The embodiment of the application discloses a display method, a device, a terminal and a storage medium, wherein the method is applied to the terminal and comprises the following steps: determining at least one target information matched with a target scene from a plurality of application programs of the terminal; wherein the target information includes at least one of: notification information, interface state information, or content page information of the application; and displaying the at least one piece of target information in a desktop display area of the terminal.

Description

Display method, device, terminal and storage medium
Technical Field
The present application relates to, but is not limited to, the field of computer technologies, and in particular to a display method, apparatus, terminal, and storage medium.
Background
With the development of computer technology, the desktops of terminal devices (such as mobile phones, portable video players, personal digital assistants, tablet computers, and the like) are covered with application icons corresponding to various application programs.
In general, in order to open a target information page, a user needs to find the application icon corresponding to the target application program among the many application icons, open that icon to enter the target application program's pages, and finally reach the target information page by operating controls at each level within the application program.
It can be seen that opening the desired target information page requires a lengthy sequence of operations, resulting in high time costs and a poor use experience.
Disclosure of Invention
In view of this, the embodiments of the present application at least provide a display method, a device, a terminal, and a storage medium.
The technical solutions of the embodiments of the present application are implemented as follows:
in one aspect, an embodiment of the present application provides a display method, where the display method is applied to a terminal, and the method includes:
determining at least one target information matched with a target scene from a plurality of application programs of the terminal; wherein the target information includes at least one of: notification information, interface state information and content page information of the application program;
and displaying the at least one piece of target information in a desktop display area of the terminal.
In some embodiments, the desktop display region includes a first target scene display region and a second target scene display region adjacent to the first target scene display region;
the displaying of the target information in the desktop display area of the terminal includes the following steps:
displaying at least one piece of first target information matched with a first target scene in the first target scene display area, wherein each piece of first target information includes an interaction indication identifier, and the interaction indication identifier is used for responding to an operation indication of the user;
and displaying at least one piece of second target information matched with the second target scene in the second target scene display area.
In some embodiments, the first target scene is capable of automatically changing based on current environmental information of the terminal; the second target scene is changeable based on a user selection operation of a plurality of second target scene options; wherein the plurality of second target scene options are displayed in a third display region of the desktop display region.
In some embodiments, the method further comprises:
and, in a case that a preset condition is met, in response to the user selecting one second target scene from the plurality of second target scenes, synchronously updating at least one piece of first target information displayed in the first target scene display area and at least one piece of second target information displayed in the second target scene display area.
In some embodiments, the at least one second target information comprises at least one of:
interface state information of at least one controlled device application;
interface state information, at the second level or deeper, of at least one application program; the second-level-or-deeper interface state information is not displayed to the user when the application program is not opened.
In some embodiments, displaying at least one first target information matching the first target scene in the first target scene display area includes:
determining at least one first target information matched with the first target scene from the plurality of application programs of the terminal based on the operation information of the user on the plurality of application programs in a first time period, and determining the display priority of each first target information;
the at least one first target information is displayed in the first target scene display area based on a display priority of each of the first target information.
In some embodiments, the first target information is at least one piece of interaction information that comes from notification information of at least one application program and is generated based on the first target scene.
In another aspect, an embodiment of the present application provides a display apparatus, including:
a determining module, configured to determine at least one target information matched with a target scene from a plurality of application programs of the terminal; wherein the target information includes at least one of: notification information, interface state information and content page information of the application program;
And the display module is used for displaying the at least one piece of target information in a desktop display area of the terminal.
In yet another aspect, an embodiment of the present application provides a terminal, including a memory and a processor, where the memory stores a computer program executable on the processor, and the processor implements some or all of the steps of the above method when the processor executes the program.
In yet another aspect, embodiments of the present application provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs some or all of the steps of the above-described method.
In yet another aspect, embodiments of the present application provide a computer program comprising computer readable code which, when run in a computer device, causes a processor in the computer device to perform some or all of the steps for carrying out the above method.
In the embodiments of the present application, at least one piece of target information matched with the target scene is determined from a plurality of application programs of the terminal, and the at least one piece of target information is displayed in the desktop display area of the terminal. Various kinds of application information originally sealed behind the application icons of the application programs are thus displayed directly in the desktop display area, so that the user can obtain the information under the target scene simply by looking at the desktop display area of the terminal, which greatly shortens the operation time needed to view application information and improves the user's experience. Meanwhile, the embodiments of the present application display the at least one piece of target information according to the target scene, so that information displayed in units of application programs is adjusted to information displayed in units of scenes; that is, information belonging to the same scene in different application programs is displayed to the user together, making it convenient for the user to obtain more comprehensive information about the same scene and further improving the user's experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the aspects of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic implementation flow chart of a display method according to an embodiment of the present application;
fig. 2 is a schematic layout diagram of a desktop display area according to a display method provided by an embodiment of the present application;
fig. 3 is a schematic layout diagram of a terminal display screen according to a display method provided by an embodiment of the present application;
fig. 4 is a schematic layout diagram of a terminal display screen according to a display method provided by an embodiment of the present application;
fig. 5 is a schematic layout diagram of a terminal display screen according to a display method provided by an embodiment of the present application;
fig. 6 is a schematic diagram of a composition structure of a display device according to an embodiment of the present application;
fig. 7 is a schematic diagram of a hardware entity of a terminal according to an embodiment of the present application.
Detailed Description
The technical solutions of the present application will be further elaborated below with reference to the accompanying drawings and embodiments. The described embodiments should not be construed as limiting the application; all other embodiments obtained by those skilled in the art without making inventive efforts fall within the scope of protection of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is to be understood that "some embodiments" can be the same subset or different subsets of all possible embodiments and can be combined with one another without conflict.
The term "first/second/third" is merely to distinguish similar objects and does not represent a particular ordering of objects, it being understood that the "first/second/third" may be interchanged with a particular order or precedence, as allowed, to enable embodiments of the application described herein to be implemented in other than those illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing the application only and is not intended to be limiting of the application.
In the related art, in order to quickly and intuitively present to the user the content of a main application program that the user may care about, a widget (Web Widget) corresponding to the application program is placed on the desktop of the terminal. Through the widget, the user can obtain personalized application information at an appropriate time.
However, widgets display application information on the desktop in units of individual application programs, and there is no correlation between the information displayed by the widgets of different application programs. Moreover, a widget can only present application information to the user; it cannot actively recommend application information to the user or interact with the user according to the user's personal behavior habits.
On this basis, the embodiments of the present application provide a display method that can display, on the desktop of the terminal, at least one piece of target information matched with a target scene, so that information sealed inside the main application programs is displayed in units of scenes. This makes it convenient for the user to quickly locate information of interest in the target scene, thereby improving the user's experience.
The display method provided by the embodiment of the application can be executed by a processor of computer equipment. The computer device may be a device with data processing capability, such as a server, a notebook computer, a tablet computer, a desktop computer, a smart television, a set-top box, a mobile device (e.g., a mobile phone, a portable video player, a personal digital assistant, a dedicated messaging device, and a portable game device). Fig. 1 is a schematic implementation flow chart of a display method according to an embodiment of the present application, as shown in fig. 1, the method includes steps S101 to S102 as follows:
Step S101, determining at least one target information matched with a target scene from a plurality of application programs of the terminal; wherein the target information includes at least one of: notification information, interface state information and content page information of the application program;
An application program is here a computer program developed to perform one or more specific tasks. Application programs may run under various operating systems and may interact with the user through a visual user interface.
When an application program is installed on a terminal, various information related to the application program is generated while it starts up or runs: for example, application state information generated during start-up; notification messages pushed by the application program's background operator after it has started; and the end user's operation information for the application program, such as the user's historical operation records. Together, this information forms the application information corresponding to the application program.
In practical applications, the application information may be directly obtained from an application program installed in the terminal, for example, in the case of setting a corresponding widget for the application program, the corresponding application information may be directly obtained from the widget.
The target scene is a scene determined based on a user selection or on the current environment information of the terminal. Here, a scene refers to a situation in which certain application information is used in a specific environment.
In some embodiments, the scenes include, but are not limited to, smart home scenes, work scenes, internet of things scenes, and other scenes.
Here, the target information is information that may be of interest to the user and is pushed to the user in the target scene. In practical applications, the information the user cares about can be determined from the user's historical operation habits, and corresponding interaction information can be generated for the user.
Meanwhile, the target information is application information that is determined from the application information corresponding to a plurality of application programs and that matches the target scene. That is, in the embodiments of the present application, the application information corresponding to the plurality of application programs is classified according to scenes, and the application information matching the target scene is displayed to the user as the target information. In some embodiments, application information obtained from the plurality of application programs is processed, and the processed information is used as the target information matching the target scene.
The target information includes at least one of:
The notification information of an application program refers to information pushed to the user by the application program's background operator, for example, update information, recommendation information, and the like generated while the application program runs after being installed on the terminal.
The interface state information of an application program refers to information displayed during or after start-up of the application program, for example, interface loading state information shown before the main application program has fully started, guidance state information shown when the user opens the application program's interface for the first time, error state information caused by a user's mistaken operation or other reasons, browsing state information, and the like.
The content page information of an application program refers to the main content of the application program, presented as text, pictures, video, or a combination thereof.
Determining at least one piece of target information matching the target scene from the plurality of application programs means determining at least one piece of target information matching the target scene from the application information of the plurality of application programs.
Here, the application information of different application programs may belong to the same target scene; for example, application information of an audio application and of a video application may both belong to a smart home scene. Meanwhile, several pieces of application information of the same application program may belong to different target scenes; for example, in a video application, recommendation information about a television series belongs to a smart home scene, while recommendation information about a science and technology video may belong to a work scene.
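As an illustration of this scene-based classification, the following Kotlin sketch models the three kinds of application information named above and filters, per step S101, the pieces that match a target scene. All type names and the classification rules are assumptions made for illustration; they are not part of the embodiments.
```kotlin
// Minimal sketch of step S101: classify application information by scene rather than by
// application program, then keep the pieces matching the target scene.
// All type names and classification rules below are illustrative assumptions.

enum class Scene { SMART_HOME, WORK, IOT, OTHER }

sealed class AppInfo(val appId: String) {
    class Notification(appId: String, val text: String) : AppInfo(appId)       // pushed by the operator
    class InterfaceState(appId: String, val state: String) : AppInfo(appId)    // loading/guide/error/browse state
    class ContentPage(appId: String, val content: String) : AppInfo(appId)     // text/picture/video content
}

// Hypothetical classifier: one piece of application information may belong to several scenes,
// e.g. a TV-series recommendation -> SMART_HOME, a tech-video recommendation -> WORK.
fun classify(info: AppInfo): Set<Scene> = when (info) {
    is AppInfo.Notification  -> setOf(Scene.SMART_HOME, Scene.OTHER)
    is AppInfo.InterfaceState -> setOf(Scene.IOT)
    is AppInfo.ContentPage   -> setOf(Scene.WORK)
}

// Step S101: determine the target information matching the target scene from all application info.
fun targetInformation(allInfo: List<AppInfo>, target: Scene): List<AppInfo> =
    allInfo.filter { target in classify(it) }
```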
And step S102, displaying the at least one piece of target information in a desktop display area of the terminal.
Here, the desktop display area of the terminal refers to a main display area in a display screen of the terminal, that is, a display area except for a status bar and a task bar in the display screen of the terminal.
By displaying the target information in the desktop display area of the terminal rather than in the status bar or task bar of the display screen, the end user is helped to quickly locate information of interest in the target scene within the main display area of the terminal's display screen.
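For illustration only, the following sketch (all names are assumed) shows one way the desktop display area could be derived from the full display screen by excluding the status bar and the task bar.
```kotlin
// Minimal sketch: the desktop display area is the display screen excluding the
// status bar (top) and task bar (bottom). Names and the bar positions are assumptions.

data class ScreenRect(val left: Int, val top: Int, val right: Int, val bottom: Int)

fun desktopDisplayArea(screen: ScreenRect, statusBarHeight: Int, taskBarHeight: Int): ScreenRect =
    ScreenRect(
        left = screen.left,
        top = screen.top + statusBarHeight,      // exclude the status bar
        right = screen.right,
        bottom = screen.bottom - taskBarHeight   // exclude the task bar
    )
```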
In the embodiments of the present application, at least one piece of target information matched with the target scene is determined from a plurality of application programs of the terminal, and the at least one piece of target information is displayed in the desktop display area of the terminal. Various kinds of application information originally sealed behind the application icons of the application programs are thus displayed directly in the desktop display area, so that the user can obtain the information under the target scene simply by looking at the desktop display area of the terminal, which greatly shortens the operation time needed to view application information and improves the user's experience. Meanwhile, the embodiments of the present application display the at least one piece of target information according to the target scene, so that information displayed in units of application programs is adjusted to information displayed in units of scenes; that is, information belonging to the same scene in different application programs is displayed to the user together, making it convenient for the user to obtain more comprehensive information about the same scene and further improving the user's experience.
In some embodiments of the application, the desktop display area includes a first target scene display area and a second target scene display area adjacent to the first target scene display area.
Here, the first target scene display area and the second target scene display area are two display areas divided in the desktop display area, any one of the size, shape, and color of which can be adjusted in practical applications.
For example, in some embodiments, the first target scene display area and/or the second target scene display area may be shaped as any one of a rectangle, a diamond, an oval, and the like; the size may also be set according to how much content is displayed in the first target scene display area and/or the second target scene display area.
For another example, in some embodiments, the background of the first target scene display area and/or the second target scene display area may be set, e.g., set to a semi-transparent state or set to a specified color.
In some embodiments, the step S102 may be implemented by the following steps S1021 and S1022:
Step S1021, displaying at least one piece of first target information matched with a first target scene in the first target scene display area; wherein each piece of first target information includes an interaction indication identifier, and the interaction indication identifier is used for responding to an operation indication of the user.
In some embodiments, the first target information is at least one piece of interaction information that comes from notification information of at least one application program and is generated based on the target scene.
Here, notification information from at least one application program refers to information actively pushed by the operator of the application program. For example, for a shopping-class application, the notification information may be a store recommendation message or a merchandise recommendation message.
The at least one piece of interaction information generated based on the target scene refers to at least one piece of interaction information generated for the user by combining the notification information of at least one application program with the first target scene. For example, for a shopping application, when the notification information is a store recommendation message and the current date is Mother's Day, an interaction message such as "Today is Mother's Day. Send a bouquet of flowers to the little princess?" may be generated for the user and displayed in the first target scene display area.
The interaction indication identifier refers to at least one interaction control that is displayed in the first target scene display area and can be operated by the end user. The end user can respond to the first target information through the interaction indication identifier. In some embodiments, the interaction indication identifier may be used to confirm at least one piece of interaction information in the first target information; for example, when the first target information is "Today is Mother's Day. Send a bouquet of flowers to the little princess?", the interaction indication identifiers may be interaction controls that offer the user a choice, such as an "OK" or "Not needed" control.
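A piece of first target information carrying interaction indication identifiers could be modeled roughly as in the sketch below; the names and example handlers are hypothetical and only illustrate the pairing of an interaction message with the controls the user can operate.
```kotlin
// Minimal sketch of first target information with interaction indication identifiers.
// All names and the example handlers are illustrative assumptions.

data class InteractionControl(val label: String, val onSelected: () -> Unit)

data class FirstTargetInfo(
    val message: String,                    // interaction information generated from a notification
    val controls: List<InteractionControl>  // interaction indication identifiers, e.g. "OK" / "Not needed"
)

fun mothersDayCard(openOrderPage: () -> Unit, dismiss: () -> Unit) = FirstTargetInfo(
    message = "Today is Mother's Day. Send a bouquet of flowers to the little princess?",
    controls = listOf(
        InteractionControl("OK", openOrderPage),   // responds to the user's operation indication
        InteractionControl("Not needed", dismiss)
    )
)
```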
Step S1022, displaying at least one piece of second target information matched with the second target scene in the second target scene display area.
Here, the second target information is information matching the second target scene. For example, when the second target scene is a smart home scene, the second target information may include playback information of a music application, movie recommendation information of a video application, smart device control information of a smart device control application, and the like.
In practical applications, the at least one piece of second target information is interactable information: the user can control the running of the application program corresponding to the second target information by clicking, touching, sliding, and the like, or can directly open and enter the corresponding application program. For example, when the second target information is audio playback information, the user may start playback by clicking or touching the play button, adjust the playback position by sliding the audio progress bar, or enter the audio application by clicking elsewhere in the audio playback interface; when the second target information is video playback information, the user may enter the video application by clicking or touching the second target information, and so on.
In the above embodiments, by displaying in the first target scene display area at least one piece of first target information carrying an interaction indication identifier, i.e., interactable information generated for the user based on the first target scene, active interaction with the user can be achieved. By displaying at least one piece of second target information in the second target scene display area, information the user may be interested in under the second target scene is presented for the user to choose from. At the same time, since the first target information and the second target information are displayed simultaneously in the desktop display area of the terminal, the user can simultaneously obtain information actively recommended by the system and the application information of various application programs in the target scene, which enlarges the range of information available to the user and further improves the user's experience.
In some embodiments, the at least one second target information comprises at least one of:
interface state information of at least one controlled device application;
interface state information, at the second level or deeper, of at least one application program; the second-level-or-deeper interface state information is not displayed to the user when the application program is not opened.
Here, the interface state information of the controlled device application refers to state information for displaying a control interface of the intelligent device in the application for controlling the intelligent device. For example, in a control application of the intelligent device, status information of a control interface of the intelligent air conditioner may include an animated image of the intelligent air conditioner, a current air conditioner temperature, etc.; for another example, the status information of the control interface of the intelligent light fixture may include information of an animated image, a current brightness, etc. of the intelligent light fixture.
The second-level-or-deeper interface state of an application program refers to a control interface of the application program at the second level or deeper; when the application program is not open, such interface state information cannot be viewed by the user.
In some embodiments, the at least one second target information may also include content page information of the at least one application program, such as content of a new mail in the mail application, a new video push message in the video application, and so on.
In this way, in this embodiment, the interface state information of a controlled device application program, or the second-level-or-deeper interface state information of at least one application program, is displayed directly in the desktop display area, so that the user can directly view the state of the controlled device or the control information or page information sealed inside the application program. A cumbersome sequence of operations is thus avoided, operation time is saved for the user, and the use experience is improved.
In some embodiments, the first target scene is capable of automatically changing based on current environmental information of the terminal.
Here, the current environment information refers to information about the environment in which the terminal is located, for example, the terminal's geographic location, the current time, and the like. The geographic location of the terminal may be determined based on the BeiDou satellite navigation system, the Global Positioning System (GPS), and the like; the current time may be obtained from a clock server on the Internet.
In practical applications, the first target scene may be determined to be a smart home scene when the terminal's current geographic location is the user's home, or a work scene when the terminal's current geographic location is the user's company; likewise, when the current time is 17:00, the first target scene may be determined to be an Internet of Things scene based on the user's habit of turning on the air conditioner at home at 17:00 every day.
Therefore, when a change in the current environment information is detected, the first target scene can change automatically, and the corresponding at least one piece of first target information is then displayed according to the first target scene; that is, at least one piece of interactable information generated for the user in the current environment is displayed. The user can respond to the first target information through its interaction indication identifier without searching each application program for the items that need handling in the current environment. The mode of the user searching for target information is thus adjusted to a mode of actively pushing application information to the user, which reduces the user's memory burden and improves the use experience.
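The automatic change of the first target scene can be pictured with the following sketch, which reuses the Scene type from the earlier sketch; the place names, the hour threshold, and the habit rule are assumptions for illustration.
```kotlin
// Minimal sketch: infer the first target scene from the terminal's current environment
// information (geographic location and current time). Places and thresholds are assumed.

data class Environment(val place: String, val hourOfDay: Int)

fun inferFirstTargetScene(env: Environment): Scene = when {
    env.place == "home" && env.hourOfDay >= 17 -> Scene.IOT         // e.g. habit of switching on the air conditioner at 17:00
    env.place == "home"                        -> Scene.SMART_HOME
    env.place == "company"                     -> Scene.WORK
    else                                       -> Scene.OTHER
}
```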
The second target scene is changeable based on a user selection operation of a plurality of second target scene options; wherein the plurality of second target scene options are displayed in a third display region of the desktop display region.
Here, the second target scene options are options covering a plurality of application scenes; for example, the second target scene options may include a smart home scene option, a work scene option, an Internet of Things scene option, other scene options, and so on.
The third display area is an area of the desktop display area used for displaying the second target scene options. In some embodiments, the third display area is adjacent to the second target scene display area; in some embodiments, the third display area is disposed inside the second target scene display area.
In some embodiments, the size, shape, and/or background of the third display area may be set. For example, the size of the third display area may be set according to the number of second target scene options and the font size; the shape of the third display area may be set to any one of a rectangle, a diamond, an oval, and the like; and the background of the third display area may be set to be transparent, translucent, or any color.
Here, since the second target scene changes based on the user's selection among the plurality of second target scene options, the second target scene can be determined through that selection operation, and the at least one piece of second target information displayed in the second target scene display area changes accordingly, so that the user can view, on their own initiative, the information that may be of interest in each target scene.
In some embodiments, the display method further includes: in a case that a preset condition is met, in response to the user selecting one second target scene from the plurality of second target scenes, synchronously updating at least one piece of first target information displayed in the first target scene display area and at least one piece of second target information displayed in the second target scene display area.
Here, the preset condition refers to a condition under which the first target scene and the second target scene are updated synchronously. In some embodiments, the preset condition may include that the first target scene and the second target scene have been set as associated scenes through an association setting option. In some embodiments, the preset condition may further include that the user has issued an operation instruction for each piece of first target information displayed before the update.
In practical applications, when the first target scene and the second target scene are set as associated target scenes, the second target scene option selected by the user from the plurality of second target scene options serves as both the first target scene and the second target scene. At least one piece of first target information and at least one piece of second target information corresponding to the selected second target scene are then displayed in the first target scene display area and the second target scene display area, respectively.
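One possible shape of this synchronous update is sketched below, building on the AppInfo, Scene, and targetInformation sketch above; the preset condition is modeled simply as a flag meaning the two areas have been associated, which is an assumption.
```kotlin
// Minimal sketch: when the association condition holds, a selection among the second
// target scene options re-renders both display areas from the same scene.

class DesktopController(
    private val renderFirstArea: (List<AppInfo>) -> Unit,   // first target scene display area
    private val renderSecondArea: (List<AppInfo>) -> Unit,  // second target scene display area
    private val allApplicationInfo: () -> List<AppInfo>
) {
    var scenesAssociated: Boolean = false   // the assumed "preset condition"

    fun onSecondSceneSelected(selected: Scene) {
        val matching = targetInformation(allApplicationInfo(), selected)
        renderSecondArea(matching)
        if (scenesAssociated) renderFirstArea(matching)   // synchronous update of the first area
    }
}
```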
Through the above embodiments, at least one piece of target information corresponding to the target scene selected by the user can be displayed in both the first target scene display area and the second target scene display area, so that the display of target information better meets the user's personalized needs and further improves the use experience.
In some embodiments, the step S1021 may further include the following steps S1023 to S1024:
step S1023, determining at least one first target information matched with the first target scene from the plurality of application programs of the terminal based on the operation information of the plurality of application programs of the user in a first time period, and determining the display priority of each first target information;
Here, the first time period is a period before the current time; for example, the first time period may be the week, the month, or the year before the current time.
The user's operation information for the plurality of applications may include user's operation records in the plurality of applications, for example, user's control records for the smart device in the smart device control application, user's play records in the video/audio application, user's purchase records in the shopping application, and so on.
Here, the user's operation information for the plurality of application programs in the first time period may be the user's operation information for the application programs during a specific historical period. For example, during the past month the user has turned off the intelligent light fixture through the smart device control application at 23:00 every day; or during the past week the user has opened the video software at home at 20:00 every day to watch a certain television series; and so on.
Thus, by counting the user's operation records in a plurality of application programs, the user's behavior habits can be determined. Using these behavior habits, it is possible to predict what the user may need to handle, or what the user wants to accomplish, in the first target scene. For example, based on the user having turned off the light fixture at 23:00 every day during the past month, target information prompting the user to turn off the light may be displayed in the first target scene display area when the first target scene is the smart home scene and the current time is close to 23:00.
Meanwhile, when the first target scene corresponds to multiple pieces of first target information, a display priority may be determined for them. For example, when the first target scene is a work scene, the pieces of first target information may be ordered by time urgency, for example, setting a higher priority for an urgent mail that needs a reply and a lower priority for a coffee order.
Step S1024, displaying the at least one piece of first target information in the first target scene display area based on the display priority of each piece of first target information.
Here, the first target information having the highest display priority may be displayed first in the first target scene display area within the desktop display area, and the other first target information may then be displayed in the first target scene display area in order of display priority.
In some embodiments, displaying at least one piece of first target information in the desktop display area according to the display priority of each piece of first target information may include: in response to a sliding operation of the user within the first target scene display area, a next piece of first target information of the first target information currently displayed within the first target scene display area is displayed in the first target scene display area.
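Steps S1023 and S1024 can be illustrated with the sketch below, continuing the AppInfo sketch; the scoring rule, which simply counts how often the user operated the source application program in the first time period, is an assumed example rather than the only possible priority rule.
```kotlin
// Minimal sketch of steps S1023-S1024: derive a display priority for each piece of first
// target information from the user's operation records in the first time period, then
// order the pieces for display (the highest priority is shown first; a slide gesture
// advances to the next one).

data class OperationRecord(val appId: String, val timestampMillis: Long)

fun displayPriority(info: AppInfo, records: List<OperationRecord>): Int =
    records.count { it.appId == info.appId }   // assumed rule: more frequent use -> higher priority

fun orderForDisplay(candidates: List<AppInfo>, records: List<OperationRecord>): List<AppInfo> =
    candidates.sortedByDescending { displayPriority(it, records) }
```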
Next, a detailed description will be given of a layout of the display screen 200 of the terminal in the display method according to the embodiment of the present application with reference to fig. 2.
As shown in fig. 2, the display screen 200 includes a desktop display area, and the desktop display area includes a first target scene display area 201, a second target scene display area 202, and a third display area 203.
The first target scene display area 201 is configured to display at least one piece of first target information corresponding to a first target scene;
the second target scene display area 202 is configured to display at least one piece of second target information corresponding to a second target scene;
the third display area 203 is for displaying a plurality of second target scene options.
Here, the ways of setting the first target scene display area 201, the second target scene display area 202, and the third display area 203 have been described in detail above and are not repeated here.
The display screen 200 includes a fourth display area 204. The fourth display area 204 may be used to display icon controls for applications such as settings or calendars. Meanwhile, the shape, size and background of the fourth display area 204 may be set in practical applications, which is not limited herein.
The display screen 200 also includes a fifth display area 205. In some embodiments, the fifth display area 205 may be used as a status bar for displaying information such as network signals, power, etc. Meanwhile, the shape, size and background of the fifth display area 205 may be set in practical applications, which is not limited herein.
The display screen 200 also includes a sixth display area 206. In some embodiments, the sixth display area 206 is used to display icon controls corresponding to system auxiliary information and account switching information, and the system auxiliary information displayed in the sixth display area 206 can be personalized by the user according to usage habits. The shape, size, and background of the sixth display area 206 may be set in practical applications and are not limited here.
The layout of the display screen of the terminal in different target scenes will be described in detail with reference to fig. 3 to 5.
A schematic layout of the display 300 is shown in fig. 3 when the first target scene and the second target scene are both smart home scenes.
In the first target scene display area, based on the current date being Mother's Day, information about the store "Flower Plant Aesthetic Laboratory" is pushed to the user, and the interaction information "Today is Mother's Day. Send a bouquet of flowers to the little princess?" is generated at the same time. In addition, when the interaction information is generated, the audio application is invoked to present the interaction information to the user by voice. Below the store recommendation information, two interaction indication identifiers, "OK" and "Not needed", are displayed, through which the user can respond to the interaction information.
In the third display area, three second target scene options are displayed, namely a "home" scene, a "work" scene and an "internet of things" scene. Meanwhile, the user selects a "home" scene option in the third display area, that is, takes the "home" scene as the second target scene.
In the second target scene display area, based on the user selecting the "home" scene option in the third display area, information from a plurality of application programs related to the "home" scene is displayed, including the audio playback interface of an audio application, the recently-watched interface of a video application, state information of the intelligent light fixture and intelligent air conditioner, and the like.
In the displayed audio playback interface of the audio application, playback of the currently displayed audio can be controlled by operating the play button, the playback progress can be controlled by sliding the progress bar, and the audio being played can be switched by touching the previous or next button. In the recent watch records of the video application, touching any item of video watch record information opens the corresponding video application and plays the corresponding video. The state information of the smart devices can be viewed directly in the state information interfaces of the intelligent light fixture and the intelligent air conditioner, and touching the corresponding state information interface enters the smart device control application to control and operate the corresponding smart device.
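The interactable audio card of the "home" scene could look roughly like the sketch below; the AudioPlayerApp facade and its methods are hypothetical, since the embodiments do not name a concrete audio interface.
```kotlin
// Minimal sketch: second target information that controls the audio application directly
// from the desktop display area. The facade below is an illustrative assumption.

interface AudioPlayerApp {
    fun play()                       // start playback of the displayed audio
    fun seekTo(progress: Float)      // progress in [0.0, 1.0]
    fun openFullApplication()        // enter the audio application itself
}

class AudioCard(private val player: AudioPlayerApp) {
    fun onPlayPressed() = player.play()                            // tap the play button
    fun onProgressSlid(progress: Float) = player.seekTo(progress)  // slide the progress bar
    fun onCardTapped() = player.openFullApplication()              // tap elsewhere on the card
}
```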
Meanwhile, current date information is displayed in a fourth display area, current network signal information and battery power information are displayed in a fifth display area, and currently played audio information, weather information, geographical position information and user information are displayed in a sixth display area.
A schematic layout of the display 400 is shown in fig. 4 when the first target scene and the second target scene are both working scenes.
In the first target scene display area, commodity information, namely "Cappuccino 29", is recommended to the user, together with interaction information asking whether the user would like to place an order. In addition, when the interaction information is generated, the audio application is invoked to present the interaction information to the user by voice. Below the commodity recommendation information, two interaction indication identifiers, "Not needed" and "Order now", are displayed, through which the user can respond to the interaction information.
In the third display area, three second target scene options are displayed, namely a "home" scene, a "work" scene and an "internet of things" scene. Meanwhile, the user selects a "work" scene option in the third display area, that is, takes the "work" scene as the second target scene.
In the second target scene display area, information of a plurality of application programs related to the 'work' scene is displayed based on the fact that the user selects the 'work' scene option in the third display area, wherein the information comprises air ticket information, time information of all places, mail information, calendar and backlog information, contact information and the like.
The air ticket information shows the departure place, the arrival place, passenger information, and so on; the time information shows the local time and the time of a followed city (i.e., Nokes); the mail information shows the mail in a key inbox; the calendar information shows the calendar and to-do items; and the contact information is displayed together with functional icon controls such as adding a new contact, video, voice, and message.
Meanwhile, current date information is displayed in a fourth display area, current network signal information and battery power information are displayed in a fifth display area, and currently played audio information, weather information, geographical position information and user information are displayed in a sixth display area.
Fig. 5 shows a layout diagram of the display screen 500 when the first target scene and the second target scene are both internet of things scenes.
In the first target scene display area, based on the current first target scene being the Internet of Things scene, purifier information is recommended to the user, namely that the Purifier PRO filter element has been in use for XX days, and interaction information suggesting that the user replace the air purifier's filter element is generated for the user. In addition, when the interaction information is generated, the audio application is invoked to present the interaction information to the user by voice. Below the purifier information, two interaction indication identifiers, "Add to to-do" and "Buy", are displayed, through which the user can respond to the interaction information.
In the third display area, three second target scene options are displayed, namely a "home" scene, a "work" scene and an "internet of things" scene. Meanwhile, the user selects an 'Internet of things' scene option in the third display area, namely, takes the 'Internet of things' scene as a second target scene.
In the second target scene display area, based on the user selecting the "Internet of Things" scene option in the third display area, information from a plurality of application programs related to the "Internet of Things" scene is displayed, including gamepad information, smart watch information, wireless earphone information, electronic pen information, headset information, and the like.
The gamepad information includes the battery levels of the gamepad's main and auxiliary handles; the smart watch information includes task information and battery level information; the wireless earphone information includes the wireless earphones' connection state and battery level; the electronic pen information includes the electronic pen's connection state and battery level; and the headset information includes the headset's connection information.
Meanwhile, current date information is displayed in a fourth display area, current network signal information and battery power information are displayed in a fifth display area, and currently played audio information, weather information, geographical position information and user information are displayed in a sixth display area.
From the above description of the desktop display area of the terminal implemented based on the display method of the embodiments of the present application, it can be seen that the display method provided by the embodiments of the present application intuitively displays the information of a plurality of application programs in the desktop display area of the terminal, which is convenient for the user to view and operate.
Based on the foregoing embodiments, an embodiment of the present application provides a display apparatus. The units included in the apparatus, and the modules included in the units, may be implemented by a processor in a computer device; of course, they may also be implemented by specific logic circuits. In implementation, the processor may be a Central Processing Unit (CPU), a Microprocessor Unit (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
Fig. 6 is a schematic diagram of a composition structure of a display device according to an embodiment of the present application, and as shown in fig. 6, a display device 600 includes: a determination module 610 and a display module 620, wherein:
a determining module 610, configured to determine at least one target information matching a target scene from a plurality of applications of the terminal; wherein the target information includes at least one of: notification information, interface state information and content page information of the application program;
and a display module 620, configured to display the at least one target information in a desktop display area of the terminal.
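The two modules of fig. 6 can be pictured as in the sketch below (hypothetical names, again building on the AppInfo and Scene sketch): the determining module selects the target information for the target scene, and the display module renders it in the desktop display area.
```kotlin
// Minimal sketch of the display apparatus 600: a determining module 610 and a display module 620.
// Names and wiring are illustrative assumptions.

class DeterminingModule(private val allApplicationInfo: () -> List<AppInfo>) {
    fun determine(target: Scene): List<AppInfo> =
        targetInformation(allApplicationInfo(), target)   // at least one piece of target information
}

class DisplayModule(private val renderDesktopArea: (List<AppInfo>) -> Unit) {
    fun display(info: List<AppInfo>) = renderDesktopArea(info)
}

class DisplayApparatus(
    private val determining: DeterminingModule,
    private val display: DisplayModule
) {
    fun show(target: Scene) = display.display(determining.determine(target))
}
```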
In some embodiments, the desktop display region includes a first target scene display region and a second target scene display region adjacent to the first target scene display region;
the display module 620 is configured to:
displaying at least one first target information matched with a first target scene in the first target scene display area; wherein, each piece of first target information comprises an interactive indication mark; the interaction indication identifier is used for responding to an operation indication of a user;
and displaying at least one piece of second target information matched with the second target scene in the second target scene display area.
In some embodiments, the first target scene is capable of automatically changing based on current environmental information of the terminal; the second target scene is changeable based on a user selection operation of a plurality of second target scene options; wherein the plurality of second target scene options are displayed in a third display region of the desktop display region.
In some embodiments, the display module 620 is further configured to:
and in response to the user selecting one second target scene from the plurality of second target scenes, synchronously updating at least one first target information displayed in the first target scene display area and at least one second target information displayed in the second target scene display area.
In some embodiments, the at least one second target information comprises at least one of:
interface state information of at least one controlled device application;
interface state information, at the second level or deeper, of at least one application program; the second-level-or-deeper interface state information is not displayed to the user when the application program is not opened.
In some embodiments, the display module 620 is configured to:
determining at least one piece of first target information matched with the first target scene from the plurality of application programs of the terminal based on the user's operation information for the plurality of application programs in a first time period, and determining a display priority of each piece of first target information;
the at least one first target information is displayed in the desktop display area based on a display priority of each of the first target information.
In some embodiments, the first target information is notification information from at least one application and is based on at least one piece of interaction information generated by the first target scene.
The description of the apparatus embodiments above is similar to that of the method embodiments above, with similar advantageous effects as the method embodiments. In some embodiments, the functions or modules included in the apparatus provided by the embodiments of the present disclosure may be used to perform the methods described in the embodiments of the methods, and for technical details that are not disclosed in the embodiments of the apparatus of the present disclosure, please refer to the description of the embodiments of the methods of the present disclosure for understanding.
It should be noted that, in the embodiments of the present application, if the display method is implemented in the form of software functional modules and sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application, in essence, or the part that contributes to the related art, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, or an optical disc. Thus, the embodiments of the application are not limited to any specific hardware, software, or firmware, or to any combination of hardware, software, and firmware.
The embodiment of the application provides a terminal, which comprises a memory and a processor, wherein the memory stores a computer program capable of running on the processor, and the processor realizes part or all of the steps in the method when executing the program.
Embodiments of the present application provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements some or all of the steps of the above method. The computer-readable storage medium may be transitory or non-transitory.
Embodiments of the present application provide a computer program comprising computer-readable code which, when run on a computer device, causes a processor in the computer device to perform some or all of the steps of the above method.
Embodiments of the present application provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program which, when read and executed by a computer, performs some or all of the steps of the above method. The computer program product may be implemented by hardware, software, or a combination thereof. In some embodiments, the computer program product is embodied as a computer storage medium; in other embodiments, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK).
It should be noted here that the above description of the various embodiments tends to emphasize the differences between them; for identical or similar features, the embodiments may be referred to one another. The above description of the apparatus, storage medium, computer program, and computer program product embodiments is similar to the description of the method embodiments, with advantageous effects similar to those of the method embodiments. For technical details not disclosed in the embodiments of the apparatus, the storage medium, the computer program, and the computer program product of the present application, refer to the description of the method embodiments of the present application.
It should be noted that Fig. 7 is a schematic diagram of a hardware entity of a terminal in an embodiment of the present application. As shown in Fig. 7, the hardware entity of the terminal 700 includes a processor 701, a communication interface 702, and a memory 703, wherein:
the processor 701 generally controls the overall operation of the terminal 700.
The communication interface 702 enables the terminal 700 to communicate with other terminals or servers over a network.
The memory 703 is configured to store instructions and applications executable by the processor 701, and may also cache data (e.g., image data, audio data, voice communication data, and video communication data) to be processed or already processed by the processor 701 and the modules in the terminal 700; it may be implemented by a flash memory (FLASH) or a random access memory (RAM). Data is transferred among the processor 701, the communication interface 702, and the memory 703 via a bus 704.
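As a loose illustration only, the hardware entity described above could be modeled as in the sketch below; Processor, CommunicationInterface, Memory, and Terminal700 are placeholder types introduced for the example and are not part of this application.

```kotlin
// Hypothetical sketch of the hardware entity of terminal 700:
// a processor, a communication interface, and a memory wired together.
class Processor { fun control(task: String) = println("processing: $task") }
class CommunicationInterface { fun send(payload: String) = println("sending over network: $payload") }
class Memory {
    private val store = mutableMapOf<String, ByteArray>()
    fun cacheData(key: String, data: ByteArray) { store[key] = data }  // e.g. image or audio data awaiting processing
    fun read(key: String): ByteArray? = store[key]
}

// The bus is modeled implicitly: the terminal simply holds the three components.
class Terminal700(
    val processor: Processor = Processor(),
    val communicationInterface: CommunicationInterface = CommunicationInterface(),
    val memory: Memory = Memory()
)
```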
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should also be understood that, in the various embodiments of the present application, the sequence numbers of the above steps/processes do not imply an order of execution; the execution order of each step/process should be determined by its function and internal logic, and the sequence numbers should not constitute any limitation on the implementation process of the embodiments of the present application. The foregoing embodiment numbers of the present application are merely for description and do not represent the relative merits of the embodiments.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only one kind of logical function division, and there may be other divisions in actual implementation, for example: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical, or in other forms.
The units described above as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may all be integrated in one processing unit, or each unit may serve as a single unit separately, or two or more units may be integrated in one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will appreciate that all or part of the steps for implementing the above method embodiments may be completed by hardware related to program instructions. The foregoing program may be stored in a computer-readable storage medium, and when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a read-only memory (ROM), a magnetic disk, or an optical disc.
Alternatively, if the above integrated unit of the present application is implemented in the form of a software functional module and sold or used as a standalone product, it may also be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present application that essentially contributes to the related art may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a ROM, a magnetic disk, or an optical disc.
The foregoing is merely an embodiment of the present application, but the protection scope of the present application is not limited thereto; any change or substitution that a person skilled in the art could readily conceive within the technical scope disclosed by the present application shall fall within the protection scope of the present application.

Claims (10)

1. A display method, applied to a terminal, wherein the method comprises:
determining at least one piece of target information matching a target scene from a plurality of application programs of the terminal; wherein the target information comprises at least one of: notification information, interface state information, or content page information of an application program;
and displaying the at least one piece of target information in a desktop display area of the terminal.
2. The method of claim 1, wherein the desktop display area comprises a first target scene display area and a second target scene display area adjacent to the first target scene display area;
wherein displaying the at least one piece of target information in the desktop display area of the terminal comprises:
displaying at least one piece of first target information matching a first target scene in the first target scene display area, wherein each piece of first target information comprises an interaction indication identifier, and the interaction indication identifier is used for responding to an operation indication of a user; and
displaying at least one piece of second target information matching a second target scene in the second target scene display area.
3. The method of claim 2, wherein:
the first target scene is automatically changeable based on current environment information of the terminal; and
the second target scene is changeable based on a selection operation of the user on a plurality of second target scene options, wherein the plurality of second target scene options are displayed in a third display area of the desktop display area.
4. The method according to claim 3, further comprising:
synchronously updating, in response to the user selecting one second target scene from the plurality of second target scenes, the at least one piece of first target information displayed in the first target scene display area and the at least one piece of second target information displayed in the second target scene display area.
5. The method of claim 2, wherein the at least one piece of second target information comprises at least one of:
interface state information of at least one controlled-device application; and
interface state information at two or more levels of at least one application, wherein the interface state information at two or more levels is not displayed to the user when the application is not open.
6. The method of claim 2, wherein displaying the at least one piece of first target information matching the first target scene in the first target scene display area comprises:
determining at least one piece of first target information matching the first target scene from the plurality of application programs of the terminal based on operation information of the user on the plurality of application programs within a first time period, and determining a display priority of each piece of first target information; and
displaying the at least one piece of first target information in the first target scene display area based on the display priority of each piece of first target information.
7. The method of claim 2, wherein the first target information is notification information from at least one application and at least one piece of interaction information generated based on the target scene.
8. A display device, comprising:
a determining module, configured to determine at least one piece of target information matching a target scene from a plurality of application programs of a terminal, wherein the target information comprises at least one of: notification information, interface state information, or content page information of an application program; and
a display module, configured to display the at least one piece of target information in a desktop display area of the terminal.
9. A terminal, comprising a memory and a processor, wherein the memory stores a computer program executable on the processor, and the processor implements the steps of the method of any one of claims 1 to 7 when executing the program.
10. A computer-readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 7.
CN202311110398.XA 2023-08-30 2023-08-30 Display method, device, terminal and storage medium Pending CN117032522A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311110398.XA CN117032522A (en) 2023-08-30 2023-08-30 Display method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311110398.XA CN117032522A (en) 2023-08-30 2023-08-30 Display method, device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN117032522A true CN117032522A (en) 2023-11-10

Family

ID=88637320

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311110398.XA Pending CN117032522A (en) 2023-08-30 2023-08-30 Display method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN117032522A (en)

Similar Documents

Publication Publication Date Title
US11301107B2 (en) Method and electronic apparatus for displaying information
US9020565B2 (en) Tile space user interface for mobile devices
RU2567503C2 (en) Method and apparatus for providing information history associated with time information
CN107957831B (en) Data processing method, device and processing equipment for displaying interface content
US9143625B2 (en) Selection of wireless devices and service plans
US9477378B2 (en) Method and apparatus for providing a user interface
TWI485614B (en) Portable electronic device and user interface display method
EP3200145A1 (en) Electronic device and information processing method of electronic device
CN105453596A (en) Intelligent SIM selection supporting rich context of input factors
WO2014172054A2 (en) Virtual assistant focused user interfaces
KR20070112986A (en) Method for providing idle screen layer given an visual effect and method of providing idle screen
KR20140105738A (en) Adjusting user interface screen order and composition
TW200419437A (en) Glanceable information system and method
CN101772895A (en) The method of the user interface of Remote configuration portable set
CN106557331B (en) Mobile terminal notification control method and device and mobile terminal
CN103186677A (en) Information display method and information display device
CN112181220A (en) Icon display method, equipment and system
JP2017528858A (en) Ticket information display method, apparatus, program, and recording medium
KR20040104752A (en) Management of interaction opportunity data
WO2018010337A1 (en) Display method and device
JP6721200B1 (en) Program, method and information processing apparatus
US20100262493A1 (en) Adaptive soft key functionality for display devices
CN117032522A (en) Display method, device, terminal and storage medium
CN111510554A (en) Dial switching method, dial switching equipment and storage medium
CN115562556A (en) Interface display method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination