WO2024021691A9 - Display method and electronic device - Google Patents

Display method and electronic device

Info

Publication number
WO2024021691A9
Authority
WO
WIPO (PCT)
Prior art keywords
card
electronic device
user
area
desktop
Prior art date
Application number
PCT/CN2023/088873
Other languages
English (en)
French (fr)
Other versions
WO2024021691A1 (zh)
Inventor
黄丽薇
焦骏婷
Original Assignee
荣耀终端有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司
Publication of WO2024021691A1
Publication of WO2024021691A9

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • the present application relates to the field of terminal technology, and in particular to a display method and an electronic device.
  • the purpose of the present application is to provide a display method and an electronic device.
  • the electronic device can dynamically present application and service cards that the user may use in the current scenario according to the actual needs of the user in the current scenario, which can help the user locate the application he needs to browse more quickly and provide a better user experience.
  • the present application provides a display method, which is applied to an electronic device, and the method comprises: in a first scene, displaying a first desktop, the first desktop including multiple first application icons and at least one first card; recognizing that the first scene changes to a second scene, displaying a second desktop, the second desktop including multiple second application icons and at least one second card, the multiple second application icons are different from the multiple first application icons, and/or the at least one second card is different from the at least one first card.
  • the content displayed in the first desktop is determined by the first scene. That is, the multiple first application icons are applications that the user may need to browse or open in the first scene, and the at least one first card is a card generated by an application that the user may need to browse or open in the first scene. According to the actual needs of the user in the current scene, presenting the application and service cards that the user may use in the current scene can help the user locate the application he needs to browse more quickly and save the user's time.
  • the purpose for which users check their mobile phone screens may be strongly related to factors such as time, place, and the user's personal usage preferences. For example, users generally check their schedules in the morning and open games or sports and health applications in the evening. For another example, users generally open the boarding code at the bus station and the payment code at the store. Therefore, in this embodiment, when the scene changes, the content presented on the desktop of the electronic device will also change with the scene. In this way, the electronic device can dynamically present the applications and service cards that the user may use in the current scene according to the scene, further improving the user experience.
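As a rough illustration of the scene-to-desktop mapping described above, the following Kotlin sketch keeps a per-scene table of application icons and cards and redraws the desktop from it when the recognized scene changes. All names (Scene, DesktopContent, selectDesktop) and the example contents are assumptions for illustration, not part of the disclosure.

```kotlin
// Minimal sketch of the scene-to-desktop mapping described above.
// All names and example contents are illustrative assumptions.
enum class Scene { MORNING_AT_HOME, COMMUTE, EVENING_AT_HOME }

data class DesktopContent(val appIcons: List<String>, val cards: List<String>)

// Each scene maps to the applications and cards the user is likely to need in it.
val desktopByScene = mapOf(
    Scene.MORNING_AT_HOME to DesktopContent(
        appIcons = listOf("Calendar", "News", "Weather"),
        cards = listOf("Schedule card")
    ),
    Scene.COMMUTE to DesktopContent(
        appIcons = listOf("Transit", "Music", "Maps"),
        cards = listOf("Boarding code card")
    ),
    Scene.EVENING_AT_HOME to DesktopContent(
        appIcons = listOf("Video", "Games", "Sports & Health"),
        cards = listOf("Fitness card")
    )
)

// When the recognized scene changes, the desktop is redrawn with the new content.
fun selectDesktop(current: Scene): DesktopContent = desktopByScene.getValue(current)

fun main() {
    println(selectDesktop(Scene.COMMUTE))  // the second desktop differs from the first
}
```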
  • the first desktop includes a first dock area
  • the method further includes: receiving a first operation acting on the first dock area; in response to the first operation, displaying a third desktop, the third desktop including a second dock area, the area of the second dock area being different from the area of the first dock area; when the area of the second dock area is larger than the area of the first dock area, the number of application icons in the second dock area is larger than the number of application icons in the first dock area; when the area of the second dock area is smaller than the area of the first dock area, the number of application icons in the second dock area is smaller than the number of application icons in the first dock area.
  • the at least one first application icon can be divided into two parts, namely, the at least one third application icon and the remaining application icons except the at least one third application.
  • the at least one third application icon is displayed in the dock area.
  • the dock area is generally set at the bottom of the screen of the electronic device, which can accommodate a number of application icons. No matter how the user switches the screen interface, the application icons contained in the dock area on the screen interface will never change and will always be displayed on the current interface. Similarly, when the electronic device changes the scene, even if the electronic device updates the application icons in the interface, it will not update the application icons contained in the dock area.
  • the electronic device will not update the application icons displayed in the first dock area in the at least one first application, and will only update the application icons displayed outside the first dock area in the at least one first application icon.
  • the user can adjust the size of the dock area by, for example, pulling up or pulling down, and the number of programs that can be accommodated in the dock area will increase accordingly. In this way, the user can put more icons of their frequently used applications in the dock area so that the user can directly and quickly start more applications.
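One way to picture the relationship between the dock area's size and the number of icons it can hold is the sketch below. The per-icon footprint, column count and all names are assumed values; the disclosure only requires that a larger dock area holds more icons and a smaller one holds fewer.

```kotlin
// Illustrative sketch: how many icons a dock of a given height can hold,
// assuming one row per fixed icon height. Names and values are assumptions.
data class DockArea(val heightPx: Int, val widthPx: Int)

const val ICON_SIZE_PX = 120   // assumed footprint of one icon cell
const val ICONS_PER_ROW = 5    // assumed number of columns in the dock

fun dockCapacity(dock: DockArea): Int {
    val rows = dock.heightPx / ICON_SIZE_PX
    val cols = minOf(ICONS_PER_ROW, dock.widthPx / ICON_SIZE_PX)
    return rows * cols
}

fun main() {
    val firstDock = DockArea(heightPx = 130, widthPx = 1080)   // holds one row
    val secondDock = DockArea(heightPx = 260, widthPx = 1080)  // pulled up: two rows
    // A larger dock area holds more application icons, and vice versa.
    println(dockCapacity(firstDock))   // 5
    println(dockCapacity(secondDock))  // 10
}
```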
  • the number of application icons located outside the first dock area is N
  • the number of application icons located outside the second dock area is M
  • the electronic device can adaptively add or delete application icons outside the dock area to provide the user with a better visual experience. For example, when the user enlarges the first dock area to obtain the second dock area, the electronic device can reduce the number of application icons that were displayed outside the first dock area in the first desktop, and display only some of those application icons outside the second dock area in the third desktop.
  • the electronic device can also adaptively adjust the size of the application icons outside the dock area, or change the arrangement of the application icons, to provide the user with a better browsing experience.
  • the third desktop includes at least one third card, the at least one third card is all or part of the cards in the at least one first card, and the display modes of the at least one first card and the at least one third card are different.
  • when the electronic device changes the area of the first dock area, it can adaptively change the display mode of the cards in the card area of the scenario-based desktop; the display mode includes the number, shape and arrangement of the cards, so as to provide the user with a better browsing effect.
  • the at least one first card contains 3 cards (referred to as Card 1, Card 2 and Card 3)
  • when the user enlarges the first dock area (the enlarged dock area being the second dock area), then in the third desktop the electronic device can adaptively reduce the area of one or more cards among Card 1, Card 2 and Card 3 (at this time Card 1, Card 2 and Card 3 are the at least one third card).
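The adaptive shrinking of cards when the dock area grows can be sketched as a simple redistribution of the remaining screen space. The sizes and names below are assumptions, not taken from the embodiment.

```kotlin
// Illustrative sketch: when the dock area grows, the space left for the card
// region shrinks, so each card is adaptively reduced. Names are assumptions.
data class CardLayout(val cardHeights: List<Int>)

fun layoutCards(screenHeightPx: Int, dockHeightPx: Int, cardCount: Int): CardLayout {
    val cardRegion = screenHeightPx - dockHeightPx
    // Distribute the remaining vertical space evenly among the cards.
    val perCard = if (cardCount > 0) cardRegion / cardCount else 0
    return CardLayout(List(cardCount) { perCard })
}

fun main() {
    // First desktop: smaller dock, Cards 1-3 are taller.
    println(layoutCards(screenHeightPx = 2400, dockHeightPx = 300, cardCount = 3))
    // Third desktop: enlarged dock, the same three cards are reduced.
    println(layoutCards(screenHeightPx = 2400, dockHeightPx = 600, cardCount = 3))
}
```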
  • the method further includes: receiving a second operation acting on a fourth card, the fourth card being any one of the at least one first card; and deleting the fourth card in response to the second operation.
  • after receiving the second operation acting on the fourth card, the method also includes: displaying a fourth desktop, the fourth desktop including at least one fifth card, and the display mode of the at least one fifth card is different from that of the at least one first card.
  • the electronic device can delete any one of the at least one first card in response to the user's operation on the card. After deletion, the electronic device can adjust the remaining cards among the at least one first card (i.e., the at least one fifth card) and change the display mode of the at least one fifth card, wherein the display mode includes the quantity, shape, and arrangement of the cards, so as to provide the user with a better browsing effect. For example, assuming that in the first desktop the at least one first card contains 3 cards (referred to as card 4, card 5, and card 6), when the user deletes card 4 (at this time, card 4 is the fourth card), then in the fourth desktop the electronic device can adaptively reduce the area of card 5 and card 6 (at this time, card 5 and card 6 are the at least one fifth card).
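A minimal sketch of the deletion flow follows: the fourth card is removed and the remaining cards (the at least one fifth card) are shown with a changed display mode, here a reduced size as in the example above. The concrete heights, names and the re-layout policy are assumptions for illustration.

```kotlin
// Minimal sketch of the deletion flow described above: the fourth card is removed and
// the display mode (here: the size) of the remaining cards is recomputed. The policy of
// showing the survivors at a compact size is an assumption.
data class Card(val name: String, val heightPx: Int)

const val FULL_CARD_HEIGHT_PX = 400      // assumed height when three cards share the area
const val COMPACT_CARD_HEIGHT_PX = 260   // assumed reduced height used after a deletion

fun deleteCard(cards: List<Card>, target: String): List<Card> =
    cards.filter { it.name != target }                        // second operation: delete card 4
         .map { it.copy(heightPx = COMPACT_CARD_HEIGHT_PX) }  // cards 5 and 6 are reduced

fun main() {
    val firstDesktop = listOf(
        Card("card 4", FULL_CARD_HEIGHT_PX),
        Card("card 5", FULL_CARD_HEIGHT_PX),
        Card("card 6", FULL_CARD_HEIGHT_PX)
    )
    // Fourth desktop: card 4 is gone and the remaining cards' display mode has changed.
    println(deleteCard(firstDesktop, "card 4"))
}
```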
  • the method further includes: receiving a third operation acting on the first desktop, and generating a sixth card in response to the third operation.
  • the method also includes: displaying a fifth desktop, the fifth desktop including at least one seventh card, the at least one seventh card including the at least one first card and the sixth card; the display mode of the at least one seventh card is different from that of the at least one first card.
  • the electronic device can add a card (i.e., the sixth card) to the first desktop in response to the user's operation on the first desktop. After the sixth card is added, the sixth card and the at least one first card constitute the at least one seventh card. To ensure the user's browsing effect, the electronic device can adjust the display mode of the cards in the at least one seventh card, which in effect changes the display mode of the at least one first card in the first desktop; the display mode includes the number, shape, and arrangement of the cards, so as to provide the user with a better browsing effect.
  • the at least one first card contains 2 cards (referred to as card 7 and card 8)
  • the electronic device can adaptively reduce the area of card 7 and card 8, and display cards 7-9 together in the fifth desktop (at this time, cards 7-9 are the at least one seventh card).
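The card-adding flow can be pictured as below: after the third operation generates card 9, the arrangement of the card area changes so that cards 7-9 are shown together. The two-column policy and the names are assumptions chosen for illustration.

```kotlin
// Illustrative sketch: adding card 9 changes the arrangement of the card area from one
// full-width column to a two-column grid so that cards 7-9 fit on the fifth desktop.
// The grid policy is an assumption, not part of the disclosure.
fun arrangeCards(cardNames: List<String>): List<List<String>> {
    val perRow = if (cardNames.size <= 2) 1 else 2   // more cards -> denser arrangement
    return cardNames.chunked(perRow)
}

fun main() {
    println(arrangeCards(listOf("card 7", "card 8")))            // [[card 7], [card 8]]
    println(arrangeCards(listOf("card 7", "card 8", "card 9")))  // [[card 7, card 8], [card 9]]
}
```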
  • the method further includes: receiving a fourth operation acting on an eighth card, the eighth card being any one of the at least one first card; and in response to the fourth operation, enlarging or reducing the eighth card.
  • the electronic device can enlarge or reduce the area of any card (i.e., the eighth card) in response to the user's operation on any card among the at least one card.
  • the card can be enlarged when needed so that the user can browse the information of the card more clearly; or the card can be reduced to highlight the information of other, more important cards.
  • after receiving the fourth operation acting on the eighth card, the method also includes: displaying a sixth desktop, the sixth desktop including the at least one first card, and the display mode of the at least one first card in the sixth desktop is different from that of the at least one first card in the first desktop.
  • the electronic device can adaptively adjust the display mode of the other cards among the at least one first card; the display mode includes the quantity, shape, and arrangement of the cards, so as to provide the user with a better browsing effect.
  • for example, assuming that the at least one first card contains card A and card B, when the user enlarges card A (card A being the eighth card), the electronic device can adaptively reduce the area of card B (at this time, card A and card B are the at least one first card in the sixth desktop), or no longer display card B (at this time, card A is the at least one first card in the sixth desktop).
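A possible way to model the fourth operation is sketched below: enlarging card A either shrinks card B or hides it when too little space remains. The 70% enlargement factor, the 200-pixel floor and all names are assumed values, not part of the disclosure.

```kotlin
// Illustrative sketch of the fourth operation: enlarging card A reduces (or hides)
// card B so the sixth desktop still fits on screen. The policy is an assumption.
data class SizedCard(val name: String, val heightPx: Int, val visible: Boolean = true)

fun enlargeCard(cards: List<SizedCard>, target: String, regionHeightPx: Int): List<SizedCard> {
    val enlargedHeight = (regionHeightPx * 0.7).toInt()   // assumed enlarged size
    val remaining = regionHeightPx - enlargedHeight
    val others = cards.filter { it.name != target }
    return cards.map { card ->
        when {
            card.name == target -> card.copy(heightPx = enlargedHeight)
            // Too little space left: no longer display the other card(s).
            remaining / maxOf(others.size, 1) < 200 -> card.copy(heightPx = 0, visible = false)
            // Otherwise adaptively reduce the other card(s).
            else -> card.copy(heightPx = remaining / others.size)
        }
    }
}

fun main() {
    val cards = listOf(SizedCard("card A", 600), SizedCard("card B", 600))
    println(enlargeCard(cards, "card A", regionHeightPx = 1200))
}
```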
  • the identification that the first scene changes to the second scene includes: identifying that the time changes from a first moment to a second moment, the first moment corresponds to the first scene, and the second moment corresponds to the second scene; or, identifying that the location of the electronic device changes from a first location to a second location, the first location corresponds to the first scene, and the second location corresponds to the second scene; or, at a third moment, identifying that the user performs a fifth operation on the electronic device, and at a fourth moment, identifying that the user performs a sixth operation on the electronic device, the fifth operation performed by the user on the electronic device corresponds to the first scene, and the sixth operation performed by the user on the electronic device corresponds to the second scene.
  • the basis for the electronic device to determine the scene change may include but is not limited to the following:
  • the electronic device can update the content of the scenario desktop at different times based on the user's habits, so that the user can find the application that needs to be opened at the current time more quickly.
  • the applications that users need to open are often directly related to the location of the users. For example, users usually open the boarding code at the bus station, open the payment code at the store, open the navigation software on the highway, etc. Therefore, in this embodiment, the electronic device can obtain the location information of the current environment in real time, and update the content of the scenario desktop at different locations based on the location information, so that users can find the applications that need to be opened at the current location more quickly.
  • the electronic device can search for movie theaters based on the user's last visit.
  • the application or card browsed the last time the electronic device was used can be used to update the content of the scenario-based desktop and to recommend cards or applications that meet the user's current usage needs.
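The three bases for recognizing a scene change listed above (time, location, and the user's operation) can be combined roughly as in the following sketch; the thresholds, coordinates and scene names are assumptions for illustration only.

```kotlin
import java.time.LocalTime
import kotlin.math.abs

// Illustrative sketch of recognizing a scene from time, location, or the user's
// recent operation. Thresholds, coordinates and names are assumptions.
data class Location(val latitude: Double, val longitude: Double)
data class UserContext(val time: LocalTime, val location: Location, val lastOpenedApp: String?)

enum class RecognizedScene { MORNING, EVENING, AT_BUS_STATION, MOVIE_NIGHT, DEFAULT }

// Crude coordinate-difference check, good enough for this sketch.
fun nearby(a: Location, b: Location): Boolean =
    maxOf(abs(a.latitude - b.latitude), abs(a.longitude - b.longitude)) < 0.001

fun recognizeScene(ctx: UserContext, busStation: Location): RecognizedScene = when {
    // Location basis: e.g. the user usually opens the boarding code at the bus station.
    nearby(ctx.location, busStation) -> RecognizedScene.AT_BUS_STATION
    // Operation basis: what the user browsed last time can define its own scene.
    ctx.lastOpenedApp == "Cinema" -> RecognizedScene.MOVIE_NIGHT
    // Time basis: schedules in the morning, games / sports & health in the evening.
    ctx.time.isBefore(LocalTime.of(9, 0)) -> RecognizedScene.MORNING
    ctx.time.isAfter(LocalTime.of(19, 0)) -> RecognizedScene.EVENING
    else -> RecognizedScene.DEFAULT
}

fun main() {
    val station = Location(39.9042, 116.4074)
    val ctx = UserContext(LocalTime.of(8, 0), Location(39.9042, 116.4074), lastOpenedApp = null)
    println(recognizeScene(ctx, station))  // AT_BUS_STATION
}
```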
  • an embodiment of the present application provides an electronic device, comprising: one or more processors and a memory; the memory is coupled to the one or more processors, the memory is used to store computer program code, the computer program code includes computer instructions, and the one or more processors call the computer instructions to enable the electronic device to execute the method in the first aspect or any possible implementation of the first aspect.
  • a chip system which is applied to an electronic device, and the chip system includes one or more processors, and the processors are used to call computer instructions so that the electronic device executes a method as in the first aspect or any possible implementation of the first aspect.
  • a computer-readable storage medium comprising instructions, which, when executed on an electronic device, enable the electronic device to execute the method according to the first aspect or any possible implementation manner of the first aspect.
  • beneficial effects of the technical solutions provided in the second to fourth aspects of the present application can refer to the beneficial effects of the technical solutions provided in the first aspect, and will not be repeated here.
  • FIG. 1 is a schematic diagram of a desktop of an electronic device provided in an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a desktop of an electronic device provided in an embodiment of the present application.
  • FIG. 3 is a schematic diagram of the structure of an electronic device 100 provided in an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a scenario-based desktop provided in an embodiment of the present application.
  • FIG. 5 is a diagram of a user interface for desktop selection provided in an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a process of setting a scenario-based desktop according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a process for adjusting a dock area provided in an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a process of scaling a card in a scenario-based desktop provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of a process of deleting a card in a scenario-based desktop provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of a process of adding a card to a scenario-based desktop provided in an embodiment of the present application.
  • FIG. 11 is a schematic diagram of a process of updating desktop content in a scenario-based desktop according to an embodiment of the present application.
  • FIG. 12 is a schematic diagram of a process of updating desktop content in a scenario-based desktop according to an embodiment of the present application.
  • FIG. 13 is a schematic diagram of a process of updating desktop content in a scenario-based desktop according to an embodiment of the present application.
  • FIG. 14 is a flow chart of a display method provided in an embodiment of the present application.
  • the dock area can also be called the dock bar (short for "dockbar"). It is a functional area in a graphical user interface used for starting and switching between running applications.
  • the dock area is generally set below the screen of the electronic device, which can accommodate several application icons. For most electronic devices, no matter how the user switches the screen interface, the size of the dock area and the applications it contains on the screen interface will never change, and will always be displayed on the current interface. Therefore, the dock area can facilitate users to quickly start the application corresponding to the application icon therein. Of course, for some electronic devices that can display the negative one screen, when the user switches the interface of the electronic device to the negative one screen, the dock area may not be displayed on the negative one screen.
  • the user can long press any program icon and drag it into the dock area.
  • the size of the dock area is not adjustable, so the number of application icons in the dock area is also limited, generally not more than 5.
  • the user can adjust the size of the dock area by, for example, pulling up or pulling down, and the dock area can accommodate more programs, so that the user can directly and quickly start more applications. For details, please refer to the subsequent embodiments, which will not be repeated here.
  • Cards are a new form of presenting mobile application page content.
  • the content of an application page can be placed on a card.
  • Users can operate the card directly to reach the corresponding application experience, so that services are reached directly and the number of interaction levels is reduced.
  • Cards can be placed anywhere, and users can customize their own desktop style.
  • Cards are often embedded in other applications as part of their interface (atomic services can also be used to save applications to the service center, which does not require installing the applications), and support basic interactive functions such as pulling up pages and sending messages. Card users and card providers are not required to be resident. When a card is added, deleted, or requested to be updated, the card management service pulls up the card provider to obtain the card information.
  • the card function has three features: 1. Easy to use and visible, that is, the card can expose content that highlights the service information, reducing user experience problems caused by jumping between levels. 2. Intelligent and optional, that is, the card can display data information that changes throughout the day, supports custom types of service card design, and users can set the style of the card themselves. 3. Multi-terminal adaptable, that is, the card has adaptive properties and can adapt to multiple terminal devices; mobile phones, bracelets, and tablets all support service card attributes.
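The interaction between card users, the card management service and card providers described above can be sketched as follows. The interfaces are illustrative assumptions and do not correspond to any actual framework API; the point is only that providers are not resident and are pulled up on add or update requests.

```kotlin
// Illustrative sketch: the card manager pulls up the provider on demand to obtain the
// card information when a card is added or an update is requested. Names are assumptions.
interface CardProvider {
    fun provideCardContent(cardId: String): String
}

class WeatherCardProvider : CardProvider {
    override fun provideCardContent(cardId: String) = "Cloudy to sunny, 23°C"
}

class CardManagerService(private val providers: Map<String, CardProvider>) {
    private val cards = mutableMapOf<String, String>()

    fun addCard(cardId: String, providerName: String) {
        val provider = providers.getValue(providerName)   // pull up the provider on demand
        cards[cardId] = provider.provideCardContent(cardId)
    }

    fun requestUpdate(cardId: String, providerName: String) = addCard(cardId, providerName)

    fun removeCard(cardId: String) { cards.remove(cardId) }

    fun show() = println(cards)
}

fun main() {
    val manager = CardManagerService(mapOf("weather" to WeatherCardProvider()))
    manager.addCard("weather-card-1", "weather")
    manager.show()
}
```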
  • FIG. 1 and FIG. 2 respectively show two desktop presentation forms commonly used in current electronic devices.
  • FIG. 1 is a schematic diagram of a user interface provided in an embodiment of the present application.
  • FIG. 1(A) exemplarily shows a main screen interface 10 for presenting an application menu on a current electronic device.
  • the main screen interface 10 includes a status bar 101, a calendar and weather widget 102, an application area 103 and a dock area 104, wherein:
  • the status bar 101 may include the name of the operator (e.g., China Mobile), the time, a Wi-Fi icon, the signal strength, and the current remaining power.
  • the calendar and weather widget 102 can be used to indicate the current time, such as the date, day of the week, and hour and minute information; it can also be used to indicate the weather type, such as cloudy to sunny or light rain, as well as information such as the temperature and the location.
  • the application area 103 may include, for example, an icon of "memo", an icon of "recorder", an icon of "switch clone", an icon of "calculator", an icon of "gallery", an icon of "ruler", an icon of "remote control", an icon of "contacts", etc. It may also include icons of other applications, which are not limited in the embodiments of the present application. Any icon of an application may be used to respond to a user operation, such as a touch operation, so that the electronic device starts the application corresponding to the icon.
  • the dock area 104 may include, for example, a "phone" icon, a "text message" icon, a "browser" icon, and a "camera" icon, and may also include icons of other applications, which are not limited in the present embodiment. Any application icon may be used to respond to a user operation, such as a touch operation, so that the electronic device starts the application corresponding to the icon.
  • the interface provided in the embodiment of the present application and subsequent embodiments may also include a navigation bar, which may include system navigation keys such as a return button, a home screen button, and an outgoing task history button.
  • when the return button is touched, the electronic device may display the previous user interface of the current user interface.
  • when the home screen button is touched, the electronic device may display the home screen interface (such as the main screen interface 10) set by the user for the electronic device.
  • when the outgoing task history button is touched, the electronic device may display the tasks that the user has recently opened.
  • each navigation key may also have another name; for example, the above-mentioned return button may be called the Back button, the above-mentioned home screen button may be called the Home button, and the above-mentioned outgoing task history button may be called the Menu button; this application does not limit this.
  • the navigation keys in the above-mentioned navigation bar are not limited to virtual buttons, and may also be implemented as physical buttons.
  • FIG2 is a schematic diagram of another application interface provided in an embodiment of the present application.
  • FIG2(A) exemplarily shows a main screen interface 20 for presenting an application menu on a current electronic device.
  • the main screen interface 20 includes a status bar 201, a calendar weather widget 202, an application area 203, and a dock area 204, wherein:
  • the application area 203 shown in (A) of FIG. 2 may include an application folder 2031 and an application folder 2032 .
  • Application folder 2031 may contain, for example, a (Kugou) icon 2031A, a (Tik Tok) icon, a (Wechat) icon, etc., and may also include icons of other applications, which are not limited in the present embodiment.
  • the icons in the application folder 2031 may be displayed in the form of cards, but the shape of the card is fixed. For example, through the (Kugou) icon 2031A shown in (A) of FIG. 2, the user can directly obtain the relevant information of the application corresponding to the icon from the current interface.
  • the icons in the application folder 2031 can be updated according to the user's use of applications in the electronic device; for example, when the user uses the application "QQ" in the electronic device, the (Wechat) icon in the application folder 2031 can be updated to a (Tencent QQ) icon.
  • the application folder 2031 may include icons of multiple applications such as a "memo" icon, a "recorder” icon, and a “calendar” icon. These application icons may be presented in the application folder 2031 as small icons, and the electronic device may display the application icons in the application folder 2031 as large icons in response to a touch operation on the application folder 2031, such as a click operation; after these small icons are displayed as large icons, any application icon may be used to respond to a user operation, such as a touch operation, so that the electronic device starts the application corresponding to the icon.
  • for the same electronic device, the dock area is generally set at the bottom of the screen and can accommodate several application icons. No matter how the user switches the screen interface, the size of the dock area on the screen interface and the applications it contains will never change and will always be displayed on the current interface.
  • the electronic device can respond to the user's operation on the main screen interface 10, such as the left sliding operation shown in FIG. 1 (A), and display the user interface 11.
  • the user interface 11 may include an application area 113 and a dock area 114. Among them:
  • Application area 113 may include, for example, a (Tencent QQ) icon, a (Wechat) icon and a (Sina Weibo) icon, and may also include icons of other applications, which are not limited in the present application.
  • Any application icon may be used to respond to a user operation, such as a touch operation, so that the electronic device starts the application corresponding to the icon.
  • the dock area 114 includes, for example, an icon for "phone”, an icon for "text messages", an icon for "browser”, and an icon for "camera”.
  • the size of the dock area (i.e., dock area 104 and dock area 114) in the two interfaces and the applications contained therein remain unchanged.
  • the size of the dock area and the applications contained therein on the screen interface are fixed and unchanged, and the dock area is always displayed on the interface currently displayed by the electronic device.
  • the user can long press any program icon and drag it into the dock area.
  • the size of the dock area is not adjustable, so the number of applications in the dock area is also limited, for example, the number of applications in the dock area generally cannot exceed 5.
  • the user can drag the icon 2033 of the application "application market" into the dock area 204 by long pressing it.
  • the dock area 204 can finally include the icon of "phone”, the icon of "text message”, the icon of "browser”, the icon of "camera” and the icon of the application market.
  • the number of applications that can be accommodated in the dock area 204 is saturated, and the user cannot subsequently drag the icons of other applications outside the dock area 204 into the dock area 204 without moving the icons in the dock area 204 out.
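The saturation behavior of a fixed-size dock described above can be modeled as a simple capacity check when an icon is dragged in; the capacity of 5 follows the text, while the class and method names are assumptions.

```kotlin
// Illustrative sketch of the saturation behavior: once the dock holds its maximum
// number of icons, dragging another icon in is rejected until one is moved out.
class FixedDock(private val capacity: Int = 5) {
    private val icons = mutableListOf<String>()

    fun dragIn(icon: String): Boolean {
        if (icons.size >= capacity) return false   // saturated: the drag is ignored
        icons.add(icon)
        return true
    }

    fun moveOut(icon: String) = icons.remove(icon)

    fun contents() = icons.toList()
}

fun main() {
    val dock = FixedDock()
    listOf("Phone", "Messages", "Browser", "Camera", "AppGallery").forEach { dock.dragIn(it) }
    println(dock.dragIn("Calculator"))  // false: dock 204 is already full
    println(dock.contents())
}
```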
  • the user interfaces shown in Figures 1 and 2 are relatively simple in form, and the applications or cards displayed in each interface are basically unchanged. This means that when the user is not familiar with the interface content, especially when the electronic device has a large number of applications, the user needs to waste a lot of time browsing multiple interfaces before finding the application he wants to open or browse.
  • users due to the characteristics of the dock, users usually put applications that they need to use frequently into the dock area so that they can quickly open these applications at any time.
  • the number of applications that can be accommodated in the dock area in the user interfaces shown in Figures 1 and 2 is limited. When the number of the user's frequently used applications exceeds that limit, the dock area is too small to accommodate them all. A dock area whose size cannot be changed cannot meet the needs of users, which undoubtedly wastes the user's time and reduces the user experience.
  • the present application provides an electronic device that can display a scenario-based desktop, which dynamically presents the applications or service cards that the user may use in the current scenario according to the user's needs; moreover, the size of the dock area in the above-mentioned scenario-based desktop is adjustable, which effectively reduces the time users spend looking for the applications they need to open, thereby improving the user experience.
  • the electronic device may be a mobile phone, a tablet computer, a wearable device, a car-mounted device, an augmented reality (AR)/virtual reality (VR) device, a laptop computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA) or a dedicated camera (such as a SLR camera, a card camera), etc.
  • FIG. 3 exemplarily shows the structure of the electronic device.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and a subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine some components, or split some components, or arrange the components differently.
  • the components shown in the figure may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processor (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • Different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100.
  • the controller may generate an operation control signal according to the instruction operation code and the timing signal to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 may be a high-speed cache memory.
  • the memory can store instructions or data that have just been used or are used cyclically by the processor 110. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated access, reduces the waiting time of the processor 110, and improves the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple groups of I2C buses.
  • the processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface, thereby realizing the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 can include multiple I2S buses.
  • the processor 110 can be coupled to the audio module 170 via the I2S bus to achieve communication between the processor 110 and the audio module 170.
  • the audio module 170 can transmit an audio signal to the wireless communication module 160 via the I2S interface to achieve the function of answering a call through a Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding analog signals.
  • the audio module 170 and the wireless communication module 160 can be coupled via a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 via the PCM interface to realize the function of answering calls via a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit an audio signal to the wireless communication module 160 through the UART interface to implement the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193.
  • the MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), etc.
  • the processor 110 and the camera 193 communicate via the CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 194 communicate via the DSI interface to implement the display function of the electronic device 100.
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, etc.
  • the GPIO interface can also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and may be, for example, a Mini USB interface, a Micro USB interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, or to transfer data between the electronic device 100 and a peripheral device. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration and does not constitute a structural limitation on the electronic device 100.
  • the electronic device 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from a wired charger through the USB interface 130.
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. While the charging management module 140 is charging the battery 142, it may also power the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle number, battery health status (leakage, impedance), etc.
  • the power management module 141 can also be set in the processor 110.
  • the power management module 141 and the charging management module 140 can also be set in the same device.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve the utilization of antennas.
  • antenna 1 can be reused as a diversity antenna for a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide solutions for wireless communications including 2G/3G/4G/5G, etc., applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), etc.
  • the mobile communication module 150 may receive electromagnetic waves from the antenna 1, and perform filtering, amplification, and other processing on the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 may also amplify the signal modulated by the modulation and demodulation processor, and convert it into electromagnetic waves for radiation through the antenna 1.
  • at least some of the functional modules of the mobile communication module 150 may be arranged in the processor 110.
  • at least some of the functional modules of the mobile communication module 150 may be arranged in the same device as at least some of the modules of the processor 110.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 170A, a receiver 170B, etc.), or displays an image or video through a display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide the electronic device 100 with wireless communication solutions including wireless local area network (WLAN) and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the frequency of the electromagnetic wave signal, performs filtering processing, and sends the processed signal to the processor 110.
  • the wireless communication module 160 may also receive a signal to be sent from the processor 110, modulate the frequency of the signal, amplify the signal, and convert it into an electromagnetic wave for radiation through the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS) and/or a satellite based augmentation system (SBAS).
  • the electronic device 100 implements the display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, which connects the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, etc.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through ISP, camera 193, video codec, GPU, display screen 194 and application processor.
  • ISP is used to process the data fed back by camera 193. For example, when taking a photo, the shutter is opened, and the light is transmitted to the camera photosensitive element through the lens. The light signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to ISP for processing and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, ISP can be set in camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object is projected onto the photosensitive element through the lens to generate an optical image.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, which is then transmitted to the ISP for conversion into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts the digital image signal into an image signal in a standard RGB, YUV or other format.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the digital signal processor is used to process digital signals, and can process not only digital image signals but also other digital signals. For example, when the electronic device 100 is selecting a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital videos.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record videos in a variety of coding formats, such as Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural network (NN) computing processor.
  • NPU can realize applications such as intelligent cognition of electronic device 100, such as image recognition, face recognition, voice recognition, text understanding, etc.
  • NPU can also realize the decision model provided in the embodiment of this application.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and videos can be stored in the external memory card.
  • the internal memory 121 can be used to store computer executable program codes, which include instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by running the instructions stored in the internal memory 121.
  • the internal memory 121 may include a program storage area and a data storage area.
  • the program storage area may store an operating system, applications required for at least one function (such as a sound playback function, an image playback function, etc.), etc.
  • the data storage area may store data created during the use of the electronic device 100 (such as audio data, a phone book, etc.), etc.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one disk storage device, a flash memory device, a universal flash storage (UFS), etc.
  • the electronic device 100 can implement audio functions such as music playing and recording through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone jack 170D, and the application processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 can be arranged in the processor 110, or some functional modules of the audio module 170 can be arranged in the processor 110.
  • the speaker 170A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal.
  • the electronic device 100 can listen to music or listen to a hands-free call through the speaker 170A.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be received by placing the receiver 170B close to the human ear.
  • Microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with their mouth close to microphone 170C to input the sound signal into microphone 170C.
  • the electronic device 100 can be provided with at least one microphone 170C. In other embodiments, the electronic device 100 can be provided with two microphones 170C, which can not only collect sound signals but also realize noise reduction function. In other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify the sound source, realize directional recording function, etc.
  • the earphone interface 170D is used to connect a wired earphone.
  • the earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A can be set on the display screen 194.
  • the capacitive pressure sensor can be a parallel plate including at least two conductive materials.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the touch operation intensity according to the pressure sensor 180A.
  • the electronic device 100 can also calculate the touch position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities can correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
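The pressure-dependent dispatch in this example can be sketched as a single threshold comparison; the concrete threshold value is an assumption, since the text only names a "first pressure threshold".

```kotlin
// Illustrative sketch of the example above: the same touch position on the short message
// icon triggers different instructions depending on touch pressure. Threshold is assumed.
const val FIRST_PRESSURE_THRESHOLD = 0.6  // assumed, normalized pressure

fun onShortMessageIconTouched(pressure: Double): String =
    if (pressure < FIRST_PRESSURE_THRESHOLD) {
        "View the short message"        // lighter press
    } else {
        "Create a new short message"    // press at or above the first pressure threshold
    }

fun main() {
    println(onShortMessageIconTouched(0.3))
    println(onShortMessageIconTouched(0.8))
}
```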
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100.
  • in some embodiments, the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for anti-shake shooting. For example, when the shutter is pressed, the gyro sensor 180B detects the angle of the electronic device 100 shaking, calculates the distance that the lens module needs to compensate based on the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can use the magnetic sensor 180D to detect the opening and closing of the flip leather case.
  • the electronic device 100 when the electronic device 100 is a flip phone, the electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D. Then, according to the detected opening and closing state of the leather case or the opening and closing state of the flip cover, the flip cover can be automatically unlocked.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in all directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of the electronic device and is applied to applications such as horizontal and vertical screen switching and pedometers.
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure the distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light outward through the light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 can determine that there is no object near the electronic device 100.
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in leather case mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the sensed ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking photos.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to achieve fingerprint unlocking, access application locks, fingerprint photography, fingerprint call answering, etc.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 due to low temperature. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
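  • As a minimal, purely illustrative sketch of such a temperature processing strategy (the threshold values and action names below are assumptions, not values used by the electronic device 100):

```kotlin
// Hypothetical thermal policy: throttle when hot; warm the battery and raise its
// output voltage when cold, to avoid abnormal shutdown. All values are assumptions.
enum class ThermalAction { REDUCE_PERFORMANCE, HEAT_BATTERY, BOOST_BATTERY_VOLTAGE, NONE }

fun thermalPolicy(
    tempCelsius: Float,
    highThreshold: Float = 45f,  // assumed upper threshold
    lowThreshold: Float = 0f,    // assumed lower threshold
): List<ThermalAction> = when {
    tempCelsius > highThreshold -> listOf(ThermalAction.REDUCE_PERFORMANCE)
    tempCelsius < lowThreshold  -> listOf(ThermalAction.HEAT_BATTERY, ThermalAction.BOOST_BATTERY_VOLTAGE)
    else                        -> listOf(ThermalAction.NONE)
}

fun main() {
    println(thermalPolicy(50f))  // [REDUCE_PERFORMANCE]
    println(thermalPolicy(-5f))  // [HEAT_BATTERY, BOOST_BATTERY_VOLTAGE]
}
```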
  • the touch sensor 180K is also called a "touch panel”.
  • the touch sensor 180K can be set on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touch screen”.
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K can also be set on the surface of the electronic device 100, which is different from the position of the display screen 194.
  • the bone conduction sensor 180M can obtain a vibration signal. In some embodiments, the bone conduction sensor 180M can obtain a vibration signal of a vibrating bone block of the vocal part of the human body. The bone conduction sensor 180M can also contact the human pulse to receive a blood pressure beat signal. In some embodiments, the bone conduction sensor 180M can also be set in an earphone and combined into a bone conduction earphone.
  • the audio module 170 can parse out a voice signal based on the vibration signal of the vibrating bone block of the vocal part obtained by the bone conduction sensor 180M to realize a voice function.
  • the application processor can parse the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M to realize a heart rate detection function.
  • the key 190 includes a power key, a volume key, etc.
  • the key 190 may be a mechanical key or a touch key.
  • the electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
  • Motor 191 can generate vibration prompts.
  • Motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • touch operations acting on different areas of the display screen 194 can also correspond to different vibration feedback effects.
  • Touch operations in different application scenarios (for example: time reminders, receiving messages, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power changes, messages, missed calls, notifications, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be connected to and separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195.
  • the electronic device 100 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 can also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communications.
  • the electronic device 100 uses an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the processor 110 can generate program instructions in combination with time information to change the display information of the display screen 194.
  • the processor 110 can also generate program instructions in combination with the geographic location information obtained by the wireless communication module 160 to change the display information of the display screen.
  • the memory in the processor 110 can also be used to store the user's historical operation information on the electronic device 100, which can include the number of views, access frequency, access time, etc. of the application in the user's electronic device, and can also include information of other dimensions, which is not limited in this application; the processor 110 can analyze the above historical operation information and generate program instructions to change the display information of the display screen.
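  • A minimal sketch of the kind of historical operation information described above (per-application view count, access times, and a derived access frequency) is given below; the data model and the ranking rule are assumptions for illustration and do not describe the actual storage format used by the processor 110.

```kotlin
import java.time.Duration
import java.time.LocalDateTime

// Hypothetical data model for per-application usage history; field names are assumptions.
data class UsageRecord(
    val appName: String,
    var viewCount: Int = 0,
    val accessTimes: MutableList<LocalDateTime> = mutableListOf(),
) {
    // Accesses per day over the recorded window: one simple notion of "access frequency".
    fun accessFrequencyPerDay(now: LocalDateTime): Double {
        if (accessTimes.isEmpty()) return 0.0
        val days = Duration.between(accessTimes.first(), now).toDays().coerceAtLeast(1L)
        return accessTimes.size.toDouble() / days
    }
}

class UsageHistory {
    private val records = mutableMapOf<String, UsageRecord>()

    fun recordAccess(appName: String, at: LocalDateTime) {
        val record = records.getOrPut(appName) { UsageRecord(appName) }
        record.viewCount += 1
        record.accessTimes += at
    }

    // Rank applications by access frequency; such a ranking could feed the desktop recommendation.
    fun topApps(now: LocalDateTime, n: Int): List<String> =
        records.values
            .sortedByDescending { it.accessFrequencyPerDay(now) }
            .take(n)
            .map { it.appName }
}

fun main() {
    val history = UsageHistory()
    val now = LocalDateTime.now()
    history.recordAccess("Music", now.minusDays(2))
    history.recordAccess("Music", now.minusDays(1))
    history.recordAccess("Gallery", now.minusHours(3))
    println(history.topApps(now, 2))
}
```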
  • FIG4 is a schematic diagram of a scenario-based desktop provided in an embodiment of the present application.
  • the user interface 40 (in some embodiments of the present application, the user interface 40 may be referred to as the first desktop) may provide the user with application or service cards that may be used in the current scenario in the form of cards plus application icons.
  • the user interface 40 may include a card area 401, an application area 402, and a dock area 403. Among them:
  • Card area 401 may include card 4011, card 4012, and card 4013, which may display page contents in their corresponding applications.
  • card 4011 is a card generated by application “Video”, which may be used to display content that a user has recently watched in application “Video”
  • card 4012 is a card generated by application “Gallery”, which may be used to display pictures taken by a user and stored in application “Gallery”
  • card 4013 is a card generated by application "Game", which may be used to display content recommended by application "Game" to the user.
  • Each card may respond to user operations, causing electronic device 100 to open the application corresponding to the card and display the corresponding application page.
  • the cards in the card area 401 can also respond to user operations and implement certain functions provided by the application corresponding to the card without opening the application corresponding to the card in the electronic device 100.
  • card 4012 can update its image displayed in card 4012 in response to the user's click operation on update control 4012A.
  • card 4012 can share the image in card 4012 with others in response to the user's long press operation.
  • the electronic device 100 can respond to the user's operation on any card in the card area 401, such as a two-finger zoom operation, to adjust the shape or size of the corresponding card.
  • the electronic device 100 can simultaneously adjust the size or shape of other cards in the card area 401.
  • the electronic device 100 can delete the cards in the card area 401 or add cards corresponding to other applications to the card area 401 in response to the user's operation.
  • the electronic device 100 can adjust the size or shape of the cards in the card area 401 accordingly to provide a better visual effect for the user.
  • For details, please refer to the relevant description of the subsequent embodiments, which will not be repeated here.
  • the application area 402 may include, for example, an icon of "memo”, an icon of "recorder”, an icon of "switch clone", an icon of "calculator”, an icon of "gallery”, an icon of "ruler”, an icon of "remote control”, an icon of "contacts”, etc. It may also include icons of other applications, which are not limited in the embodiments of the present application. Any icon of an application may be used to respond to a user operation, such as a touch operation, so that the electronic device starts the application corresponding to the icon.
  • the dock area 403 may include, for example, a "phone” icon, a "text message” icon, a “browser” icon, and a "camera” icon.
  • a "phone” icon When the user switches the interface displayed by the electronic device 100 from the user interface 40 to another interface, The dock area 403 and the icons of the applications contained therein may always be displayed on the interface currently displayed by the electronic device 100 .
  • the size of the dock area 403 is adjustable.
  • the dock area 403 may include an adjustment control 403A, and the electronic device 100 may increase the area of the dock area 403 in response to the user's operation on the adjustment control 403A, such as a pull-up operation, and accordingly, the number of applications included in the dock area may also be increased.
  • the electronic device 100 can automatically update the cards in the card area 401 and the icons in the application area 402 according to time changes, location changes, the context of events, and user habits. It should be understood that the update mentioned here is not limited to the change of the shape and content of the card, but can also include updating the card corresponding to a certain application in the card area 401 to the card corresponding to other applications. For details, please refer to the relevant description of the subsequent embodiments.
  • the size and position relationship of the card area 401, the application area 402 and the dock area 403 in the user interface 40 can also be expressed in other forms.
  • the user interface 40 only exemplifies the scenario-based desktop provided by the embodiment of the present application and should not constitute a limitation on the embodiment of the present application.
  • FIG 5 shows a process in which the electronic device 100 sets the desktop to the scenario-based desktop provided in the present application (eg, the user interface 40 in the aforementioned description) in response to user operations when the electronic device 100 is used for the first time after leaving the factory.
  • the user interface 50 may be an interface displayed by the electronic device 100 after the user turns on the electronic device 100 for the first time after the electronic device 100 leaves the factory.
  • the user interface 50 may include an interface thumbnail 501, an interface thumbnail 502, and an "enter new desktop" control 503.
  • the interface thumbnail 501 is a thumbnail of the scenario desktop provided by the present application
  • the interface thumbnail 502 is a thumbnail of the ordinary desktop.
  • a selection control is provided below the two thumbnails, namely, a selection control 501A and a selection control 502A. Any selection control can respond to the user's click operation to enable the electronic device 100 to set the desktop displayed by its corresponding interface thumbnail as the main screen interface of the electronic device 100.
  • the electronic device 100 can respond to the user's click operation on the selection control 501A in FIG5 , and after checking the interface thumbnail 501, the electronic device 100 can respond to the user's click operation on the "enter new desktop" control 503, and display the scenario desktop corresponding to the interface thumbnail 501 (such as the user interface 40 in the aforementioned description).
  • FIG6 illustrates a process in which the electronic device 100, in response to a user operation, sets the main screen interface from a normal desktop to a scenario-based desktop provided by the present application (eg, the user interface 40 in the aforementioned description).
  • the user interface 60 shown in (A) of FIG6 is the main screen interface of the electronic device 100. Combined with the above descriptions of FIG1 and FIG2 , it can be seen that the user interface 60 is a traditional common desktop.
  • the specific functions of the contents included in the user interface 60 can refer to the above descriptions of the main screen interface 10 or the main screen interface 20, which will not be repeated here.
  • the electronic device 100 may display a user interface 61 as shown in (B) of FIG6 in response to a user operation on the user interface 60, such as a two-finger zoom operation as shown in (A) of FIG6 .
  • the user interface 61 may include: an interface thumbnail 611, an interface thumbnail 612, an interface thumbnail 613, an interface thumbnail 614, and a function option area 615. Among them:
  • the interface thumbnail 611 is a thumbnail of the scenario desktop provided by the present application, and the interface thumbnails 612-614 are thumbnails of the ordinary desktop.
  • the interface thumbnails 613-614 are not fully displayed due to the screen area limitation of the electronic device 100; the user can fully display the interface thumbnails 613-614 on the screen by a pull-down operation.
  • Function option area 615 may include wallpaper option 615A, desktop card option 615B, switching effect option 615C and desktop setting option 615D.
  • wallpaper option 615A can be used to set the desktop wallpaper of electronic device 100
  • desktop card option 615B can be used to add a card corresponding to an application to the desktop of electronic device 100
  • switching effect option 615C can set the switching effect between interfaces when the number of interfaces in the interface list of electronic device 100 is greater than 1, such as flip switching, rotation switching, etc.
  • desktop setting option 615D can be used to set parameters such as icon size and desktop layout of applications on the desktop of electronic device 100.
  • there is a selection control at the bottom of each interface thumbnail, such as the selection control 611B at the bottom of the interface thumbnail 611 and the selection control 612B at the bottom of the interface thumbnail 612.
  • when a selection control is checked (i.e., the selection control is in a dark state, such as the selection control 611B), it indicates that the user interface displayed by the corresponding interface thumbnail has been added to the interface list of the electronic device 100.
  • any selection control can respond to the user's click operation, so that the electronic device 100 adds or removes the user interface displayed by its corresponding interface thumbnail to the interface list of the electronic device 100.
  • the electronic device 100 can respond to the user's click operation on the selection control 611B, and add the scenario-based interface displayed by the interface thumbnail 611 to the interface list of the electronic device 100.
  • the upper area in each interface thumbnail can include a main interface control, such as the main interface control 611A in the interface thumbnail 611 and the main interface control 612A in the interface thumbnail 612.
  • when a main interface control is selected (i.e., the main interface control is in a dark state, such as the main interface control 612A), it means that the interface displayed by the interface thumbnail corresponding to the main interface control will be set as the main screen interface of the electronic device 100.
  • the main interface control on any interface thumbnail can respond to a user click operation, so that the electronic device 100 sets the interface corresponding to the interface thumbnail as the main screen interface of the electronic device 100.
  • the specific style (i.e., appearance) of the main interface control 611A can be distinguished from the main interface control 612A and the main interface controls in other interface thumbnails to prompt the user that the interface thumbnail corresponding to the main interface control 611A is a thumbnail of the scenario-based desktop.
  • the electronic device 100 can also use other methods to distinguish the thumbnails of the scenario-based desktop from the thumbnails of the ordinary desktop, and this application is not limited to this.
  • the electronic device 100 can set the scenario-based interface displayed by the interface thumbnail 611 as the main screen interface of the electronic device 100 in response to the user's click operation on the main interface control 611A; at this time, as shown in (D) of FIG. 6, the main interface control 611A is in a selected state, and the main interface control 612A is updated from its previously selected state to an unselected state, that is, the interface displayed by the interface thumbnail 612 is no longer set as the main screen interface of the electronic device 100.
  • the electronic device 100 can set the scenario-based desktop corresponding to the interface thumbnail 611 (such as the user interface 40 in the aforementioned description) as the main screen interface of the electronic device 100 in response to a user operation, such as the user's click operation on the blank area in the interface 61 as shown in (D) of FIG. 6.
  • the size of the dock area can be adjusted, and accordingly, the number of application icons contained in the dock area can also increase as the dock area increases.
  • the user interface shown in FIG7 shows the process in which the electronic device 100 adjusts the size of the dock area in response to user operation, and adaptively adjusts the shape of the card in the interface as the size of the dock area changes.
  • the user interface 70 (in some embodiments of the present application, the user interface 70 may be referred to as the first desktop) is a form of expression of the scenario-based desktop provided by the embodiment of the present application.
  • similar to the user interface 40 in FIG. 4, the user interface 70 may include a card area 701, an application area 702, and a dock area 703 (in some embodiments of the present application, the dock area 703 may be referred to as a first dock area).
  • the specific content of each area and the function of each area may be specifically referred to in the above description of the user interface 40 in FIG. 4, which will not be repeated here.
  • the dock area 703 may include an adjustment control 703A.
  • the electronic device 100 may increase the area of the dock area 703 in response to the user's operation on the adjustment control 703A, such as the pull-up operation shown in (A) of FIG. 7 (in some embodiments of the present application, the pull-up operation may be referred to as the first operation). Accordingly, the number of application icons included in the dock area 703 may also increase, and the types of application icons added may be set before the electronic device 100 is launched.
  • as shown in (B) of FIG. 7 (in some embodiments of the present application, the user interface 71 may be referred to as the third desktop), compared with the dock area 703 shown in (A) of FIG. 7, the area of the dock area 713 (in some embodiments of the present application, the dock area 713 may be referred to as the second dock area) becomes larger. After the dock area 713 becomes larger, the dock area 713 also adds the "music" icon 7131, the "e-book" icon 7132, the "mail" icon 7133, and the "mobile phone theme" icon 7134.
  • the area of the dock area can continue to increase in response to further user operations, and the number of application icons contained in the dock area will continue to increase accordingly.
  • as shown in (C) of FIG. 7, the area of the dock area 723 is larger than that of the dock area 713.
  • the dock area 723 is further added with the "clock” icon 7235, the "calendar” icon 7236, the "settings” icon 7237 and the "security center” icon 7238.
  • the electronic device 100 can adaptively change the number and shape of cards in the card area in the scenario-based desktop, and can also adaptively increase or delete application icons in the application area.
  • the electronic device 100 may no longer display the “Gallery” icon, the “Ruler” icon, the “Remote Control” icon, and the “Contacts” icon in the application area.
  • the electronic device 100 may delete the "Gallery" icon, the "Ruler" icon, the "Remote Control" icon, and the "Contacts" icon in the application area 702; or, the electronic device 100 may not delete these icons in the application area 702, but hide them under the dock area 713, that is, cover the "Gallery" icon, the "Ruler" icon, the "Remote Control" icon, and the "Contacts" icon with the dock area 713, and the present application does not limit this.
  • in addition, the change in the area of the dock area will affect the area of the card area, so the electronic device 100 can adaptively change the number, shape, position and other parameters of the cards in the card area of the scenario-based desktop.
  • the electronic device 100 can adjust the size, shape and position of each card in the card area 721 from the style shown in (C) of Figure 7 to the style shown in (D) of Figure 7.
  • Figure 7 only exemplarily reflects the process of the electronic device 100 adjusting the various areas of the scenario-based interface.
  • the process of the electronic device 100 adjusting the various areas of the scenario-based interface can also be expressed in other forms, which should not constitute a limitation on the embodiments of the present application.
  • the user can delete, add, and adjust the shape and size of the cards in the card area. Accordingly, when the user deletes a card in the card area, adds a card corresponding to another application to the card area, or adjusts the shape and size of a card, in order to ensure the user's browsing experience, the electronic device 100 can also adaptively adjust the shape, size, and specific position of other cards in the card area.
  • Figures 8-10 respectively and exemplarily show the specific process of the electronic device 100 adaptively adjusting the shape, size and specific position of other cards in the card area after the user adjusts the shape and size of a card, deletes a card in the card area, and adds a card to the card area.
  • the card area of the user interface 80 includes cards 801, 802, and 803.
  • the electronic device can shrink card 803 in response to the user's operation on card 803 (in some embodiments of the present application, card 803 can be referred to as the eighth card), such as the two-finger pinch operation shown in (A) of FIG8 (in some embodiments of the present application, the two-finger pinch (zoom-out) operation can be referred to as the fourth operation).
  • the user interface 80 after the card 803 is reduced can refer to (B) in FIG8 .
  • the card 813 is the card obtained after the card 803 is reduced. Since the card 803 is reduced, the electronic device 100 adaptively enlarges the card 802 to obtain the card 812 shown in (B) in FIG8 . Accordingly, the shapes of the cards 802 and 803 and their positions in the card area have also changed.
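  • The adaptive adjustment just described (shrinking one card and enlarging the others so the card area stays filled) could, for example, follow a proportional policy like the sketch below; the proportional rule and the pixel values are assumptions for illustration, not the application's prescribed layout algorithm.

```kotlin
// Hypothetical proportional re-layout: when one card is resized, the remaining height of the
// card area is redistributed over the other cards in proportion to their current heights.
data class Card(val id: String, var height: Int)

fun resizeCard(cards: List<Card>, targetId: String, newHeight: Int, areaHeight: Int) {
    val target = cards.first { it.id == targetId }
    target.height = newHeight
    val others = cards.filter { it.id != targetId }
    val currentSum = others.sumOf { it.height }
    if (currentSum == 0) return
    val remaining = areaHeight - newHeight
    others.forEach { it.height = it.height * remaining / currentSum }
}

fun main() {
    // Assumed heights; the cards are named after cards 801-803 in the example above.
    val cards = listOf(Card("801", 200), Card("802", 200), Card("803", 200))
    resizeCard(cards, targetId = "803", newHeight = 100, areaHeight = 600)
    println(cards)  // cards 801 and 802 grow to absorb the space freed by card 803
}
```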
  • the electronic device 100 may display the user interface 90 shown in (A) of FIG. 9 in response to the user's long press operation on the card 4013 (in some embodiments of the present application, the card 4013 may be referred to as the fourth card) included in the interface 40 shown in FIG. 4.
  • the user interface 90 may include a card 901, a card 902, a card 903, and a control card 9031, wherein:
  • Card 901 corresponds to card 4011 in user interface 40;
  • card 902 corresponds to card 4012 in user interface 40;
  • card 903 corresponds to card 4013 in user interface 40;
  • Control card 9031 may include a remove control 9031A.
  • in response to the user's operation on the remove control 9031A, the electronic device 100 can delete the card 903 from the card area in the user interface 90 and display the user interface 91 shown in (B) of FIG. 9 (in some embodiments of the present application, the user interface 91 can be referred to as the fourth desktop).
  • the user interface 91 may include cards 911 and 912 (in some embodiments of the present application, cards 911-912 may be referred to as at least one fifth card), wherein:
  • Card 911 corresponds to card 901 in user interface 90
  • card 912 corresponds to card 902 in user interface 90;
  • electronic device 100 adaptively enlarges card 901 and card 902, and adjusts the shapes of card 901 and card 902 and their positions in the card area accordingly.
  • the electronic device 100 may also display the user interface 92 shown in (A) of FIG. 10 in response to the user's long press operation on the card 4013 in the interface 40 shown in FIG. 4.
  • the user interface 92 may include a card 921, a card 922, a card 923, and a control card 9231, wherein:
  • Card 921 corresponds to card 4011 in user interface 40;
  • card 922 corresponds to card 4012 in user interface 40;
  • card 923 corresponds to card 4013 in user interface 40;
  • Control card 9231 may include a "more cards" control 9231B.
  • the electronic device 100 may respond to the user's operation on the more card control 9231B, such as the click operation shown in (A) of FIG. 10 (in some embodiments of the present application, the aforementioned long press operation on the card 4013 and the click operation shown in (A) of FIG. 10 may be collectively referred to as the third operation), add other cards to the card area, and display the user interface 93 shown in (B) of FIG. 10 (in some embodiments of the present application, the user interface 93 may be referred to as the fifth desktop).
  • the user interface 93 may include card 931, card 932, card 933, and card 934 (in some embodiments of the present application, card 934 may be referred to as the sixth card), where:
  • electronic device 100 adaptively reduces the size of card 923 and adjusts the shape of card 923 and its position in the card area accordingly.
  • Figures 8 to 10 are only exemplary embodiments of the process of adjusting the scene-based interface cards by the electronic device 100.
  • the electronic device 100 may also adjust the scene-based interface cards in other situations and in other forms, which are not limited in this application. For example, after deleting a card, the electronic device 100 may not enlarge all the remaining cards, but may enlarge some of the remaining cards, or even reduce the remaining cards to obtain some cards, which will not be illustrated one by one here.
  • the electronic device 100 can automatically update the content in the scenario-based desktop according to parameters such as time changes, location changes, the context of events, and user habits.
  • the electronic device 100 can automatically update the content in the scenario-based desktop according to changes in other parameters, which is not limited in the present application.
  • the content may include cards in the card area and icons in the application area.
  • the electronic device 100 updates the content in the scene-based desktop according to time
  • the user's daily schedule is likely to be regular. For example, for most people, 7:00 a.m. is the time to get up, they need to leave home between 8:00 and 9:00 a.m. to take a bus or drive to the company, and they get off work after 6:00 p.m.
  • Therefore, the electronic device 100 can update the content of the scenario-based desktop at different times according to the user's habits, so that the user can more quickly find the application that needs to be opened at the current time. Please refer to Figure 11 for details.
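  • A minimal sketch of such a time-driven rule is shown below; the time slots and card names are assumptions loosely based on the example above (commuting around 8:00 a.m., and so on), not a fixed policy of the electronic device 100.

```kotlin
import java.time.LocalTime

// Hypothetical time-slot rules; slots and card names are illustrative assumptions.
data class TimeRule(val from: LocalTime, val to: LocalTime, val cards: List<String>)

val rules = listOf(
    TimeRule(LocalTime.of(7, 30), LocalTime.of(9, 0),
        listOf("Calendar schedule", "Map route", "Music player")),   // leaving for work
    TimeRule(LocalTime.of(12, 0), LocalTime.of(13, 0),
        listOf("Food delivery")),                                    // lunch break
    TimeRule(LocalTime.of(18, 0), LocalTime.of(23, 0),
        listOf("Video", "Gallery", "Game")),                         // after work
)

fun cardsFor(time: LocalTime): List<String> =
    rules.firstOrNull { time >= it.from && time < it.to }?.cards ?: emptyList()

fun main() {
    println(cardsFor(LocalTime.of(8, 0)))    // commute-related cards
    println(cardsFor(LocalTime.of(19, 30)))  // leisure-related cards
}
```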
  • the user interface 01 (in some embodiments of the present application, the user interface 01 may be referred to as the first desktop) is a scenario-based desktop provided in the embodiment of the present application, and the content included in it is recommended by the electronic device 100 to the user when the user goes out to work at 8:00 a.m. according to the user's habits.
  • the user interface 01 includes a card area 011, an application area 012, and a dock area 013, wherein:
  • Card area 011 may include card 0111, card 0112, and card 0113 (in some embodiments of the present application, cards 0111-0113 may be referred to as at least one first card).
  • Card 0111 is a schedule reminder card generated by the application "Calendar”, which is used to display the schedule information preset by the user;
  • card 0112 is a route navigation card generated by the application “Map”, which is used to display information such as the route and duration planned for the user to reach the destination;
  • card 0113 is a player card generated by the application "Music”, which displays information about the currently playing song to the user and provides the user with functions such as switching songs, playing songs, and pausing playing songs.
  • Each card can respond to user operations to enable the electronic device 100 to open the application corresponding to the card and display the corresponding application page.
  • the application area 012 may include, for example, an icon of "clock" and icons of other applications.
  • the icon of any application can be used to respond to the user's operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon.
  • the dock area 013 may include, for example, an icon for “phone”, an icon for “text message”, an icon for “browser”, and an icon for “camera”, and may also include icons for other applications, which is not limited in the embodiments of the present application (in some embodiments of the present application, all applications in the application area 012 and the dock area 013 may be referred to as at least one first application).
  • Any application icon can be used to respond to user operations, such as touch operations, so that the electronic device 100 starts the application corresponding to the icon.
  • the dock area 013 can include an adjustment control.
  • the electronic device 100 can increase the area of the dock area 013 in response to the user's operation on the adjustment control.
  • the number of application icons included in the dock area 013 can also be increased.
  • the types of application icons added can be set before the electronic device 100 is launched.
  • the dock areas shown in the subsequent embodiments can all refer to the relevant description of Figure 7 in the above embodiment.
  • the electronic device 100 can update the content displayed in the card area and the application area in the scenario desktop. That is, the electronic device 100 can update the scenario desktop from the user interface 01 shown in (A) of Figure 11 to the user interface 02 shown in (B) of Figure 11 (in some embodiments of the present application, the user interface 02 can be referred to as the second desktop).
  • the user interface 02 may include card areas 021-023, application areas 024-025 and dock area 026, where:
  • Card areas 021-023 (in some embodiments of the present application, all cards in card areas 021-023 may be referred to as at least one second card) may be collectively referred to as the card area of the user interface 02, and application areas 024-025 may be collectively referred to as the application area of the user interface 02. That is, in the present application, the card area or application area of the scenario-based desktop may be divided into multiple areas; multiple card areas in the same interface may not be presented as a connected whole, and multiple application areas in the same interface may not be presented as a connected whole, but may be presented in a staggered arrangement as shown in the user interface 02.
  • the card area 021 may include a card for a subway code generated by the application "WeChat", which is used to display information about subway stations near the user's current location and provide the user with a function to quickly open the subway code; the card area 022 may include a news card generated by the application "Daily News” to display current news of the day; the card area 023 may include a stock quote card generated by the application "Stocks” to display the stock market trend of the day for the user. Each card can respond to user operations to enable the electronic device 100 to open the application corresponding to the card and display the corresponding application page.
  • Application area 024 may include, for example, the icon of "novel reading" and the icon of "Douyin"; the application area 025 may include, for example, the icon of "radio" and icons of other applications.
  • the icon of any application can be used to respond to the user's operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon.
  • the dock area 026 may include, for example, an icon of "phone”, an icon of "text message”, an icon of "browser", and an icon of "camera”, and may also include icons of other applications, which are not limited in the embodiments of the present application (in some embodiments of the present application, all applications in the application areas 024-025 and the dock area 026 may be referred to as at least one second application). Any application icon may be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon. For details, please refer to the aforementioned description of the dock area 013.
  • the electronic device 100 can display the user interface 02 for the user to meet the user's needs at this time.
  • the electronic device 100 can update the content displayed in the card area and the application area in the scenario desktop again. That is, the electronic device 100 can update the scenario desktop from the user interface 02 shown in (B) of Figure 11 to the user interface 03 shown in (C) of Figure 11.
  • the user interface 03 may include card areas 031-033, application areas 034-035, and dock area 036, where:
  • the card areas 031-033 may be collectively referred to as the card area of the user interface 03, and the application areas 034-035 may be collectively referred to as the application area of the user interface 03. That is, as described above, the card area or application area of the scenario-based desktop may be divided into multiple areas, and the multiple card areas or application areas in the same interface may be presented in a staggered arrangement as shown in the user interface 03.
  • the card area 031 may include cards generated by the application "Video" to recommend currently popular TV series or TV series that the user has recently watched, cards generated by the application "Gallery" to show the user photos that the user has taken in the past, cards generated by the application "Game" to recommend currently popular games to the user, and cards generated by another application to recommend hot-selling products or discounted products to the user;
  • the card area 032 may include cards generated by the application "sports health” to show the user's daily exercise volume;
  • the card area 033 may include cards generated by another application, which recommend videos that the user may like or display prompt information to the user, such as content updates of accounts followed by the user. Each card can respond to user operations to enable the electronic device 100 to open the application corresponding to the card and display the corresponding application page.
  • Application area 034 may include icons for "games" and "videos"; application area 035 may include, for example, an icon for "Yoga" and icons of other applications. The icon of any application can be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon.
  • the dock area 036 may include, for example, an icon of a "phone”, an icon of a “text message”, an icon of a “browser”, and an icon of a “camera”, and may also include icons of other applications, which are not limited in the present embodiment. Any icon of an application may be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon, and for details, reference may be made to the aforementioned description of the dock area 013.
  • the electronic device 100 can display the user interface 03 for the user to meet the user's needs at this time.
  • the embodiment of the present application only exemplifies the process of adjusting the content of the scenario desktop according to time by the electronic device 100, which should not constitute a limitation of the present application.
  • the electronic device 100 can also update the content of the scenario desktop at other times of the day, for example, at 12:00 noon, the electronic device 100 can update the content in the scenario desktop and display the card generated by the application "Meituan" to the user.
  • on days when the user does not need to get up early to go to work, the content contained in the scenario-based desktop displayed by the electronic device 100 at 8:00 in the morning may be different from the content contained in the user interface 01.
  • the personal habits of the same user may also change.
  • the electronic device 100 can record buried point (instrumentation) data in real time, and the buried point data can include the applications that the user has frequently used recently and the times at which those applications were opened.
  • the electronic device 100 can analyze the newly acquired buried point data and update the content of the scenario desktop according to the user's latest usage habits to provide the user with a better user experience.
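  • For illustration, analysing the recorded buried point data to find the applications habitually opened around a given hour might look like the following sketch; the event fields and the one-hour window are assumptions introduced for this sketch.

```kotlin
import java.time.LocalDateTime

// Hypothetical "buried point" event; field names are assumptions.
data class AppOpenEvent(val appName: String, val openedAt: LocalDateTime)

// Applications most often opened within +/- 1 hour of the given hour of day.
fun habitualAppsAround(events: List<AppOpenEvent>, hour: Int, topN: Int = 4): List<String> =
    events.filter { it.openedAt.hour in (hour - 1)..(hour + 1) }
        .groupingBy { it.appName }
        .eachCount()
        .entries
        .sortedByDescending { it.value }
        .take(topN)
        .map { it.key }

fun main() {
    val now = LocalDateTime.now().withHour(8)
    val events = listOf(
        AppOpenEvent("Map", now.minusDays(1)),
        AppOpenEvent("Map", now.minusDays(2)),
        AppOpenEvent("Music", now.minusDays(1)),
        AppOpenEvent("Video", now.minusDays(1).withHour(21)),
    )
    println(habitualAppsAround(events, hour = 8))  // Map and Music dominate around 8 o'clock
}
```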
  • the electronic device 100 updates the content in the scene-based desktop according to the location
  • the electronic device 100 may be an electronic device with a GPS positioning function. Therefore, the electronic device 100 can obtain the location information of the current environment in real time, and update the content of the scenario desktop at different locations in combination with the location information, so that the user can find the application that needs to be opened at the current location more quickly. Please refer to FIG. 12 for details.
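  • For illustration, choosing a scenario from the current location might look like the following sketch; the place types, distance threshold, and card names are assumptions and do not enumerate the scenarios actually supported by the electronic device 100.

```kotlin
import kotlin.math.hypot

// Hypothetical place model; types, radius, and card names are assumptions.
data class Place(val name: String, val type: String, val x: Double, val y: Double)

fun scenarioCards(userX: Double, userY: Double, places: List<Place>, radius: Double = 200.0): List<String> {
    val nearest = places
        .filter { hypot(it.x - userX, it.y - userY) <= radius }
        .minByOrNull { hypot(it.x - userX, it.y - userY) } ?: return emptyList()
    return when (nearest.type) {
        "bus_stop"  -> listOf("Bus routes", "Boarding code", "Music player")
        "cinema"    -> listOf("Ticket purchase", "Movie ratings", "Cinema location")
        "residence" -> listOf("Express pickup")
        else        -> emptyList()
    }
}

fun main() {
    val places = listOf(
        Place("Stop 12", "bus_stop", 10.0, 20.0),
        Place("Star Cinema", "cinema", 500.0, 800.0),
    )
    println(scenarioCards(0.0, 0.0, places))  // near the bus stop, so bus-related cards are chosen
}
```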
  • the user interface 04 is a scenario-based desktop provided in an embodiment of the present application, and the content included in the desktop is recommended by the electronic device 100 to the user when the user approaches a bus stop.
  • the user interface 04 includes a card area 041, an application area 042, and a dock area 043, wherein:
  • Card area 041 may include card 0411, card 0412, and card 0413.
  • Card 0411 is a transportation recommendation card generated by the application "Map", which is used to display information about the bus routes of the bus station where the user is currently located;
  • card 0412 is a card generated by another application, which is used to display information about the stations near the user's current location and provide the user with the function of quickly opening the boarding code;
  • card 0413 is a player card generated by the application "Music”, which displays the information of the currently playing song to the user and provides the user with functions such as switching songs, playing songs, and pausing songs. Each card can respond to user operations, allowing the electronic device 100 to open the application corresponding to the card and display the corresponding application page.
  • Application area 042 may include icons of several applications. The icon of any application can be used to respond to the user's operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon.
  • the dock area 043 may include, for example, an icon of "phone”, an icon of "text message”, an icon of "browser”, and an icon of "camera”, and may also include icons of other applications, which are not limited in the present embodiment. Any icon of an application may be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon. For details, please refer to the above description of the dock area 013.
  • the electronic device 100 can display route information related to nearby stations to the user, and provide the user with a function card for quickly opening the boarding code.
  • the electronic device 100 can display the user interface 04 to meet the user's needs when taking the bus or subway.
  • the electronic device 100 can update the content displayed in the card area and the application area in the scenario desktop. That is, the electronic device 100 can update the scenario desktop from the user interface 04 shown in (A) of FIG. 12 to the user interface 05 shown in (B) of FIG. 12.
  • the user interface 05 may include a card area 051, application areas 052-053, and a dock area 054, where:
  • the card area 051 may include card 0511, card 0512, and card 0513.
  • card 0511 is a card generated by a ticketing application, which is used to display information about currently popular movies and provide the user with a quick ticket-purchasing channel;
  • card 0512 is a card generated by another application, which is used to show the user high-scoring movies selected by the public, providing the user with a reference for selecting movies;
  • card 0513 is a card generated by the application "map", which shows the user the location information of the nearby cinema, the various activities currently provided by the cinema, and the entertainment projects near the cinema.
  • Each card can respond to the user's operation, so that the electronic device 100 opens the application corresponding to the card and displays the corresponding application page.
  • Application area 052 and application area 053 may each include icons of several applications.
  • the icon of any application can be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon.
  • the dock area 054 may include, for example, an icon of a "phone”, an icon of a “text message”, an icon of a “browser”, and an icon of a “camera”, and may also include icons of other applications, which are not limited in the present embodiment. Any icon of an application may be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon, and for details, reference may be made to the aforementioned description of the dock area 013.
  • the electronic device 100 can show the user relevant information about recently released movies and provide some high-scoring movies selected by the public as reference information when the user chooses a movie; in addition, based on the user's ticket-purchasing needs, the electronic device 100 can display applications for ticket purchase, or cards generated by such applications, for example "Tao Piao" or "WeChat". Therefore, when the user is near a movie theater, the electronic device 100 can display the user interface 05 to the user to meet the user's movie-watching needs.
  • the electronic device 100 can update the contents displayed in the card area and the application area in the scenario desktop again. That is, the electronic device 100 can update the scenario desktop from the user interface 05 shown in (B) of FIG. 12 to the user interface 06 shown in (C) of FIG. 12.
  • the user interface 06 may include a card area 061, an application area 062, and a dock area 063, where:
  • Card area 061 may include card 0611, which is a card generated by an express-delivery application and is used to display information about packages that the user has not yet picked up, such as the pickup location, pickup code, and waybill number.
  • Card 0611 can respond to user operations and enable the electronic device 100 to open the application corresponding to the card and display the corresponding application page.
  • Application area 062 may include icons of several applications. The icon of any application can be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon.
  • the dock area 063 may include, for example, an icon of "phone”, an icon of "text message”, an icon of "browser”, and an icon of "camera”, and may also include icons of other applications, which are not limited in the present embodiment. Any icon of an application may be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon. For details, please refer to the above description of the dock area 013.
  • the electronic device 100 can display to the user the information of the express deliveries that the user has not yet picked up; at the same time, the electronic device 100 can also display applications that can be used to view detailed express-delivery information, so as to meet the user's need for quick pickup.
  • the electronic device 100 updates the content in the scenario desktop according to the event context
  • the electronic device 100 can update the content of the scenario desktop according to the application or card that the user browsed when using the electronic device last time, and recommend cards or applications that meet the user's current usage needs to the user. Please refer to Figure 13 for details.
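  • For illustration, deriving recommendations from a recognized schedule entry (the event context) might look like the following sketch; the keyword table, the 120-minute window, and the card names are assumptions introduced for this sketch.

```kotlin
// Hypothetical schedule entry and keyword table; names and the 120-minute window are assumptions.
data class ScheduleEntry(val title: String, val startMinuteOfDay: Int)

val keywordCards = mapOf(
    "movie"  to listOf("Nearby cinemas", "Ticket purchase", "Movie ratings"),
    "flight" to listOf("Boarding pass", "Airport navigation"),
)

fun cardsForSchedule(entry: ScheduleEntry, nowMinuteOfDay: Int): List<String> {
    val minutesLeft = entry.startMinuteOfDay - nowMinuteOfDay
    if (minutesLeft !in 0..120) return emptyList()  // only react shortly before the event
    val matched = keywordCards.entries.firstOrNull { entry.title.contains(it.key, ignoreCase = true) }
    return listOf("Schedule reminder") + (matched?.value ?: emptyList())
}

fun main() {
    val entry = ScheduleEntry("Watch a movie with Xiaomei", startMinuteOfDay = 9 * 60)
    println(cardsForSchedule(entry, nowMinuteOfDay = 8 * 60 + 30))  // 30 minutes before the movie
}
```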
  • the user interface 07 is a scenario-based desktop provided in an embodiment of the present application, and the content included in it is recommended to the user when the electronic device 100 recognizes that the user's daily schedule includes the schedule of "watching a movie".
  • the user interface 07 includes a card area 071, an application area 072, and a dock area 073, wherein:
  • Card area 071 may include card 0711 and card 0712.
  • card 0711 is a schedule card generated by the application "Calendar", which is used to display the schedule information preset by the user;
  • card 0712 is a recommendation card generated by the application "Map", which may be a card displayed to the user based on the schedule information recognized by the electronic device 100, that is, the schedule information "09:00-11:00 Watch a movie with Xiaomei" displayed in card 0711, and which can be used to recommend entertainment venues such as cinemas around the user's location.
  • Each card can respond to user operations, causing the electronic device 100 to open the application corresponding to the card and display the corresponding application page.
  • Application area 072 may include icons of several applications. The icon of any application can be used to respond to the user's operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon.
  • the dock area 073 may include, for example, an icon of "phone", an icon of "text message", an icon of "browser", and an icon of "camera", and may also include icons of other applications, which are not limited in the present embodiment. Any icon of an application may be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon. For details, please refer to the above description of the dock area 013.
  • the electronic device 100 can remind the user of the start time of the schedule, and based on the schedule, display applications or cards related to the schedule in the scenario-based interface, such as displaying card 0712, to show the user information about nearby cinemas so that the user can make corresponding plans for the schedule.
  • the electronic device 100 may respond to the operation and determine that the user has plans to go to the cinema in the near future. Accordingly, the electronic device 100 may update the content displayed in the card area and the application area in the scenario-based desktop, that is, the electronic device 100 may update the scenario-based desktop from the user interface 07 shown in (A) in FIG. 13 to the user interface 08 shown in (B) in FIG. 13 .
  • the user interface 08 may include a card area 081 , an application area 082 , and a dock area 083 , wherein:
  • Card area 081 may include card 0811, card 0812, card 0813, and card 0814.
  • card 0811 is a card generated by the application "Calendar", which is used to display the schedule information pre-set by the user. It can be seen from interface 08 that the time is 8:30, and there are still 30 minutes before the start time of the schedule "Watching a movie with Xiaomei";
  • the information displayed in card 0811 is therefore different from the information displayed in card 0711;
  • card 0812 is a card generated by the application "Map", which shows the user the location information of nearby movie theaters and the route map to the movie theater, so that the user can determine the departure time;
  • card 0813 is a player card generated by a music application, which is used to play music for the user on the way to the cinema;
  • card 0814 is a card generated by another application, which is used to show the user high-scoring movies selected by the public, providing a reference for movie selection. Each card can respond to user operations, causing the electronic device 100 to open the application corresponding to the card and display the corresponding application page.
  • Application area 082 may include icons of several applications. The icon of any application can be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon.
  • the dock area 083 may include, for example, an icon of "phone”, an icon of "text message”, an icon of "browser”, and an icon of "camera”, and may also include icons of other applications, which are not limited in the present embodiment. Any icon of an application may be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon. For details, please refer to the above description of the dock area 013.
  • the electronic device 100 can display the location information of nearby cinemas to the user, and display relevant information about recently released movies, and provide the user with some high-scoring movies selected by the public, such as displaying card 0812, showing the user information about nearby cinemas so that the user can make corresponding plans for his schedule.
  • the electronic device 100 may respond to the operation and believe that the user has plans to watch a movie in the near future. Accordingly, the electronic device may update the content displayed in the card area and the application area in the scenario-based desktop, that is, the electronic device 100 may update the scenario-based desktop from the user interface 08 shown in (B) in FIG. 13 to the user interface 09 shown in (C) in FIG. 13 .
  • the user interface 09 may include a card area 091 , an application area 092 , and a dock area 093 , wherein:
  • Card area 091 may include card 0911, card 0912 and card 0913.
  • card 0911 is a card generated by the application "Calendar" and is used to display the schedule information pre-set by the user. As can be seen from interface 09, the time at this moment is 08:50, and there are still 10 minutes to the start time of the schedule "Watch a movie with Xiaomei", which means that the user may be about to execute the schedule;
  • card 0912 is a card generated by a ticketing application, which is used to display information about currently popular movies and provide the user with a quick ticket-purchasing channel;
  • card 0913 is a card generated by another application, which is used to show the user the movie-viewing activity information of each cinema on that day. Each card can respond to the user's operation, so that the electronic device 100 opens the application corresponding to the card and displays the corresponding application page.
  • Application area 092 may include icons of several applications. The icon of any application can be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon.
  • the dock area 093 may include, for example, an icon of "phone", an icon of "text message", an icon of "browser", and an icon of "camera", and may also include icons of other applications, which are not limited in the present embodiment. Any icon of an application may be used to respond to a user operation, such as a touch operation, so that the electronic device 100 starts the application corresponding to the icon. For details, please refer to the above description of the dock area 013.
  • the electronic device 100 can display relevant information about the recently released movies to the user.
  • in addition, the electronic device 100 can display to the user an application for purchasing tickets, or a card generated by such an application, so as to meet the user's movie-viewing needs.
  • Figure 14 is a flow chart of a display method provided by the present application.
  • the electronic device can dynamically present the application and service cards that the user may use in the current scene according to the actual needs of the user in the current scene, which can help the user locate the application that he needs to browse more quickly.
  • the display method may include the following steps:
  • the electronic device identifies the first scene.
  • the electronic device may be a mobile phone, an on-board device (e.g., an on-board unit (On Board Unit, OBU)), a tablet computer (pad), a computer with a display function (e.g., a laptop computer, a PDA, etc.), etc.
  • the electronic device may be the electronic device 100 provided in the embodiment of the present application. It is understandable that the present application does not limit the specific form of the above-mentioned electronic device.
  • the electronic device can identify the first scene based on dimensions such as time, location, and the relationship between events. For example, when the electronic device identifies the first scene based on time, the electronic device can determine what the user is accustomed to doing at this time point based on the current time; when the electronic device identifies the first scene based on location, the electronic device can determine the activities that the user may be doing at the moment based on the current location and the characteristics of the location; when the electronic device identifies the first scene based on the relationship between events, the electronic device can determine the user's next specific purpose (such as watching a movie) based on the user's last operation on the electronic device (such as opening a map to navigate to a movie theater).
  • S102: Display a first desktop, where the first desktop includes a plurality of first application icons and at least one first card.
  • the content displayed in the first desktop is determined based on the first scene. That is, the multiple first applications are applications that the user may need to browse or open in the first scene, and the at least one first card is a card generated by an application that the user may need to browse or open in the first scene.
  • the electronic device presents the first desktop according to the user's actual needs in the current scene, showing the applications and service cards that the user may use in that scene; this helps the user locate the applications to browse more quickly and saves the user's time.
  • S103: The electronic device identifies that the first scene changes into a second scene.
  • S104: Display a second desktop, wherein the second desktop includes multiple second application icons and at least one second card; the multiple second application icons are different from the multiple first application icons, and/or the at least one second card is different from the at least one first card.
  • similarly, the multiple second application icons and the at least one second card are determined based on the second scene; after the scene changes, the content displayed by the electronic device also changes, that is, the multiple second application icons are different from the multiple first application icons, and/or the at least one second card is different from the at least one first card. A possible scene-to-desktop mapping is sketched below.
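  • As a minimal sketch of steps S102–S104 under assumed names, the Kotlin snippet below maps a scene to the application icons and cards shown outside the dock while keeping the dock icons unchanged; the scene labels and the recommendation table are illustrative and would, on a real device, be derived from the user's usage history.

```kotlin
// Hypothetical desktop model: the dock icons are kept across scene changes, while the
// application icons and cards outside the dock are chosen per scene.
data class Desktop(val dockIcons: List<String>, val appIcons: List<String>, val cards: List<String>)

enum class Scene { MORNING_COMMUTE, EVENING_AT_HOME, DEFAULT }

// Illustrative recommendation table; a real device would derive it from usage history.
val recommendations: Map<Scene, Pair<List<String>, List<String>>> = mapOf(
    Scene.MORNING_COMMUTE to Pair(listOf("Map", "Music", "Calendar"), listOf("RouteCard", "ScheduleCard")),
    Scene.EVENING_AT_HOME to Pair(listOf("Video", "Game", "Fitness"), listOf("VideoCard", "WorkoutCard"))
)

fun desktopFor(scene: Scene, dock: List<String>): Desktop {
    val (apps, cards) = recommendations[scene] ?: Pair(listOf("Phone", "Browser"), emptyList<String>())
    return Desktop(dockIcons = dock, appIcons = apps, cards = cards)
}

fun main() {
    val dock = listOf("Phone", "Messages", "Browser", "Camera")
    val first = desktopFor(Scene.MORNING_COMMUTE, dock)   // first desktop in the first scene
    val second = desktopFor(Scene.EVENING_AT_HOME, dock)  // second desktop after the scene changes
    println(first.cards)   // [RouteCard, ScheduleCard]
    println(second.cards)  // [VideoCard, WorkoutCard]
}
```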
  • the basis for the electronic device to determine that the scene changes from the first scene to the second scene may include but is not limited to the following:
  • 1. Time change: a user's daily schedule is likely to be regular, so the electronic device can update the content of the scenario-based desktop at different times based on the user's habits, so that the user can find the application that needs to be opened at the current time more quickly. For details, please refer to the above description of Figure 11, which will not be repeated here.
  • 2. Location change: the applications that a user needs to open are often directly related to the user's location. For example, users generally open the boarding code at the bus station, the payment code at a store, and navigation software on the highway. Therefore, in this embodiment, the electronic device can obtain the location information of the current environment in real time and, combined with that location information, update the content of the scenario-based desktop at different locations, so that users can find the applications that need to be opened at the current location more quickly. For details, please refer to the above description of Figure 12, which will not be repeated here.
  • 3. Receiving a user operation: the purposes of a user's two consecutive checks of the electronic device may be related, so the electronic device can update the content of the scenario-based desktop based on the applications or cards that the user browsed the last time he or she used the electronic device, and recommend cards or applications that meet the user's current usage needs. For details, please refer to the aforementioned description of Figure 13, which will not be repeated here.
  • the first desktop and the second desktop may include a dock area. The user may adjust the size of the dock area by, for example, a pull-up or pull-down operation, and the number of application icons that the dock area can accommodate will increase or decrease accordingly, as sketched below. In this way, users can put more of their frequently used applications into the dock area, so that they can directly and quickly start more applications.
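  • The relationship between the dock area and the number of icons it can hold could look like the following sketch; the cell size (180 px) and the four-icons-per-row layout are assumptions for the example, not values taken from the present application.

```kotlin
// A minimal sketch of the dock-resizing rule: the larger the dock area, the more icons
// it holds. The cell size and row width are illustrative assumptions.
data class DockState(val heightPx: Int, val icons: List<String>)

const val ICON_CELL_PX = 180   // assumed height of one icon row
const val ICONS_PER_ROW = 4    // assumed icons per dock row

fun dockCapacity(heightPx: Int): Int = (heightPx / ICON_CELL_PX) * ICONS_PER_ROW

// After a pull-up/pull-down gesture changes the dock height, fill the dock from a
// priority-ordered candidate list (for example, the user's most frequently used apps).
fun resizeDock(current: DockState, newHeightPx: Int, candidates: List<String>): DockState {
    val capacity = dockCapacity(newHeightPx)
    val icons = (current.icons + candidates.filterNot { it in current.icons }).take(capacity)
    return DockState(newHeightPx, icons)
}

fun main() {
    val dock = DockState(180, listOf("Phone", "Messages", "Browser", "Camera"))
    val grown = resizeDock(dock, 360, listOf("Music", "E-book", "Mail", "Themes"))
    println(grown.icons)    // 8 icons once the dock doubles in height
    val shrunk = resizeDock(grown, 180, emptyList())
    println(shrunk.icons)   // back to the first 4 icons
}
```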
  • when the user adjusts the area of the dock area, the electronic device can adaptively add or delete icons of applications outside the dock area to provide users with a better visual experience.
  • when the electronic device changes the area of the dock area in the above-mentioned first desktop or second desktop, the electronic device can adaptively change the display mode of the cards in that desktop, where the display mode includes the number, shape and arrangement of the cards, so as to provide the user with a better browsing effect, as sketched below. For details, please refer to the above description of Figure 7, which will not be repeated here.
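  • One way the card display mode could follow the dock size is sketched below: the space left after the application area and the dock is split evenly among the visible cards. The screen height and the fixed application-area height are assumptions for the example.

```kotlin
// Sketch of adapting the card area when the dock grows: the vertical space left after
// the app area and the dock is split evenly among the visible cards.
data class Card(val name: String, val heightPx: Int)

const val SCREEN_HEIGHT_PX = 2400
const val APP_AREA_PX = 600    // assumed height reserved for application icons

fun relayoutCards(cards: List<Card>, dockHeightPx: Int): List<Card> {
    val cardArea = (SCREEN_HEIGHT_PX - APP_AREA_PX - dockHeightPx).coerceAtLeast(0)
    val perCard = if (cards.isEmpty()) 0 else cardArea / cards.size
    // equal heights here; a real layout could also change the cards' shape or arrangement
    return cards.map { Card(it.name, perCard) }
}

fun main() {
    val cards = listOf(Card("Video", 600), Card("Gallery", 600), Card("Game", 600))
    println(relayoutCards(cards, dockHeightPx = 360).map { it.heightPx })  // [480, 480, 480]
    println(relayoutCards(cards, dockHeightPx = 900).map { it.heightPx })  // [300, 300, 300]
}
```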
  • the electronic device can delete, add, scale, or otherwise operate on the cards in the first desktop or the second desktop in response to user operations. Accordingly, when the user deletes a card in the first desktop or the second desktop, adds cards corresponding to other applications to the first desktop or the second desktop, or adjusts the shape and size of a card in the first desktop or the second desktop, in order to ensure the user's browsing experience, the electronic device can adaptively adjust the shape, size, and position in the card area of the other cards in the first desktop or the second desktop, as sketched below. For details, please refer to the aforementioned description of Figures 8-10, which will not be repeated here.
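  • A possible redistribution rule for such edits is sketched below; the fixed total card area and the double weight given to an enlarged card are assumptions for the example.

```kotlin
// Sketch of redistributing card space after a user edit: deleting a card gives its
// space to the remaining cards, and enlarging one card shrinks its neighbours.
const val CARD_AREA_TOTAL = 1800

fun afterDelete(cards: List<String>, removed: String): Map<String, Int> {
    val remaining = cards - removed
    return remaining.associateWith { CARD_AREA_TOTAL / remaining.size }
}

fun afterEnlarge(cards: List<String>, enlarged: String): Map<String, Int> {
    val weights = cards.associateWith { if (it == enlarged) 2 else 1 }  // enlarged card gets double weight
    val unit = CARD_AREA_TOTAL / weights.values.sum()
    return weights.mapValues { it.value * unit }
}

fun main() {
    val cards = listOf("Schedule", "Route", "Music")
    println(afterDelete(cards, "Music"))    // {Schedule=900, Route=900}
    println(afterEnlarge(cards, "Route"))   // {Schedule=450, Route=900, Music=450}
}
```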
  • An embodiment of the present application also provides an electronic device, which includes: one or more processors and a memory; wherein the memory is coupled to the one or more processors, the memory is used to store computer program code, the computer program code includes computer instructions, and the one or more processors call the computer instructions to enable the electronic device to execute the method shown in the aforementioned embodiment.
  • the term "when" may be interpreted to mean “if" or “after" or “in response to determining" or “in response to detecting", depending on the context.
  • the phrases “upon determining" or “if (the stated condition or event) is detected” may be interpreted to mean “if determining" or “in response to determining" or “upon detecting (the stated condition or event)” or “in response to detecting (the stated condition or event)", depending on the context.
  • the computer program product includes one or more computer instructions.
  • the computer can be a general-purpose computer, a special-purpose computer, a computer network, or other programmable device.
  • the computer instructions can be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions can be transmitted from a website site, computer, server or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, wireless, microwave, etc.) mode to another website site, computer, server or data center.
  • the computer-readable storage medium can be any available medium that a computer can access, or a data storage device such as a server or data center that integrates one or more available media.
  • the available medium can be a magnetic medium (e.g., a floppy disk, a hard disk, a tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid-state drive), etc.
  • all or part of the processes can be implemented by a computer program instructing the relevant hardware, and the program can be stored in a computer-readable storage medium.
  • when the program is executed, it can include the processes of the above-mentioned method embodiments.
  • the aforementioned storage medium includes: ROM or random access memory RAM, magnetic disk or optical disk and other media that can store program codes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

本申请提供了一种显示方法及电子设备,该电子设备可以根据在当前场景下用户的实际需求,动态的呈现用户当前场景下的可能会使用的应用和服务卡片,能帮助用户更快定位到自己需要浏览的应用。

Description

显示方法及电子设备
本申请要求于2022年07月28日提交中国专利局、申请号为202210904347.3、申请名称为“显示方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,尤其涉及显示方法及电子设备。
背景技术
随着电子技术的飞速发展,手机、平板、电脑等电子设备已经成为人们生活、工作中不可或缺的工具。据统计,人们查看手机的次数在一天150次左右。然而大多数情况下用户查看手机屏幕都是带有确定的目的性的,例如查看日程表,打开支付宝给商家付款等等。
但是,目前电子设备的桌面的表现形式比较单一,在设备应用APP越来越繁杂、应用APP的功能越来越丰富的情况下,用户往往需要耗费大量的时间来寻找自己需要打开的应用APP,这无疑浪费了用户的时间,降低了用户体验。因此,需要为电子设备设计更为便捷的显示界面。
发明内容
本申请的目的在于提供一种显示方法及电子设备。所述电子设备可以根据在当前场景下用户的实际需求,动态的呈现用户当前场景下的可能会使用的应用和服务卡片,能帮助用户更快定位到自己需要浏览的应用,提供更好的用户体验感。
上述目标和其他目标将通过独立权利要求中的特征来达成。进一步的实现方式在从属权利要求、说明书和附图中体现。
第一方面,本申请提供了一种显示方法,所述方法应用于电子设备,所述方法包括:在第一场景下,显示第一桌面,所述第一桌面包括多个第一应用图标和至少一个第一卡片;识别所述第一场景变化为第二场景,显示第二桌面,所述第二桌面包括多个第二应用图标和至少一个第二卡片,所述多个第二应用图标与所述多个第一应用图标不同,和/或所述至少一个第二卡片与所述至少一个第一卡片不同。
在本方法中,所述第一桌面的中所显示的内容,即桌面中的应用图标和卡片,是通过所述第一场景确定的。即所述多个第一应用图标是用户在所述第一场景下可能需要浏览或者打开的应用,所述至少一个第一卡片是用户在所述第一场景下可能需要浏览或者打开的应用生成的卡片。根据在当前场景下用户的实际需求,呈现用户当前场景下的可能会使用的应用和服务卡片,能帮助用户更快定位到自己需要浏览的应用,节省用户的时间。
此外,可以理解的,在大多数情况下,用户查看手机屏幕都是带有明确的目的性的,且该目的性可能与时间和地点以及用户的个人使用偏好等因素强相关。例如用户一般会在在早上查看日程表,在晚上打开游戏、运动健康等应用。再比如,用户一般会在汽车站旁打开乘车码、在商店打开付款码等等。因此,在本实施方式中,当场景变化时,电子设备 的桌面所呈现的内容也会随着场景的变化而变化。这样,电子设备可以根据场景动态的呈现用户当前场景下的可能会使用的应用和服务卡片,进一步提升了用户体验。
结合第一方面,在一种可能的实施方式中,所述第一桌面包括第一dock区域,所述方法还包括:接收作用于所述第一dock区域的第一操作;响应于所述第一操作,显示第三桌面,所述第三桌面包括第二dock区域,所述第二dock区域的面积与所述第一dock区域的面积不同;在所述第二dock区域的面积大于所述第一dock区域的面积的情况下,所述第二dock区域中应用图标的数量大于所述第一dock区域中应用图标的数量;在所述第二dock区域的面积小于所述第一dock区域的面积的情况下,所述第二dock区域中应用图标的数量小于所述第一dock区域中应用图标的数量。
在本实施方式中,所述至少一个第一应用图标可以被划分为两部分,即所述至少一个第三应用图标和除所述至少一个第三应用之外剩余的应用图标。其中,所述至少一个第三应用图标显示于所述第dock区域中。dock区域一般被设定在电子设备的屏幕下方,其中可以容纳若干个应用图标,无论用户如何切换屏幕界面,在屏幕界面上dock区域包含的应用图标永远不变,且一直显示在当前界面上。同理,当电子设备在场景变化时,即使电子设备会对界面中的应用图标进行更新,也不会更新包含在dock区域中的应用图标。也就是说,假如场景由所述第一场景变化为所述第二场景,电子设备不会更新对所述至少一个第一应用中的显示在所述第一dock区域中的应用图标,只会更新所述至少一个第一应用图标中显示在所述第一dock区域外的应用图标。
此外,在本实施方式中,用户可以通过例如上拉、下拉的操作对dock区域的大小进行调节,dock区域可以容纳的程序数量也会随之更多。这样,用户可以将更多自己常用的应用的图标放入dock区域中,以便用户能够直接快速启动更多的应用。
结合第一方面,在一种可能的实施方式中,在所述第一桌面的所述至少一个第一应用图标中,位于所述第一dock区域外的应用图标的数量为N个,在所述第三桌面中,位于所述第二dock区域外的应用图标的数量为M个;在所述第二dock区域的面积大于所述第一dock区域的面积的情况下,所述N大于所述M;在所述第二dock区域的面积小于所述第一dock区域的面积的情况下,所述N小于所述M。
可以理解的,对于电子设备而言,其屏幕的面积大小是不变的。因此,当用户对dock区域的面积进行调节后,电子设备可以适应性地适应性的增加或删减dock区域外的应用图标,以为用户提供更好的视觉体验。例如,当用户将所述第一dock区域调大,得到所述第二dock区域后,电子设备可以将原本显示在所述第一桌面中的所述的第一dock区域中的应用图标减少,只将其中的部分应用图标显示在在所述第三桌面中的所述第二dock区域外。
可选的,当用户对所述第一dock区域的面积进行调节后,电子设备也适应性可以调节所述dock区域外的应用图标的大小,或者改变应用图标的排列方式,为用户提供更好的浏览效果。
结合第一方面,在一种可能的实施方式中,所述第三桌面包括至少一个第三卡片,所述至少一个第三卡片为所述至少以第一卡片中的全部卡片或者部分卡片,所述至少一个第一卡片和所述至少一个第三卡片的显示方式不同。
在本实施方式中,当电子设备改变所述第一dock区域面积时,电子设备可以适应性地 更改场景化桌面中卡片区域中的卡片的显示方式,所述显示方式包括所述数量、形状以及卡片的排列方式,为用户提供更好的浏览效果。例如,假设在所述第一桌面中,所述至少一张第一卡片包含3张卡片(称其为卡片1、卡片2以及卡片3)时,当用户将所述第一dock区域增大后(增大后的dock区域即为所述第二dock区域),则在所述第三桌面中,电子设备可以并适应性的减小卡片1、卡片2和卡片3中一个或者多个卡片面积(此时卡片1、卡片2和卡片3即为所述至少一个第三卡片)。
结合第一方面,在一种可能的实施方式中,所述方法还包括:接收作用于第四卡片的第二操作,所述第四卡片为所述至少一个第一卡片中的任意一个卡片;响应于所述第二操作,删除所述第四卡片。
结合第一方面,在一种可能的实施方式中,在接收作用于第四卡片的第二操作之后,所述方法还包括:显示第四桌面,所述第四桌面包括至少一个第五卡片,所述至少一个第五卡片和所述至少一个第一卡片的显示方式不同。
在本实施方式中,电子设备可以响应于用户对卡片的操作,删除所述至少一个卡片中的任意一个卡片。在删除之后,电子设备可以对所述至少一张卡片中剩余的卡片(即所述至少一个第五卡片)进行调整,更改所述至少一个第五卡片的显示方式,所述显示方式包括所述数量、形状以及卡片的排列方式,为用户提供更进一步的浏览效果。例如,假设在所述第一桌面中,所述至少一张第一卡片包含3张卡片(称其为卡片4、卡片5以及卡片6)时,当用户将所述卡片4(此时卡片4即为所述第四卡片)删除后,则在所述第四桌面中,电子设备可以适应性的减小卡片5和卡片6的面积(此时卡片5和卡片6即为所述至少一个第五卡片)。
结合第一方面,在一种可能的实施方式中,所述方法还包括:接收作用于所述第一桌面的第三操作,响应于所述第三操作,生成第六卡片。
结合第一方面,在一种可能的实施方式中,所述方法还包括:显示第五桌面,所述第五桌面包括至少一个第七卡片,所述至少一个第七卡片包括所述至少一个第一卡片和所述第六卡片;所述至少一个第七卡片和所述至少一个第一卡片的显示方式不同。
在本实施方式中,电子设备可以响应于用户对所述第一桌面的操作,向所述第一桌面中增加卡片(即所述至第六卡片)。在增加所述第六卡片之后,所述第六卡片和所述至少一个卡片即组成所述至少一个第七卡片。为确保用户的浏览效果,电子设备可以对所述至少一个第七卡片中卡片显示方式进行调整,实际上也就是更改所述至少一个卡片的显示方式在所述第一桌面中的显示方式,所述显示方式包括所述数量、形状以及卡片的排列方式,为用户提供更进一步的浏览效果。例如,假设在所述第一桌面中,所述至少一张第一卡片包含2张卡片(称其为卡片7以及卡片8)时,当用户将所述卡片9(此时卡片9即为所述第6卡片)加入桌面后,则在所述第五桌面中,电子设备可以适应性的减小卡片7和卡片8的面积,并将卡片7-9一起显示在所述第五桌面中(此时卡片7-9即为所述至少一个第七卡片)。
结合第一方面,在一种可能的实施方式中,所述方法还包括:接收作用于第八卡片的第四操作,所述第八卡片为所述至少一个第一卡片中的任意一个卡片;响应于所述第四操作,放大或缩小所述第八卡片。
在本实施方式中,电子设备可以响应于用户对所述至少一张卡片中任意一张卡片的操作,将该卡片(即所述第八卡片)的面积进行放大或者缩小。这样,可以在用户有需要时放大卡片,以使用户能够更为清楚的浏览该卡片的信息;或者缩小卡片以使凸显其他更为重要的卡片的信息。
结合第一方面,在一种可能的实施方式中,在接收作用于第八卡片的第四操作之后,所述方法还包括:显示第六桌面,所述六桌面包括所述至少一个第一卡片,所述第六桌面中的所述至少一个第一卡片和所述第一桌面中的所述至少一个第一卡片的显示方式不同。
在本实施方式中,在用户对所述至少一个卡片中的任意一个卡片(即所述第八卡片)进行放大或者缩小之后,为确保用户的浏览效果,电子设备可以适应性地对所述至少一个第一卡片中其他卡片显示方式也进行调整,所述显示方式包括所述数量、形状以及卡片的排列方式,为用户提供更进一步的浏览效果。例如,假设在所述第一桌面中,所述至少一张第一卡片包含2张卡片(称其为卡片A以及卡片B)时,当用户将所述卡片A(此时卡片B即为所述第八卡片)放大后,则在所述第六桌面中,电子设备可以适应性地缩小卡片B的面积(此时卡片A和卡片B即为所述六桌面中的所述至少一个第一卡片),或者不再显示所述卡片B(此时卡片A即为所述六桌面中的所述至少一个第一卡片)。
在一个可选实施方式中,所述识别所述第一场景变化为第二场景,包括:识别到时间从第一时刻变化为第二时刻,所述第一时刻对应所述第一场景,所述第二时刻对应所述第二场景;或,识别到所述电子设备所处的地点从第一地点变化为第二地点,所述第一地点对应所述第一场景,所述第二地点对应所述第二场景;或,在第三时刻,识别用户对所述电子设备进行第五操作,在第四时刻,识别用户对所述电子设备进行第六操作,所述用户对所述电子设备进行第五操作对应所述第一场景,所述用户对所述电子设备进行第六操作对应所述第二场景。
在本实施方式中,电子设备判定场景改变的依据可以包括但不限于以下几种:
1.时间变化
可以理解的,用户在日常的一天中,其日程大概率是有规律的。例如,对于大多数人而言,早上7:00为起床时间,上午8:00-9:00需要离开家坐车或者开车赶往公司,晚上6:00之后下班回家。因此,在本实施方式中,电子设备可以结合用户习性,在不同时间对场景化桌面的内容进行更新,以便于用户在能在更为迅速的找到在当前时间需要打开的应用。
2.地点变化
用户需要打开的应用往往和用户所处的地点有直接的联系。例如,用户一般会在汽车站旁打开乘车码、在商店打开付款码,在高速路上打开导航软件等。因此,在本实施方式中,电子设备可以实时获取当前所处环境的位置信息,并结合该位置信息,在不同地点对场景化桌面的内容进行更新,以便于用户在能在更为迅速的找到在当前地点需要打开的应用。
3.接收到用户操作
可以理解的,用户前后两次查看电子设备的目的之前可能存在联系。例如,当用户在使用应用“地图”搜寻附近的电影院后,用户下一次查看电子设备时,可能希望打开用程序“豆瓣”或者“美团”购买电影票。因此,在本实施例中,电子设备可以根据用户上一 次使用电子设备时浏览过的应用或者卡片,对场景化桌面的内容进行更新,为用户推荐符合用户当前使用需求的卡片或者应用。
第二方面,本申请实施例提供一种电子设备,所述电子设备包括:一个或多个处理器和存储器;所述存储器与所述一个或多个处理器耦合,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,所述一个或多个处理器调用所述计算机指令以使得所述电子设备执行第一方面或第一方面的任一可能的实现方式中的方法。
第三方面,提供一种芯片系统,所述芯片系统应用于电子设备,所述芯片系统包括一个或多个处理器,所述处理器用于调用计算机指令以使得所述电子设备执行如第一方面或第一方面的任一可能的实现方式中的方法。
第四方面,提供一种计算机可读存储介质,包括指令,当所述指令在电子设备上运行时,使得所述电子设备执行如第一方面或第一方面的任一可能的实现方式中的方法。
本申请第二至四方面所提供的技术方案,其有益效果可以参考第一方面所提供的技术方案的有益效果,此处不再赘述。
附图说明
图1为本申请实施例提供的一种电子设备的桌面的示意图;
图2为本申请实施例提供的一种电子设备的桌面的示意图;
图3为本申请实施例提供的电子设备100的结构示意图;
图4为本申请实施例提供的一种场景化桌面的示意图;
图5为本申请实施例提供的一种用于桌面选择的用户界面图;
图6为本申请实施例提供的一种设置场景化桌面的过程示意图;
图7为本申请实施例提供的一种调节dock区域的过程示意图;
图8为本申请实施例提供的一种缩放场景化桌面中卡片的过程示意图;
图9为本申请实施例提供的一种删除场景化桌面中卡片的过程示意图;
图10为本申请实施例提供的一种向场景化桌面中添加卡片的过程示意图;
图11为本申请实施例提供的一种场景化桌面更新桌面内容的过程示意图;
图12为本申请实施例提供的一种场景化桌面更新桌面内容的过程示意图;
图13为本申请实施例提供的一种场景化桌面更新桌面内容的过程示意图;
图14为本申请实施例提供的一种显示方法的流程图。
具体实施方式
为了使本技术领域的人员更好地理解本申请实施例方案,下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚地描述,显然,所描述的实施例仅仅是本申请一部分的实施例,而不是全部的实施例。
本申请的说明书实施例和权利要求书及上述附图中的术语“第一”、“第二”、和“第三”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元。 方法、系统、产品或设备不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。
为了便于理解,下面先对本申请实施例涉及的相关术语进行介绍。
(1)dock区域/dock栏
dock区域也可称为dock栏,即dockbar(停靠栏)的缩写。其为图形用户界面中用于启动、切换运行中的应用的一种功能界面。
dock区域一般被设定在电子设备的屏幕下方,其中可以容纳若干个应用图标。对于大多数电子设备而言,无论用户如何切换屏幕界面,在屏幕界面上dock区域的大小和其包含的应用永远不变,且一直显示在当前界面上。因此,dock区域可以便于用户快速启动其中的应用图标对应的应用。当然,对于一些能够显示负一屏的电子设备而言,当用户将该电子设备的界面切至负一屏的时候,dock区域可以不显示在负一屏上。
此外,用户可以长按任意一个程序图标,将它拖入dock区域中。在传统的电子设备中,dock区域的大小是不可调节的,因此,dock区域中应用图标的数量也有限制,一般不能超过5个。但是在本申请提供的电子设备中,用户可以通过例如上拉、下拉的操作对dock区域的大小进行调节,dock区域可以容纳的程序数量也更多,以便用户能够直接快速启动更多的应用。具体可以参考后续实施例,此处先不赘述。
需要理解的是,“dock区域“以及“dock栏”只是本申请实施例所使用的一些名称,其代表的含义在本申请实施例中已经记载,其名称并不能对本实施例构成任何限制。
(2)应用/服务卡片
应用/服务卡片(以下简称“卡片”)其实就是手机应用展示页面内容的一种新形式,可以将应用页面的内容前置到卡片上,用户通过直接操作卡片就可以达到应用的使用体验,以达到服务直达、减少体验层级的目的。卡片可以摆放到任何位置,用户可以自定义属于自己的桌面风格。
卡片常用于嵌入到其他应用中作为其界面的一部分显示(也可以使用原子化服务将应用保存到服务中心中,这种方式不需要安装应用),并支持拉起页面,发送消息等基础的交互功能。卡片使用方和提供方不要求常驻运行,在需要添加/删除/请求更新卡片时,卡片管理服务会拉起卡片提供方获取卡片信息。
卡片功能有三个特征:1.易用可见,即卡片可以让凸显服务信息的内容外露,减少层级跳转导致的用户体验问题。2.智能可选,即卡片可以展示全天可变的数据信息,支持自定义类型的服务卡片设计,用户可以自己设置卡片的样式。3.多端可变,即卡片可以适配多端设备的自适应属性,手机、手环、平板都支持服务卡片属性。
如今手机、平板、电脑等电子设备已经成为人们生活、工作中不可或缺的工具。为了满足用户在工作、生活以及娱乐上的需求,用户在设备中所安装的应用的数量不断增加。采用何种形式将这些应用呈现在电子设备的界面上,以提升用户体验,已经成为电子设备产品设计中的重要一环。
图1和图2分别展示了目前电子设备常用的两种桌面呈现形式。
图1为本申请实施例提供的一种用户界面的示意图。如图1的(A)所示,图1中(A) 示例性示出了目前电子设备上用于呈现应用菜单的主屏幕界面10。如图1中(A)所示,主屏幕界面10包括状态栏101、日历天气小工具102、应用区域103和dock区104,其中:
状态栏101中可以包括运营商的名称(例如中国移动)、时间、WI-FI图标、信号强度和当前剩余电量。
日历天气小工具102可用于指示当前时间,例如日期、星期几、时分信息等,并可用于指示天气类型,例如多云转晴、小雨等,还可以用于指示气温等信息,还可以用于指示地点。
应用区域103可以包含例如“备忘录”的图标、“录音机”的图标、“换机克隆”的图标、“计算器”的图标、“图库”的图标、“尺子”的图标、“遥控器”的图标、“联系人”的图标等,还可以包含其他应用的图标,本申请实施例对此不作限定。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备启动图标对应的应用。
dock区104可以包含例如“电话”的图标、“短信”的图标、“浏览器”的图标以及“相机”的图标,还可以包含其他应用的图标,本申请实施例对此不作限定。任何一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备启动图标对应的应用。
可以理解的,尽管未示出,在本申请实施例以及后续实施例所提供的界面中,还可以包含导航栏,该导航栏可以包括:返回按键、主界面(home screen)按键、呼出任务历史按键等系统导航键。当检测到用户点击上述返回按键时,电子设备可显示当前用户界面的上一个用户界面。当检测到用户点击上述主界面按键时,电子设备可显示用户为电子设备设定的主屏幕界面(例如主屏幕界面10)。当检测到用户点击呼出上述任务历史按键时,电子设备可显示用户最近打开过的任务。各导航键的命名还可以为其他,比如,上述返回按键可以叫Back Button,上述主界面按键可以叫Home button,上述呼出任务历史按键可以叫Menu Button,本申请对此不做限制。此外,上述导航栏中的各导航键不限于虚拟按键,也可以实现为物理按键。
图2为本申请实施例提供的另一种应用界面的示意图。如图2中的(A)所示,图2中(A)示例性示出了目前电子设备上用于呈现应用菜单的主屏幕界面20。如图2中的(A)所示,主屏幕界面20包括状态栏201、日历天气小工具202、应用区域203和dock区204,其中:
状态栏201、日历天气小工具202以及dock区204的具体功能可以参考前述对图1中的(A)的相关说明,此处不再赘述。
需要说明的是,与图1中的(A)所示的应用区域103不同,图2中的(A)所示的应用区域203可以包含应用文件夹2031和应用文件夹2032。
应用文件夹2031可以包含例如(Kugou)的图标2031A、(Tik Tok)的图标以及(Wechat)的图标等,还可以包含其他应用的图标,本申请实施例对此不作限定。此外,应用文件夹2031中的图标可以以卡片的形式显示,但是卡片的形状是固定的,例如图2中的(A)所示的(Kugou)的图标2031A,用户可以直接从当前界面上获取与该图标对应的应用的相关信息。可选的,应用文件夹2031中的图标可以根据用户对电子设备中应用的使用时间进行更新;例如,当用户使用电子设备中的应用“QQ”后,应用文件夹2031中(Wechat)的图标可以被更新为(Tencent QQ) 的图标。
应用文件夹2031中的可以包含“备忘录”的图标、“录音机”的图标、“日历”的图标等多个应用的图标,这些应用的图标可以以小图标的样式呈现在应用文件夹2031,电子设备可以相应于用于对应用文件夹2031的触摸操作,例如点击操作,将应用文件夹2031中的应用的图标以大图标的样式显示;在这些小图标以大图标显示之后,任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备启动图标对应的应用。
结合前述说明可知,同一电子设备所显示的dock区域一般被设定在电子设备的屏幕下方,其可以容纳若干个应用。无论用户如何切换屏幕界面,在屏幕界面上dock区域的大小和其包含的应用永远不变,且一直显示在当前界面上。
以图1中所示的主屏幕界面10和用户界面11为例,电子设备可以响应用户作用于主屏幕界面10上的操作,例如图1中的(A)所示的向左滑动的操作,显示用户界面11。如图1中的(B)所示,用户界面11可以包括应用区域113以及dock区114。其中:
应用区域113可以包含(Tencent QQ)的图标、(Wechat)的图标、 (Sina Weibo)的图标等,还可以包含其他应用的图标,本申请实施例对此不作限定。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备启动图标对应的应用。
dock区114包含例如“电话”的图标、“短信”的图标、“浏览器”的图标以及“相机”的图标。当用户将电子设备显示的界面从主屏幕界面10切换至用户界面11后,虽然两个界面中应用区域的大小以及应用区域所包含的应用可以不同,但是两个界面中dock区域(即dock区域104和dock区域114)的大小和其包含的应用均不变。也就是说,当用户切换屏幕界面时,在屏幕界面上dock区域的大小和其包含的应用是固定不变的,且dock区域一直显示在电子设备当前显示的界面上。
此外,用户可以长按任意一个程序图标,将它拖入dock区域中。但是在传统的电子设备中,dock区域的大小是不可调节的,因此,dock区域中应用的数量也有限制,例如dock区域中应用的数量一般不能超过5个。以图2中所示的主屏幕界面20为例,用户可以通过对应用“应用市场”的图标2033进行长按后,将其拖入dock区域204中。如图2中的(B)所示,在主屏幕界面20中,dock区域204中最后可以包括“电话”的图标、“短信”的图标、“浏览器”的图标、“相机”的图标以及应用市场的图标。此时dock区域204中可容纳的应用的数量已经饱和,在不将dock区域204中的图标移出的情况下,用户后续无法将dock区域204外的其他应用的图标拖入dock区域204中。
结合前述说明可知,图1和图2中所示的用户界面的表现形式比较单一,其每个界面展示的应用或者卡片基本是不变的。这意味着在用户对界面内容不熟悉的情况下,尤其是在电子设备中的应用繁杂的情况下,用户需要浪费大量的时间来浏览多个界面,才能找到自己希望打开或浏览的应用。此外,由于dock的特性,用户通常会将自己需要频繁使用的应用放入dock区域中,以便自己能随时快速打开这些应用。但图1和图2中所示的用户界面中dock区域所能容纳的应用的数量是有限的,在用户的常用应用大于一定数量时,大小 无法改变的dock区域也就不能满足用户的需求。这无疑浪费了用户的时间,降低了用户体验。
不难理解的,在大多数情况下,用户查看手机屏幕都是带有明确的目的性的,且该目的性可能与时间和地点以及用户的个人使用偏好等因素强相关。例如用户一般会在在早上查看日程表,用户一般会在汽车站旁打开乘车码等等。因此,针对例如图1和图2所示的固化的桌面呈现形式,本申请提供了一种电子设备,该电子设备可用于显示一种场景化桌面,该场景化桌面能够根据用户需要动态的呈现用户当前场景下的可能会使用的应用或服务卡片;且上述场景化桌面中的dock区域大小而调节,能有效减少用户寻找需要打开的应用的时间,提升用户体验。
首先,介绍本申请实施例提供的电子设备。
该电子设备以是手机、平板电脑、可穿戴设备、车载设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personaldigital assistant,PDA)或专门的照相机(例如单反相机、卡片式相机)等,本申请对该电子设备的具体类型不作任何限制。
图3示例性示出了该电子设备的结构。
如图3所示,电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110 中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB 接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本发明实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local  area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP 加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。通过NPU还可以实现本申请实施例提供的决策模型。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。在另一些实施例中,电子设备100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备100还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当电子设备100是翻盖机时,电子设备100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。电子设备100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备100附近有物体。当检测到不充分的反射光时,电子设备100可以确定电子设备100附近没有物体。电子设备100可以利用接近光传感器180G检测用户手持电子设备100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自 适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,电子设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备100对电池142加热,以避免低温导致电子设备100异常关机。在其他一些实施例中,当温度低于又一阈值时,电子设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。在一些实施例中,骨传导传感器180M也可以设置于耳机中,结合成骨传导耳机。音频模块170可以基于所述骨传导传感器180M获取的声部振动骨块的振动信号,解析出语音信号,实现语音功能。应用处理器可以基于所述骨传导传感器180M获取的血压跳动信号解析心率信息,实现心率检测功能。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备100中, 不能和电子设备100分离。
在本申请实施例中,处理器110可以结合时间信息生成程序指令,以改变显示屏194的显示信息。此外,处理器110还可以结合无线通信模块160获取的地理位置信息生成程序指令,以改变显示屏的显示信息。在一些实施例中,处理器110中的存储器还可以用于存储用户对电子设备100的历史操作信息,该历史操作信息可以包括用户电子设备中应用的浏览量,访问频率、访问时间等等,还可以包括其他维度的信息,本申请对此不作限定;处理器110可以对上述历史操作信息进行分析后生成程序指令,以改变显示屏的显示信息。
下面介绍本申请实施例提供的用户界面。
首先,介绍上述场景化桌面的具体界面表现形式,具体请参考图4。
图4为本申请实施例提供的一种场景化桌面的示意图。用户界面40(在本申请的一些实施例中,用户界面40可以被称为第一桌面)可以以卡片加应用图标的形式为用户提供在当前场景下的可能会使用的应用或服务卡片。如图4所示,用户界面40可以包括卡片区域401,应用区域402以及dock区域403。其中:
卡片区域401可以包含卡片4011、卡片4012以及卡片4013,这些卡片可以展示其对应的应用中的页面内容。例如,卡片4011为应用“视频”生成的卡片,其可以用于展示用户最近在应用“视频”中观看过的内容;卡片4012为应用“图库”生成的卡片,其可以用于展示用户拍摄的存储在应用“图库”中的图片;卡片4011为应用“游戏”生成的卡片,其可以用于展示应用“游戏”为用户推荐的内容。每一个卡片都可以响应于用户操作,使电子设备100打开与卡片对应的应用并显示相应的应用页面。
在一些实施例中,卡片区域401中的卡片还可以响应于用户操作,在电子设备100不打开卡片对应的应用的情况下,实现与卡片对应的应用所提供的某些功能。例如,卡片4012可以响应于用户对更新控件4012A的点击操作,更新其展示在卡片4012中的图像。再例如,卡片4012可以响应于用户的长按操作,将卡片4012中的图像分享给他人。
需要说明是,电子设备100可以响应于用户对卡片区域401中的任意一张卡片的操作,例如双指缩放操作,以调整对应卡片的形状或大小。可选的,在卡片区域401中存在多张卡片的情况下,当用户对其中某一张卡片的形状或者大小进行调整时,电子设备100可以同时对卡片区域401中的其他卡片的大小或者形状进行调整。此外,电子设备100可以响应于用户的操作,对卡片区域401中的卡片进行删除,或者将其他应用对应的卡片添加到卡片区域401中。同样,当卡片区域401中的卡片数量增加或者减少时,电子设备100可以相应地对卡片区域401中的卡片的大小或者形状进行调整,为用户提供更好的视觉效果。具体可以参考后续实施例的相关说明,这里先不赘述。
应用区域402,可以包含例如“备忘录”的图标、“录音机”的图标、“换机克隆”的图标、“计算器”的图标、“图库”的图标、“尺子”的图标、“遥控器”的图标、“联系人”的图标等,还可以包含其他应用的图标,本申请实施例对此不作限定。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备启动图标对应的应用。
dock区域403,可以包含例如“电话”的图标、“短信”的图标、“浏览器”的图标以及“相机”的图标。当用户将电子设备100显示的界面从用户界面40切换至其他界面之后, dock区域403以及其中包含的应用的图标可以一直显示电子设备100当前显示的界面上。
与传统的dock区域不同的是,dock区域403的区域大小是可以调节的。例如,dock区域403可以包含调节控件403A,电子设备100可以响应于用户对调节控件403A的操作,例如上拉操作,将dock区域403的面积调大,相应的,dock区中所包含的应用的数量也可以增加,具体可以参考后续实施例的相关说明,此处先不赘述。
另外,在本申请中,电子设备100可以根据时间变化、地点变化、事件的前后关系以及用户习惯自动更新卡片区域401中的卡片以及应用区域402中的图标。需理解,这里所说的更新不仅仅限于对卡片的形状、内容的变化,还可以包括将卡片区域401中某个应用对应的卡片更新为其他应用对应的卡片,具体可以参考后续实施例的相关说明。
可以理解的,用户界面40中卡片区域401,应用区域402以及dock区域403的大小和位置关系还可以表现为其他形式,用户界面40仅仅示例性示出了本申请实施例提供的场景化桌面,其不应构成对本申请实施例的限定。
接下来介绍电子设备100开启场景化桌面以及将场景化桌面设定为主屏幕界面所涉及的用户界面。请参考图5和图6。
图5示出了电子设备100在出厂后用户首次使用时,电子设备100响应于用户操作,将桌面设置为本申请提供的场景化桌面(例如前述说明中的用户界面40)的过程。
如图5所示,用户界面50可以为在电子设备100出厂后,用户首次将电子设备100开机后,电子设备100显示的界面。用户界面50可以包括界面缩略图501、界面缩略图502、以及“进入新桌面”控件503。其中,界面缩略图501为本申请提供的场景化桌面的缩略图,界面缩略图502为普通桌面的缩略图。两个缩略图的下方分别设有一个选择控件,即选择控件501A和选择控件502A,任一个选择控件均可以响应于用户的点击操作,使电子设备100将其对应的界面缩略图所显示的桌面设定为电子设备100的主屏幕界面。例如,电子设备100可以响应于图5中用户对选择控件501A的点击操作,将界面缩略图501勾选后,电子设备100可以响应于用户对“进入新桌面”控件503的点击操作,显示界面缩略图501对应的场景化桌面(例如前述说明中的用户界面40)。
图6示出了电子设备100响应于用户操作,将主屏幕界面由普通桌面设置为本申请提供的场景化桌面(例如前述说明中的用户界面40)的过程。
如图6所示,图6中的(A)所示的用户界面60为电子设备100的主屏幕界面,结合前述对图1和图2的相关说明,可以看出,用户界面60为传统的普通桌面。用户界面60中所包含的内容的具体作用可以参考前述对主屏幕界面10或者主屏幕界面20的相关说明,此处不再赘述。
如图6中的(A)所示,电子设备100可以响应于用户对用户界面60的操作,例如图6中的(A)所示的双指缩放操作,显示如图6中的(B)所示的用户界面61。如图6中的(B)所示,用户界面61可以包括:界面缩略图611、界面缩略图612、界面缩略图613、界面缩略图614以及功能选项区615。其中:
界面缩略图611为本申请提供的场景化桌面的缩略图,界面缩略图612-614为普通桌面的缩略图。界面缩略图613-614由于电子设备100的屏幕面积限制未能完全显示,用户 可以通过下拉操作使界面缩略图613-614完全显示在屏幕中。
功能选项区615可以包括壁纸选项615A、桌面卡片选项615B、切换效果选项615C以及桌面设置选项615D。其中,壁纸选项615A可以用于对电子设备100的桌面壁纸进行设定;桌面卡片选项615B可用于向电子设备100的桌面上添加应用对应的卡片;切换效果选项615C可以在电子设备100的界面列表中界面数量大于1的情况下,对界面与界面之间的切换效果进行设定,例如翻转切换,旋转切换等;桌面设置选项615D可用于对电子设备100的桌面中的应用的图标大小、桌面布局等参数进行设定。
在用户界面60中,每个界面缩略图中的下方均存在一个选择控件,例如界面缩略图611下方的选择控件611B和界面缩略图612下方的选择控件612B。当某个选择控件被勾选(即该选择控件处于深色状态,例如选择控件611B)时,即表示该选择控件对应的界面缩略图所显示的界面将被加入电子设备100的界面列表中。此外,任一个选择控件均可以响应于用户的点击操作,使电子设备100将其对应的界面缩略图所显示的用户界面加入或者移出电子设备100的界面列表中。例如,电子设备100可以响应于用户对选择控件611B的点击操作,将界面缩略图611A所显示的场景化界面加入电子设备100的界面列表中。此外,在用户界面61中,每个界面缩略图中的上方区域可以均包含一个主界面控件,例如界面缩略图611中的主界面控件611A以及界面缩略图612中的主界面控件612A。当某个主界面控件被选中(即该主界面控件处于深色状态,例如选择控件612A)时,即表示该主界面控件对应的界面缩略图所显示的界面将设定为电子设备100的主屏幕界面。任意一个界面缩略图上的主界面控件均可以响应于用户点击操作,使电子设备100将该界面缩略图对应的界面设定为电子设备100的主屏幕界面。需要说明的是,在用户界面60中,主界面控件611A的具体样式(即外观)可以区别于主界面控件612A以及其他界面缩略图中的主界面控件,以提示用户主界面控件611A对应界面缩略图为场景化桌面的缩略图。需理解,电子设备100还可以使用其他的方式将场景化桌面的缩略图与普通桌面的缩略图进行区分,本申请对此不作限定。
如图6中的(C)所示,电子设备100可以响应于用户对主界面控件611A的点击操作,将界面缩略图611A所显示的场景化界面设定为电子设备100的主屏幕界面;此时,如图6中的(D)所示,主界面控件611A处于被选中状态,主界面控件612A由历史的被选中状态更新为未选中状态,即界面缩略图612所显示的界面不再被设定为电子设备100的主屏幕界面。最后,电子设备100可以相应于用户操作,例如图6中的(D)所示的用户对界面61中空白区域的点击操作,将界面缩略图501对应的场景化桌面(例如前述说明中的用户界面40)设定为电子设备100的主屏幕界面。
结合前述说明可知,在本申请提供的场景化桌面中,dock区域的大小是可以调节的,相应的,dock区中所包含的应用图标的数量也可以随着dock区域的增大而增加。图7所示出的用户界面示出了电子设备100响应于用户操作后,对dock区域的大小进行调节,并随着dock区域大小的变化适应性调整界面中卡片形状的过程。
如图7中的(A)所示,用户界面70(在本申请的一些实施例中,用户界面70可以被称为第一桌面)为本申请实施例所提供的场景化桌面的一种表现形式。用户界面70可以包 括卡片区域701,应用区域702以及dock区域703(在本申请的一些实施例中,dock区域703可以被称为第一dock区域)。其中,各个区域所包含的具体内容以及各个区域的作用具体可以参考前述对图4中用户界面40的相关说明,这里不再赘述。
dock区域703可以包含调节控件703A,电子设备100可以响应于用户对调节控件703A的操作,例如图7中的(A)所示的上拉操作(在本申请的一些实施例中,该上拉操作可以被称为第一操作),将dock区域703的面积调大,相应的,dock区域703中所包含的应用图标的数量也可以增加,其增加的应用图标种类可以是在电子设备100出场之前即设定好的。如图7中的(B)所示的用户界面71(在本申请的一些实施例中,用户界面71可以被称为第三桌面),相比于图7中的(A)所示的dock区域703,图7中的(B)所示出的dock区域713(在本申请的一些实施例中,dock区域713可以被称为第二dock区域)的面积变大,在dock区域713变大之后,dock区域713还增加了“音乐”的图标7131、“电子书”的图标7132、“邮件”的图标7133以及“手机主题”的图标7134。
当用户对调节控件703A再次进行如图7中的(B)所示的上拉操作时,dock区域713的面积也将继续增大,dock区域713中所包含的应用图标的数量也将继续增加。如图7中的(C)所示,相比于图7中的(B)所示的dock区域713,图7中的(C)所示出的dock区域723的面积更大,在dock区域713变大为dock区域723之后,dock区域723还增加了“时钟”的图标7235、“日历”的图标7236、“设置”的图标7237以及“安全中心”的图标7238。
需要说明的是,在本申请提供的场景化桌面中,当dock区域面积的变化时,场景化桌面中卡片区域和应用区域的面积也会相应的改变。因此,在一个可选的实施方式中,当dock区域723面积的变化时,电子设备100可以适应性地更改场景化桌面中卡片区域中的卡片的数量、形状,也可以适应性的增加或删减应用区域中应用的图标。
例如,相比于图7中的(A)所示的dock区域703,图7中的(B)所示的dock区域713的面积变大了,这也意味着相比于图7中的(A)所示的应用区域702,图7中的(B)所示的应用区域712的面积更小。因此,在电子设备100将图7中的(A)所示的dock区域703调整为图7中的(B)所示的dock区域713之后,电子设备100可以不再将“图库”的图标、“尺子”的图标、“遥控器”的图标以及“联系人”的图标显示在应用区域中。具体的,当电子设备100将图7中的(A)所示的dock区域703调整为图7中的(B)所示的dock区域713之后,电子设备100可以删除dock区域702中“图库”的图标、“尺子”的图标、“遥控器”的图标以及“联系人”的图标删除;或者,电子设备100可以不删除dock区域702中“图库”的图标、“尺子”的图标、“遥控器”的图标以及“联系人”的图标,而是将其隐藏在dock区域713的下方,即通过dock区域713覆盖住“图库”的图标、“尺子”的图标、“遥控器”的图标以及“联系人”的图标,本申请对此不作限定。
由于在图7中的(B)所示的用户界面71以及图7中的(C)所示的用户界面72中,dock区域703或dock区域701的面积还未能影响卡片区域的面积,因此,相比于图7中的(A)所示的卡片区域701,图7中的(B)所示的卡片区域711以及图7中的(C)所示的卡片区域721中的卡片的显示方式均未发生变化。但是,当用户对调节控件703A持续进行如图7中的(C)所示的上拉操作,以至将dock区域723面积增大至某个阈值后,dock 区域的面积将会对卡片区域的面积造成影响,则电子设备100可以适应性地更改场景化桌面中卡片区域中的卡片的数量、形状、位置等参数。如图7中的(D)所示的用户界面73,当场景画桌面中的dock区域723增大至如图7中的(D)所示的dock区域733后,则相比于图7中的(A)、(B)和(C)所示的卡片区域701、711以及721,图7中的(D)所示的卡片区域731的面积更小,则电子设备100可以将卡片区域721中各卡片的大小、形状和位置由图7中的(C)所示出的样式调整为图7中的(D)所示的样式。
可以理解的,图7仅示例性地体现了电子设备100对场景化界面各个区域进行调整的过程,电子设备100对场景化界面各个区域进行调整的过程还可以表现为其他形式,其不应构成对本申请实施例的限定。
此外,用户可以对卡片区域中的卡片进行删除、添加以及调整卡片形状大小的操作。相应的,当用户删除卡片区域的某张卡片、将其他应用对应的卡片加入卡片区域或者调整某张卡片的形状大小后,为保证用户的浏览体验,电子设备100也可以适应性地调整卡片区域其他卡片的形状、大小以及在卡片区域中的具体位置。
图8-图10分别示例性的展示了用户在调整某张卡片的形状大小、删除卡片区域的某张卡片以及向卡片区域增加卡片后,电子设备100适应性地调整卡片区域其他卡片的形状、大小以及在卡片区域中的具体位置的具体过程。
如图8中的(A)所示,用户界面80的卡片区域中包含卡片801、卡片802以及卡片803。电子设备可以响应于用户对卡片803(在本申请的一些实施例中,卡片803可以被称为第八卡片)的操作,例如图8中的(A)所示的双指放大缩小(在本申请的一些实施例中,该双指缩小操作可以被称为第四操作),对卡片803进行缩小。
在卡片803被缩小之后的用户界面80可以参考图8中的(B)。在图8中的(B)所示出的用户界面81(在本申请的一些实施例中,用户界面81可以被称为第六桌面)中,卡片813即为卡片803被缩小之后所得的卡片。由于卡片803被缩小,电子设备100适应性的放大了卡片802得到了8中的(B)所示的卡片812。相应的,卡片802以及卡片803的形状以及其在卡片区域中的位置也发生了变化。
此外,电子设备100可以响应于用户对图4所示的界面40中所包含的卡片4013(在本申请的一些实施例中,卡片4013可以被称为第四卡片)的长按操作,显示图9中的(A)所示的用户界面90。如图9中的(A)所示,用户界面90可以包括卡片901、卡片902、卡片903以及控件卡片9031,其中:
卡片901即对应于用户界面40中的卡片4011;卡片902即对应于用户界面40中的卡片4012;卡片903即对应于用户界面40中的卡片4013;
控件卡片9031可以包含移除控件9031A。
电子设备100可以响应于用户对移除控件9031A的操作,例如图9中的(A)所示的点击操作(在本申请的一些实施例中,前述对卡片4013的长按操作以及图9中的(A)所示的点击操作可以一起被称为第二操作),将卡片903从用户界面90中的dock区域中删除,并显示如图9中的(B)所示的用户界面91(在本申请的一些实施例中,用户界面91可以被称为第四桌面)。如图9中的(B)所示,用户界面91中可以包括卡片911、卡片912(在 本申请的一些实施例中,卡片911-912可以被称为至少一个第五卡片),其中:
卡片911即对应于用户界面90中的卡片901;卡片912即对应于用户界面40中的卡片902;
由于卡片903被删除,电子设备100适应性地放大了卡片901以及卡片902,并相应地调整了卡片802以及卡片803的形状以及其在卡片区域中的位置。
另外,电子设备100还可以响应于用户对图4所示的界面40的长按操作,显示图10中的(A)所示的用户界面92。如图10中的(A)所示,用户界面92可以包括卡片921、卡片922、卡片923以及控件卡片9231,其中:
卡片921即对应于用户界面40中的卡片4011;卡片922即对应于用户界面40中的卡片4012;卡片923即对应于用户界面40中的卡片4013;
控件卡片9231可以包含更多卡片控件9231B。
电子设备100可以响应于用户对更多卡片控件9231B的操作,例如图10中的(A)所示的点击操作(在本申请的一些实施例中,前述对卡片4013的长按操作和图10中的(A)所示的点击操作可以被一起被称为第三操作),将其他的卡片加入到卡片区域中,并显示如图10中的(B)所示的用户界面93(在本申请的一些实施例中,用户界面93可以被称为第五桌面)。如图10中的(B)所示,用户界面93中可以包括卡片931、卡片932、卡片933以及卡片934(在本申请的一些实施例中,卡片934可以被称为第六卡片),其中:
卡片931即对应于用户界面92中的卡片921;卡片932即对应于用户界面40中的卡片922;卡片933即对应于用户界面40中的卡片932;卡片934电子设备新添加在卡片区域中的卡片,其为应用生成的卡片(在本申请的一些实施例中,卡片921-934可以被称为至少一个第七卡片)。
由于卡片区域新增了卡片934,电子设备100适应性地减小了卡片923,并相应地调整了卡片923的形状以及其在卡片区域中的位置。
可以理解的,图8-图10仅示例性地体现了在电子设备100对场景化界面卡片进行调整的过程,电子设备100对场景化界面卡片进行调整还可以包括其他情况,也可以表现为其他形式,本申请对此不作限定。例如,在删除卡片后,电子设备100可以不放大剩余的所有卡片,而是放大剩余卡片中的部分卡片,甚至缩小剩余卡片中得到一些卡片,这里不再一一举例说明。
由前述说明可知,在本申请中,电子设备100可以根据时间变化、地点变化、事件的前后关系以及用户习惯等参数自动更新场景化桌面中的内容,此外,电子设备100可以根据其他参数的改变自动更新场景化桌面中的内容,本申请对此不作限定。该内容可以包括卡片区域中的卡片以及应用区域中的图标。接下来结合图11-图13对电子设备100更新场景化桌面中的内容的实施例进行介绍。
一、电子设备100根据时间更新场景化桌面中的内容
可以理解的,用户在日常的一天中,其日程大概率是有规律的。例如,对于大多数人而言,早上7:00为起床时间,上午8:00-9:00需要离开家坐车或者开车赶往公司,晚上6:00之后下班回家。因此,在本申请实施例中,电子设备100可以结合用户习性,在不同时间 对场景化桌面的内容进行更新,以便于用户在能在更为迅速的找到在当前时间需要打开的应用,具体请参考图11。
如图11中的(A)所示,用户界面01(在本申请的一些实施例中,用户界面01可以被称为第一桌面)为本申请实施例提供的一种场景化桌面,其包含的内容为电子设备100根据用户习惯,在上午8:00用户出门上班时为用户推荐的。用户界面01中包括卡片区域011、应用区域012以及dock区域013,其中:
卡片区域011可以包含卡片0111、卡片0112以及卡片0113(在本申请的一些实施例中,卡片0111-0113可以被称为至少一个第一卡片)。其中,卡片0111为应用“日历”生成的日程提醒卡片,用于展示用户预先设定的日程信息;卡片0112为应用“地图”生成的路线导航卡片,用于展示为用户规划到达目的地的路线和时长等信息;卡片0113为应用“音乐”生成的播放器卡片,为用户展示当前播放歌曲的信息,并且为用户提供切换歌曲、播放歌曲、暂停播放歌曲等功能。每一个卡片都可以响应于用户操作,使电子设备100打开与卡片对应的应用并显示相应的应用页面。
应用区域012可以包含“时钟”的图标、的图标、“地图”的图标以及“音乐”的图标。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用。
dock区域013可以包含例如“电话”的图标、“短信”的图标、“浏览器”的图标以及“相机”的图标,还可以包含其他应用的图标,本申请实施例对此不作限定(在本申请的一些实施例中,应用区域012以及dock区域013中全部的应用可以被称为至少一个第一应用)。
任何一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用。dock区域013可以包含调节控件,电子设备100可以响应于用户对调节控件的操作,将dock区域013的面积调大,相应的,dock区域013中所包含的应用图标的数量也可以增加,其增加的应用图标种类可以是在电子设备100出场之前即设定好的,具体可以参考前述实施例中对图7的相关说明。此外,在后续实施例所示出的dock区域均可参考前述实施例中对图7的相关说明。
可以理解的,上午8:00左右一般为用户出门上班的时间,此时用户可能需要开车或者搭车前往工作地点;在到达工作地点之前,用户可能还需要查看当天的需要处理的日程;此外,在用户上车之后或者赶往车站的过程中,用户可能会想听音乐,或者通过社交软件与他人进行交流的需求。因此,在上午8:00时用户出门上班时,电子设备100可以为用户展示用户界面01,以满足用户在此时的需求。
当时间到上午8:30,用户可能正处在上班通勤的路上。此时,电子设备100可以对场景化桌面中的卡片区域、应用区域中所显示的内容进行更新。即电子设备100可以将场景化桌面由图11中的(A)所示的用户界面01更新为图11中的(B)所示出的用户界面02(在本申请的一些实施例中,用户界面02可以被称为第二桌面)。如图11中的(B)所示,用户界面02可以包括卡片区域021-023、应用区域024-025以及dock区域026,其中:
卡片区域021-023(在本申请的一些实施例中,卡片021-023中全部的卡片可以被称为至少一个第二卡片)可以被统称为用户界面02的卡片区域,应用区域024-025可以被统称为用户界面02的应用区域。也就是说,在本申请中,场景化桌面的卡片区域或应用区域可 以被划分为多个,同一界面中的多个卡片区域可以不呈现为连接状态,同一界面中的多个应用区域也可以不呈现为连接状态,而是像用户界面02所示的呈现为交错排布的状态。
卡片区域021可以包含应用“微信”生成的乘车码卡片,用于展示用户此时所处位置附近地跌站点的信息,以及为用户提供快速打开地铁乘车码的功能;卡片区域可以022包含应用“每日要闻”生成的新闻卡片,用于展示当日的时事新闻;卡片区域023可以包含应用“股票”生成的股票行情卡片,为用户展示当日的股市行情走势。每一个卡片都可以响应于用户操作,使电子设备100打开与卡片对应的应用并显示相应的应用页面。
应用区域024可以包含的图标、的图标、“小说阅读”的图标以及抖音的图标;应用区域025可以包含“收音机”的图标、的图标、的图标以及“微信”的图标。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用。
dock区域026可以包含例如“电话”的图标、“短信”的图标、“浏览器”的图标以及“相机”的图标,还可以包含其他应用的图标,本申请实施例对此不作限定(在本申请的一些实施例中,应用区域024-025以及dock区域026中全部的应用可以被称为至少一个第二应用)。任何一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用,具体可以参考前述对dock区域013的相关说明。
可以理解的,上午8:30左右用户一般正处于上班路上,此时用户可能需要搭乘地铁或公交;在地铁和公交上,用户可能会浏览或者收听当天的新闻、股票资讯或者打开抖音、豆瓣、小说阅读等娱乐软件观看视频或者文章。因此,在上午8:30用户在上班通勤路上时,电子设备100可以为用户展示用户界面02,以满足用户在此时的需求。
在晚上20:00,用户已经下班回到家中。此时,电子设备100可以再次对场景化桌面中的卡片区域、应用区域中所显示的内容进行更新。即电子设备100可以将场景化桌面由图11中的(B)所示的用户界面01更新为图11中的(C)所示出的用户界面03。如图11中的(C)所示,用户界面02可以包括卡片区域031-033、应用区域035-036以及dock区域036,其中:
卡片区域021-023可以被统称为用户界面02的卡片区域,应用区域023-024可以被统称为用户界面02的应用区域。也就是说,在本申请中,场景化桌面的卡片区域或应用区域可以被划分为多个,同一界面中的多个卡片区域可以不呈现为连接状态,同一界面中的多个应用区域也可以不呈现为连接状态,而是像用户界面02所示的呈现为交错排布的状态。
卡片区域031可以包含应用“视频”生成的卡片,用于向用户推荐当前热播的电视剧或者用户最近观看的电视剧,应用“图库”生成的卡片,用于向用户展示用户历史拍摄的照片,应用“游戏”生成的卡片,用于向用户推荐当前热门的游戏,以及应用生成的卡片,用于向用户推荐热销商品或者优惠商品;卡片区域032可以包含应用“运动健康”生成的卡片,用于展示用户当日运动量;卡片区域033可以包含应用生成的卡片,为用户推荐用户可能喜欢的视频或者向用户展示提示信息,例如用户所关注的账号有内容更新等。每一个卡片都可以响应于用户操作,使电子设备100打开与卡片对应的应用并显示相应的应用页面。
应用区域034可以包含“游戏”的图标、“视频”的图标;应用区域035可以包含“Yoga” 的图标、的图标。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用。
dock区域036可以包含例如“电话”的图标、“短信”的图标、“浏览器”的图标以及“相机”的图标,还可以包含其他应用的图标,本申请实施例对此不作限定。任何一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用,具体可以参考前述对dock区域013的相关说明。
可以理解的,当用户结束工作回到家中时,用户可能习惯在此时间段中进行一系列的娱乐活动。例如,用在此时可能会打游戏、网上购物、观看电视剧,或者进行瑜伽、跑步等运动。因此,在晚上20:00用户下班到家时,电子设备100可以为用户展示用户界面03,以满足用户在此时的需求。
需要说明的是,本申请实施例仅示例性地体现了在电子设备100根据时间对场景化桌面的内容进行调整的过程,其不应构成对本申请的限定。例如,电子设备100还可以在一天中的其他时刻更新场景化桌面的内容,例如在中午12:00,电子设备100可以更新场景化桌面中的内容,为用户展示应用“美团”所生成的卡片。再比如,在非工作日,用户无需早起上班,则电子设备100在早上8:00所展示的场景化桌面中,其包含的内容可以与用户界面01所包含的内容不同。此外,由于不同的用户的个人习惯不同,同一用户的个人习惯也可能会发生改变,因此,在用户日常使用电子设备的过程中,电子设备100可以实时记录埋点数据,该埋点数据可以包括用户最近常用的应用以及打开该应用的时间,电子设备100可以对新获取的埋点数据进行分析后,根据用户最新的使用习惯对场景化桌面的内容进行更新,为用户提供更好的使用体验。
二、电子设备100根据地点更新场景化桌面中的内容
在本实施例中,电子设备100可以为具备定位GPS定位功能的电子设备。因此,电子设备100可以实时获取当前所处环境的位置信息,并结合该位置信息,在不同地点对场景化桌面的内容进行更新,以便于用户在能在更为迅速的找到在当前地点需要打开的应用,具体请参考图12。
如图12中的(A)所示,用户界面04为本申请实施例提供的一种场景化桌面,其包含的内容为电子设备100在用户靠近公交站时为用户推荐的。用户界面04中包括卡片区域041、应用区域042以及dock区域043,其中:
卡片区域041可以包含卡片0411、卡片0412以及卡片0413。其中,卡片0411为应用“地图”生成的交通推荐卡片,用于展示用户当前所处公交站的相关公交路线的信息;卡片0412为应用生成的卡片,用于展示用户此时所处位置附近公交站点的信息,以及为用户提供快速打开地铁乘车码的功能;卡片0413为应用“音乐”生成的播放器卡片,为用户展示当前播放歌曲的信息,并且为用户提供切换歌曲、播放歌曲、暂停播放歌曲等功能。每一个卡片都可以响应于用户操作,使电子设备100打开与卡片对应的应用并显示相应的应用页面。
应用区域042可以包含的图标、的图标、“地图”的图标以及“音乐”的图标。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用。
dock区域043可以包含例如“电话”的图标、“短信”的图标、“浏览器”的图标以及“相机”的图标,还可以包含其他应用的图标,本申请实施例对此不作限定。任何一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用,具体可以参考前述对dock区域013的相关说明。
可以理解的,在公交站或者地铁站旁,用户可能需要搭乘地铁或公交,因此,电子设备100可以向用户展示与附近站点相关的线路信息,以及为用户提供快速打开乘车码的功能卡片;此外在地铁和公交上,用户可能想要收听音乐或者浏览短视频。因此,当用户靠近公交站或者地铁站旁,电子设备100可以为用户展示用户界面04,以满足用户在搭乘公交或者地铁时的需求。
当用户到达电影院旁边,电子设备100可以对场景化桌面中的卡片区域、应用区域中所显示的内容进行更新。即电子设备100可以将场景化桌面由图12中的(A)所示的用户界面04更新为图12中的(B)所示出的用户界面05。如图12中的(B)所示,用户界面05可以包括卡片区域051、应用区域052-053以及dock区域054,其中:
卡片区域051可以包含卡片0511、卡片0512以及卡片0513。其中,卡片0511为应用生成的卡片,用于展示当前热映的电影信息,并为用户提供快速购票的通道;卡片0512为应用生成的卡片,用于向用户展示大众评选出来的高分电影,为用户提供选片参考;卡片0513为应用“地图”生成的卡片,为用户展示附近电影院的所处地点的信息,以及该电影院目前提供的各项活动以及该电影院附近的娱乐项目。每一个卡片都可以响应于用户操作,使电子设备100打开与卡片对应的应用并显示相应的应用页面。
应用区域052可以包含的图标、的图标;应用区域053可以包含的图标、的图标。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用。
dock区域054可以包含例如“电话”的图标、“短信”的图标、“浏览器”的图标以及“相机”的图标,还可以包含其他应用的图标,本申请实施例对此不作限定。任何一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用,具体可以参考前述对dock区域013的相关说明。
可以理解的,在用户靠近电影院时,用户可能存在看电影的娱乐需求。因此,电子设备100可以向用户展示最近上映的电影的相关信息,为用户提供一些大众评选出来的高分电影,在用户选择电影的时候提供参考信息;此外,基于用户的购票需求,电子设备100可以为用户显示用于购票的应用或者应用生成的卡片,例如应用“淘票”或者“微信”等。因此,当用户靠近电影院周围时,电子设备100可以为用户展示用户界面05,以满足用户的观影需求。
当用户走近快递站点(例如快递柜或者快递超市等)时,电子设备100可以再次对场景化桌面中的卡片区域、应用区域中所显示的内容进行更新。即电子设备100可以将场景化桌面由图12中的(B)所示的用户界面05更新为图12中的(C)所示出的用户界面06。如图12中的(C)所示,用户界面06可以包括卡片区域061、应用区域062以及dock区域063,其中:
卡片区域061可以包含卡片0611,卡片0611为应用生成的卡片,用于 展示用户当前未取出的快件信息,例如快件信息的取件地点、取件码和运单号等等。卡片0611可以响应于用户操作,使电子设备100打开与卡片对应的应用并显示相应的应用页面。
应用区域062可以包含的图标、的图标、的图标以及的图标。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用。
dock区域063可以包含例如“电话”的图标、“短信”的图标、“浏览器”的图标以及“相机”的图标,还可以包含其他应用的图标,本申请实施例对此不作限定。任何一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用,具体可以参考前述对dock区域013的相关说明。
可以理解的,在用户靠近快递站点时,电子设备100可以向用户展示用户未取出的快件信息;同时,电子设备100还可以为用户展示出可用于查看快件详细信息的应用,例如应用或者等,以满足用户的快速取件的需求。
三、电子设备100根据事件前后关系更新场景化桌面中的内容
可以理解的,用户前后两次查看电子设备的目的之前可能存在联系。例如,当用户在使用应用“地图”搜寻附近的电影院后,用户下一次查看电子设备时,可能希望打开用程序或者购买电影票。因此,在本实施例中,电子设备100可以根据用户上一次使用电子设备时浏览过的应用或者卡片,对场景化桌面的内容进行更新,为用户推荐符合用户当前使用需求的卡片或者应用,具体请参考图13。
如图13中的(A)所示,用户界面07为本申请实施例提供的一种场景化桌面,其包含的内容为电子设备100识别到用户当日日程中包含“看电影”的日程时为用户推荐的。用户界面07中包括卡片区域071、应用区域072以及dock区域073,其中:
卡片区域071可以包含卡片0711以及卡片0712。其中,卡片7111为应用“日历”生成的日程卡片,用于展示用户预先设定的日程信息;卡片7111为应用“地图”生成的推荐卡片,其可以是电子设备100识别用户预先设定的日程信息,即卡片7111中显示的“09:00-11:00和小美一起看电影”这一日程信息,并基于该日程信息为用户展示出的卡片,其可以用于为用户推荐用户所在地点周围的电影院或影城等娱乐场所。每一个卡片都可以响应于用户操作,使电子设备100打开与卡片对应的应用并显示相应的应用页面。
应用区域072可以包含的图标、的图标、“地图”的图标以及“音乐”的图标。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用。
dock区域743可以包含例如“电话”的图标、“短信”的图标、“浏览器”的图标以及“相机”的图标,还可以包含其他应用的图标,本申请实施例对此不作限定。任何一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用,具体可以参考前述对dock区域013的相关说明。
可以理解的,电子设备100识别用户预先设定“09:00-11:00和小美一起看电影”这一日程信息之后,在该日程的开始时间(即9:00)之前,电子设备100可以基于提醒用户该日程的开始时间,并基于该日程为在场景化界面中显示与该日程相关的应用或者卡片,例如显示卡片0711,向用户展示附近的电影院信息以便用户为日程做出相应的规划。
在用户对用户界面70中的卡片或者应用进行操作,例如图13中的(A)所示的用户对卡片0712的点击操作(在本申请的一些实施例中,对卡片0712的点击操作可以被称为第五操作)之后,电子设备100即可以响应该操作,判定用户近期有前往电影院的计划,则相应的,电子设备100可以对场景化桌面中的卡片区域、应用区域中所显示的内容进行更新,即电子设备100可以将场景化桌面由图13中的(A)所示的用户界面07更新为图13中的(B)所示出的用户界面08。
如图13中的(B)所示,用户界面08可以包括卡片区域081、应用区域082以及dock区域083,其中:
卡片区域081可以包含卡片0811、卡片0812、卡片0513以及卡片0814。其中,卡片0812为应用“日历”生成的卡片,用于展示用户预先设定的日程信息,从界面08中可以看出,此时时间为8:30,距“与小美一起看电影”这一日程的开始时间还有30分钟,为突出该日程的重要性以及提醒用户该日程即将开始,卡片0811中的所展示的信息和卡片0711中展示的信息是不同的;卡片0812为应用“地图”生成的卡片,为用户展示附近附件电影院的所处地点的信息以及前往该电影院的路线图,以便用户确定出发时间;卡片0813为应用生成的播放器卡片,用于在用户前往电影院的路程中为用户播放音乐;卡片0814为应用生成的卡片,用于向用户展示大众评选出来的高分电影,为用户提供选片参考。每一个卡片都可以响应于用户操作,使电子设备100打开与卡片对应的应用并显示相应的应用页面。
应用区域052可以包含的图标、的图标。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用。
dock区域083可以包含例如“电话”的图标、“短信”的图标、“浏览器”的图标以及“相机”的图标,还可以包含其他应用的图标,本申请实施例对此不作限定。任何一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用,具体可以参考前述对dock区域013的相关说明。
可以理解的,在电子设备100判定用户有前往电影院的计划之后,在用户前往电影院之前,电子设备100可以向用户展示相附近影院的位置信息,并展示最近上映的电影的相关信息,为用户提供一些大众评选出来的高分电影,例如显示卡片0812,向用户展示附近的电影院信息,以便用户为日程做出相应的规划。
在用户对用户界面80中的卡片或者应用进行操作,例如图13中的(B)所示的用户对卡片0814的点击操作(在本申请的一些实施例中,对卡片0814的点击操作可以被称为第六操作)之后,电子设备100即可以响应该操作,认为用户近期有观看电影的计划,则相应的,电子设备可以对场景化桌面中的卡片区域、应用区域中所显示的内容进行更新,即电子设备100可以将场景化桌面由图13中的(B)所示的用户界面08更新为图13中的(C)所示出的用户界面09。
如图13中的(C)所示,用户界面09可以包括卡片区域091、应用区域092以及dock区域093,其中:
卡片区域091可以包含卡片0911、卡片0912以及卡片0913。其中,卡片0911为应用“日历”生成的卡片,用于展示用户预先设定的日程信息,从界面09中可以看出,此时时 间为08:50,距“与小美一起看电影”这一日程的开始时间还有10分钟,即该用户可能正要执行该日程;卡片0912为应用生成的卡片,用于展示当前热映的电影信息,并为用户提供快速购票的通道;卡片0913为应用生成的卡片,用于向用户展示当天各电影院的开展的观影活动信息。每一个卡片都可以响应于用户操作,使电子设备100打开与卡片对应的应用并显示相应的应用页面。
应用区域092可以包含的图标、的图标、的图标以及 的图标。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用。
dock区域063可以包含例如“电话”的图标、“短信”的图标、“浏览器”的图标以及“相机”的图标,还可以包含其他应用的图标,本申请实施例对此不作限定。任何一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备100启动图标对应的应用,具体可以参考前述对dock区域013的相关说明。
可以理解的,在电子设备判定用户有观影需求之后,电子设备100可以向用户展示最近上映的电影的相关信息,此外,基于用户的购票需求,电子设备100可以为用户显示用于购票的应用或者应用生成的卡片,例如应用或者等,以满足用户的观影需求。
图14为本申请提供的一种显示方法的流程图。实施本申请实施例提供的显示方法,电子设备可以根据在当前场景下用户的实际需求,动态的呈现用户当前场景下的可能会使用的应用和服务卡片,能帮助用户更快定位到自己需要浏览的应用。
如图14所示,该显示方法可以包括以下步骤:
S101:识别第一场景。
电子设备识别上述第一场景。该电子设备可以为手机(mobile phone)、车载设备(例如车载单元(On Board Unit,OBU))、平板电脑(pad)、带显示功能的电脑(如笔记本电脑、掌上电脑等)等。具体的,该电子设备可以是本申请实施例提供的电子设备100。可理解,对于上述电子设备的具体形态,本申请不作限定。
此外,上述电子设备对根据可以为时间、地点以及事件前后关系等维度识别上述第一场景识。例如,当上述电子设备以时间为根据识别上述第一场景时,电子设备可以根据当前的时间确定用户在这个时间点习惯做的事情;当上述电子设备以地点为根据识别上述第一场景时,电子设备可以根据当前地点以及该地点的特性确定用户此刻可能正在进行的活动;当上述电子设备以地点为根据识别上述第一场景时,电子设备可以根据用户上一次对电子设备的操作(例如打开地图导航去电影院)确定用户接下来的具体目的(例如观看电影)。
S102:显示第一桌面,上述第一桌面包括多个第一应用图标和至少一个第一卡片。
在本方法中,上述第一桌面的中所显示的内容(即上述多个第一应用和上述至少一个第一卡片)是基于上述第一场景确定的。也就是说,所述多个第一应用是用户在所述第一场景下可能需要浏览或者打开的应用,所述至少一个第一卡片是用户在所述第一场景下可能需要浏览或者打开的应用生成的卡片。电子设备根据在当前场景下用户的实际需求,呈 现用户当前场景下的可能会使用的应用和服务卡片,能帮助用户更快定位到自己需要浏览的应用,节省用户的时间。
S103:识别上述第一场景变化为第二场景。
S104:显示第二桌面,上述第二桌面包括多个第二应用和至少一个第二卡片,上述多个第二应用图标与上述多个第一应用图标不同,和/或上述至少一个第二卡片与上述至少一个第一卡片不同。
同理,上述多个第二应用和上述至少一个第二卡片是基于上述第二场景确定的,且在场景变化之后,电子设备所显示的内容也会不同,即上述多个第二应用图标与上述多个第一应用图标不同,和/或上述至少一个第二卡片与上述至少一个第一卡片不同。
可以理解的,在大多数情况下,用户查看手机屏幕都是带有明确的目的性的,且该目的性可能与时间和地点以及用户的个人使用偏好等因素强相关。例如用户一般会在在早上查看日程表,在晚上打开游戏、运动健康等应用。再比如,用户一般会在汽车站旁打开乘车码、在商店打开付款码等等。因此,在本方法中,当场景变化时,电子设备的桌面所呈现的内容也会随着场景的变化而变化。这样,电子设备可以根据场景动态的呈现用户当前场景下的可能会使用的应用和服务卡片,进一步提升了用户体验。
需要说明的是,在方法中,上述电子设备判定场景由上述第一场景变化为上述第二场景的依据可以包括但不限于以下几种:
1.时间变化
可以理解的,用户在日常的一天中,其日程大概率是有规律的。例如,对于大多数人而言,早上7:00为起床时间,上午8:00-9:00需要离开家坐车或者开车赶往公司,晚上6:00之后下班回家。因此,在本实施方式中,电子设备可以结合用户习性,在不同时间对场景化桌面的内容进行更新,以便于用户在能在更为迅速的找到在当前时间需要打开的应用。具体可以参考前述对图11的相关说明,这里不再赘述。
2.地点变化
用户需要打开的应用往往和用户所处的地点有直接的联系。例如,用户一般会在汽车站旁打开乘车码、在商店打开付款码,在高速路上打开导航软件等。因此,在本实施方式中,电子设备可以实时获取当前所处环境的位置信息,并结合该位置信息,在不同地点对场景化桌面的内容进行更新,以便于用户在能在更为迅速的找到在当前地点需要打开的应用。具体可以参考前述对图12的相关说明,这里不再赘述。
3.接收到用户操作
可以理解的,用户前后两次查看电子设备的目的之前可能存在联系。例如,当用户在使用应用“地图”搜寻附近的电影院后,用户下一次查看电子设备时,可能希望打开用程序“豆瓣”或者“美团”购买电影票。因此,在本实施例中,电子设备可以根据用户上一次使用电子设备时浏览过的应用或者卡片,对场景化桌面的内容进行更新,为用户推荐符合用户当前使用需求的卡片或者应用。具体可以参考前述对图13的相关说明,这里不再赘述。
可选的,在本方法中,上述第一桌面和上述第二桌面中可以包含dock区域,用户可以通过例如上拉、下拉的操作对dock区域的大小进行调节,dock区域可以容纳的程序数量也 会随之更多。这样,用户可以将更多自己常用的应用放入dock区域中,以便用户能够直接快速启动更多的应用。可选的,当用户对dock区域的面积进行调节后,电子设备可以适应性地适应性的增加或删减dock区域外应用的图标,以为用户提供更好的视觉体验。可选的,当电子设备改变上述第一桌面或者上述第二桌面中dock区域面积时,电子设备可以适应性地更改上述第一桌面或者第二桌面中卡片的显示方式,该显示方式包括所述数量、形状以及卡片的排列方式,为用户提供更好的浏览效果。具体可以参考前述对图7的相关说明,这里不再赘述。
在一个可选的实施方式中,上述电子设备可以响应于用户操作,对上述第一桌面或上述第二桌面中卡片进行删除、增加、缩放等操作。相应的,当用户删除上述第一桌面或上述第二桌面中的某张卡片、将其他应用对应的卡片加入上述第一桌面或上述第二桌面、或者调整上述第一桌面或上述第二桌面某张卡片的形状大小后,为保证用户的浏览体验,电子设备可以适应性地调整上述第一桌面或上述第二桌面中卡其他卡片的形状、大小以及在卡片区域中的具体位置。具体可以参考前述对图8-图10的相关说明,这里不再赘述。
本申请实施例还提供了一种电子设备,该电子设备包括:一个或多个处理器和存储器;其中,存储器与所述一个或多个处理器耦合,该存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令,该一个或多个处理器调用该计算机指令以使得所述电子设备执行前述实施例中所示的方法。
上述实施例中所用,根据上下文,术语“当…时”可以被解释为意思是“如果…”或“在…后”或“响应于确定…”或“响应于检测到…”。类似地,根据上下文,短语“在确定…时”或“如果检测到(所陈述的条件或事件)”可以被解释为意思是“如果确定…”或“响应于确定…”或“在检测到(所陈述的条件或事件)时”或“响应于检测到(所陈述的条件或事件)”。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如DVD)、或者半导体介质(例如固态硬盘)等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。

Claims (14)

  1. 一种显示方法,其特征在于,应用于电子设备,所述方法包括:
    在第一场景下,显示第一桌面,所述第一桌面包括多个第一应用图标和至少一个第一卡片;
    识别所述第一场景变化为第二场景,显示第二桌面,所述第二桌面包括多个第二应用图标和至少一个第二卡片,所述多个第二应用图标与所述多个第一应用图标不同,和/或所述至少一个第二卡片与所述至少一个第一卡片不同。
  2. 根据权利要求1所述的方法,其特征在于,所述第一桌面包括第一dock区域,所述方法还包括:
    接收作用于所述第一dock区域的第一操作;
    响应于所述第一操作,显示第三桌面,所述第三桌面包括第二dock区域,所述第二dock区域的面积与所述第一dock区域的面积不同;
    在所述第二dock区域的面积大于所述第一dock区域的面积的情况下,所述第二dock区域中应用图标的数量大于所述第一dock区域中应用图标的数量;在所述第二dock区域的面积小于所述第一dock区域的面积的情况下,所述第二dock区域中应用图标的数量小于所述第一dock区域中应用图标的数量。
  3. 根据权利要求2所述的方法,其特征在于,在所述第一桌面的所述至少一个第一应用图标中,位于所述第一dock区域外的应用图标的数量为N个,在所述第三桌面中,位于所述第二dock区域外的应用图标的数量为M个;
    在所述第二dock区域的面积大于所述第一dock区域的面积的情况下,所述N大于所述M;在所述第二dock区域的面积小于所述第一dock区域的面积的情况下,所述N小于所述M。
  4. 根据权利要求2或3所述的方法,其特征在于,所述第三桌面包括至少一个第三卡片,所述至少一个第三卡片为所述至少以第一卡片中的全部卡片或者部分卡片,所述至少一个第一卡片和所述至少一个第三卡片的显示方式不同。
  5. 根据权利要求1-4任一项所述的方法,其特征在于,所述方法还包括:
    接收作用于第四卡片的第二操作,所述第四卡片为所述至少一个第一卡片中的任意一个卡片;
    响应于所述第二操作,删除所述第四卡片。
  6. 根据权利要求5所述的方法,其特征在于,在接收作用于第四卡片的第二操作之后,所述方法还包括:
    显示第四桌面,所述第四桌面包括至少一个第五卡片,所述至少一个第五卡片和所述至少一个第一卡片的显示方式不同。
  7. 根据权利要求1-6任一项所述的方法,其特征在于,所述方法还包括:
    接收作用于所述第一桌面的第三操作,
    响应于所述第三操作,生成第六卡片。
  8. 根据权利要求7所述的方法,其特征在于,在接收作用于所述第一桌面的第三操作之后,所述方法还包括:
    显示第五桌面,所述第五桌面包括至少一个第七卡片,所述至少一个第七卡片包括所述至少一个第一卡片和所述第六卡片;所述至少一个第七卡片和所述至少一个第一卡片的显示方式不同。
  9. 根据权利要求1-4任一项所述的方法,其特征在于,所述方法还包括:
    接收作用在第八卡片的第四操作,所述第八卡片为所述至少一个第一卡片中的任意一个卡片;
    响应于所述第四操作,放大或缩小所述第八卡片。
  10. 根据权利要求9所述的方法,其特征在于,在接收作用于第八卡片的第四操作之后,所述方法还包括:
    显示第六桌面,所述第六桌面包括所述至少一个第一卡片,所述第六桌面中的所述至少一个第一卡片和所述第一桌面中的所述至少一个第一卡片的显示方式不同。
  11. 根据权利要求1-10任一项所述的方法,其特征在于,所述识别所述第一场景变化为第二场景,包括:
    识别到时间从第一时刻变化为第二时刻,所述第一时刻对应所述第一场景,所述第二时刻对应所述第二场景;
    或,识别到所述电子设备所处的地点从第一地点变化为第二地点,所述第一地点对应所述第一场景,所述第二地点对应所述第二场景;
    或,在第三时刻,识别用户对所述电子设备进行第五操作,在第四时刻,识别用户对所述电子设备进行第六操作,所述用户对所述电子设备进行第五操作对应所述第一场景,所述用户对所述电子设备进行第六操作对应所述第二场景。
  12. 一种电子设备,其特征在于,所述电子设备包括:一个或多个处理器、存储器和显示屏;
    所述存储器与所述一个或多个处理器耦合,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,所述一个或多个处理器调用所述计算机指令以使得所述电子设备执行如权利要求1-11中任一项所述的方法。
  13. 一种芯片系统,所述芯片系统应用于电子设备,所述芯片系统包括一个或多个处理器,所述处理器用于调用计算机指令以使得所述电子设备执行如权利要求1-11中任一项 所述的方法。
  14. 一种计算机可读存储介质,包括指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求1-11中任一项所述的方法。
PCT/CN2023/088873 2022-07-28 2023-04-18 显示方法及电子设备 WO2024021691A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210904347.3 2022-07-28
CN202210904347.3A CN117519854A (zh) 2022-07-28 2022-07-28 显示方法及电子设备

Publications (2)

Publication Number Publication Date
WO2024021691A1 WO2024021691A1 (zh) 2024-02-01
WO2024021691A9 true WO2024021691A9 (zh) 2024-05-10

Family

ID=89705198

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/088873 WO2024021691A1 (zh) 2022-07-28 2023-04-18 显示方法及电子设备

Country Status (2)

Country Link
CN (1) CN117519854A (zh)
WO (1) WO2024021691A1 (zh)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105930038A (zh) * 2016-06-23 2016-09-07 北京金山安全软件有限公司 一种停靠栏图标的显示方法、装置及移动终端
CN108319675A (zh) * 2018-01-29 2018-07-24 出门问问信息科技有限公司 信息展示方法、装置、电子设备以及计算机存储介质
CN111182145A (zh) * 2019-12-27 2020-05-19 华为技术有限公司 显示方法及相关产品
CN113888159B (zh) * 2021-06-11 2022-11-29 荣耀终端有限公司 一种应用的功能页面的开启方法和电子设备

Also Published As

Publication number Publication date
WO2024021691A1 (zh) 2024-02-01
CN117519854A (zh) 2024-02-06

Similar Documents

Publication Publication Date Title
CN109889630B (zh) 显示方法及相关装置
WO2021213164A1 (zh) 应用界面交互方法、电子设备和计算机可读存储介质
WO2021129688A1 (zh) 显示方法及相关产品
EP3893129A1 (en) Recommendation method based on user exercise state, and electronic device
WO2020134869A1 (zh) 电子设备的操作方法和电子设备
CN110825469A (zh) 语音助手显示方法及装置
WO2021004527A1 (zh) 一种倒计时显示方法及电子设备
CN110401767B (zh) 信息处理方法和设备
WO2020259554A1 (zh) 可进行学习的关键词搜索方法和电子设备
CN114363462B (zh) 一种界面显示方法、电子设备及计算机可读介质
CN113496426A (zh) 一种推荐服务的方法、电子设备和系统
WO2023241209A1 (zh) 桌面壁纸配置方法、装置、电子设备及可读存储介质
CN111835904A (zh) 一种基于情景感知和用户画像开启应用的方法及电子设备
CN113163394B (zh) 一种情景智能服务的信息共享方法及相关装置
CN114756785A (zh) 页面显示的方法、装置、电子设备以及可读存储介质
CN111492678A (zh) 一种文件传输方法及电子设备
WO2021218837A1 (zh) 一种提醒方法及相关装置
CN115022982B (zh) 多屏协同无感接入方法、电子设备及存储介质
CN113934352B (zh) 通知消息处理方法、电子设备和计算机可读存储介质
CN117785340A (zh) 一种卡片分享的方法及装置
WO2024021691A9 (zh) 显示方法及电子设备
WO2020077503A1 (zh) 一种信息显示方法及装置
WO2024188190A1 (zh) 一种卡片显示方法及电子设备
WO2024114785A1 (zh) 一种图像处理方法、电子设备及系统
WO2023207799A1 (zh) 消息处理方法和电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23844908

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023844908

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2023844908

Country of ref document: EP

Effective date: 20240808