CN113973153B - Display method, graphical interface and related device - Google Patents


Info

Publication number
CN113973153B
Authority
CN
China
Prior art keywords
screen
electronic device
user
travel
time
Prior art date
Legal status
Active
Application number
CN202111101051.XA
Other languages
Chinese (zh)
Other versions
CN113973153A
Inventor
张沁峰 (Zhang Qinfeng)
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202111101051.XA
Publication of CN113973153A
Application granted
Publication of CN113973153B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234 Power saving characterised by the action undertaken
    • G06F1/325 Power saving in peripheral device
    • G06F1/3265 Power saving in display device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72451 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to schedules, e.g. using calendar applications
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental & Geological Engineering (AREA)
  • General Business, Economics & Management (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application discloses a display method, a graphical interface, and a related device. In the display method, an electronic device can generate a dynamic or static off-screen image according to the user's travel information, such as travel mode, position, time, date, weather, and altitude, and display the off-screen image on its display screen while the electronic device is in the off-screen state. The user can therefore quickly learn this travel information from the display screen while the screen is off. By expressing the user's travel information in the form of an image, the electronic device increases the interest of the off-screen display and provides the user with a personalized off-screen image.

Description

Display method, graphical interface and related device
Technical Field
The present application relates to the field of terminals and communications technologies, and in particular, to a display method, a graphical interface, and related devices.
Background
With the rapid development of electronic technology, electronic devices such as mobile phones and tablet computers have become indispensable tools in people's life and work. Taking the mobile phone as an example, people check their phones about 150 times a day, yet most of the time the user lights up the screen only to view the time, notifications, and similar information. On this basis the off-screen display (always on display, AOD) function of electronic devices was developed: only some pixels of the screen are lit to display information such as the clock, date, and notifications, while the other pixels stay off (for example, appear black), so that the user can conveniently check the information without lighting up the entire screen, which reduces the power consumption of the electronic device.
However, current electronic devices offer only a single display form for the off-screen display function: when the function is enabled on the terminal device, the user can view only information such as the time and notifications, which cannot meet users' growing demands.
Disclosure of Invention
The application provides a display method, a graphical interface, and a related device, which generate an off-screen image according to the user's travel and display it while the electronic device's screen is off.
In a first aspect, the present application provides a display method, the method comprising: the electronic device detects that it enters the off-screen state at a first time; the electronic device acquires a first travel mode, where the first travel mode indicates whether the user travels during a first travel time, or the vehicle used by the user during the first travel time; when the first time is within the first travel time, the electronic device acquires one or more items of information among the first time, the weather at the first time, the position of the electronic device at the first time, or the altitude of the electronic device at the first time, and generates a first off-screen image according to the one or more items of information and the first travel mode, where the first off-screen image comprises interface elements indicating the information and the first travel mode; and, in response to a screen-off instruction, the electronic device turns off the screen and displays the first off-screen image.
With the display method provided in the first aspect, an off-screen image can be generated according to the user's travel information, such as travel mode, position, time, date, weather, and altitude, and displayed on the display screen while the electronic device is in the off-screen state. The user can therefore quickly obtain this travel information by viewing the off-screen image in the off-screen state, which increases the interest of the off-screen display and provides the user with a personalized, customized off-screen image.
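As a concrete illustration of the first-aspect flow, the following minimal Java sketch produces a travel-themed image description only when the screen-off time falls inside the travel window. The patent discloses no source code; every name, type, and the string-based "image" here are invented for illustration.

```java
import java.time.LocalDateTime;

// Illustrative travel modes, mirroring the modes listed later in the description.
enum TravelMode { NONE, PLANE, TRAIN, CAR, BUS, FERRY }

class OffScreenImageGenerator {
    /** Decide what to show when the screen turns off at firstTime. */
    String generate(LocalDateTime firstTime, TravelMode mode,
                    LocalDateTime travelStart, LocalDateTime travelEnd,
                    String weather, double altitudeMeters) {
        boolean inTravelWindow =
                !firstTime.isBefore(travelStart) && firstTime.isBefore(travelEnd);
        if (inTravelWindow) {
            // Travel-themed image: window frame by travel mode plus current conditions.
            return "frame=" + mode + " weather=" + weather
                 + " time=" + firstTime.toLocalTime()
                 + " altitude=" + altitudeMeters + "m";
        }
        return "clock/date only"; // outside the travel window
    }
}
```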
With reference to the first aspect, in a possible implementation, the electronic device displaying the first off-screen image specifically includes: the electronic device lights up a partial region of the display screen and displays the first off-screen image in the partial region.
By implementing the method provided in the first aspect, the electronic device can display the off-screen image by lighting up only a partial area of the display screen, reducing the power consumption of the electronic device.
With reference to the first aspect, in one possible implementation, the first off-screen image is composed of a first interface element and a second interface element located in different layers, and the electronic device generating the first off-screen image according to the one or more items of information and the first travel mode specifically includes: the electronic device obtains a first interface element indicating the one or more items of information and a second interface element indicating the first travel mode; and the electronic device combines the first interface element and the second interface element to obtain the first off-screen image.
That is, the interface elements the electronic device acquires according to the travel information lie in different layers, and the electronic device can obtain the off-screen image by combining the layers, which prevents the interface elements from affecting one another when the user's travel information changes. For example, in the off-screen state, when the weather changes, the electronic device can change only the interface element on the layer corresponding to the weather, which helps the electronic device update the off-screen image quickly.
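As a rough model of this layer independence (all names are assumptions, not the patent's implementation), each item of travel information can own one layer, so a weather change replaces a single layer without touching the others:

```java
import java.util.LinkedHashMap;
import java.util.Map;

class LayeredOffScreenImage {
    // Insertion order doubles as stacking order: background first, information last.
    private final Map<String, String> layers = new LinkedHashMap<>();

    void setLayer(String name, String uiElement) { layers.put(name, uiElement); }

    /** "Composing" here is just a bottom-to-top summary of the stacked layers. */
    String compose() { return layers.toString(); }

    public static void main(String[] args) {
        LayeredOffScreenImage img = new LayeredOffScreenImage();
        img.setLayer("background", "blue sky + sun");
        img.setLayer("middle", "clouds");
        img.setLayer("foreground", "airplane window frame");
        img.setLayer("information", "08:08  Aug 25  Fri");
        img.setLayer("middle", "raindrops"); // weather changed: only this layer is replaced
        System.out.println(img.compose());
    }
}
```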
With reference to the first aspect, in one possible implementation, the electronic device pre-stores a plurality of interface elements, where the plurality of interface elements includes the first interface element and the second interface element.
That is, the electronic device may include a database storing a plurality of interface elements (i.e., UI elements), and the electronic device may select a suitable interface element from the database according to the acquired travel information, such as weather, time, date, location, altitude, and travel mode.
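Such a pre-stored element database could be as simple as a key-to-asset map; the keys and asset names below are invented for illustration:

```java
import java.util.HashMap;
import java.util.Map;

class UiElementStore {
    private final Map<String, String> assets = new HashMap<>();

    UiElementStore() {
        assets.put("weather:cloudy", "cloud.png");
        assets.put("weather:light_rain", "raindrops.png");
        assets.put("time:day", "sun.png");
        assets.put("time:night", "moon.png");
        assets.put("mode:PLANE", "plane_window_frame.png");
        assets.put("mode:NONE", "blinds_frame.png");
    }

    /** Look up the stored interface element for one item of travel information. */
    String elementFor(String key) {
        return assets.getOrDefault(key, "default.png");
    }
}
```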
With reference to the first aspect, in a possible implementation, after the electronic device displays the first off-screen image, the method further includes: the electronic device receives a first operation on the first off-screen image and, in response to the first operation, updates the first off-screen image.
That is, the electronic device may detect a user operation on the off-screen image and update the off-screen image in response. Specifically, the electronic device can simulate the effect of actually opening and closing a window: when the user taps the off-screen image, the image corresponding to the closed window is displayed, and when the user taps the off-screen image again, the image corresponding to the open window is displayed. This strengthens the interaction between the user and the electronic device and increases the interest of the off-screen display.
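The open/close interaction amounts to a two-state toggle driven by taps; the following minimal sketch assumes state handling the patent does not specify:

```java
class WindowToggle {
    private boolean open = true; // fig. 3C: the window starts open

    /** Called for each tap on the picture display area of the off-screen image. */
    String onTap() {
        open = !open;
        return open
            ? "window open: draw scenery, sun/moon and weather elements"   // fig. 3E
            : "window closed: draw the closed shade, hide scenery layers"; // fig. 3D
    }
}
```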
With reference to the first aspect, in a possible implementation, the interface elements include a border of the first off-screen image, the border indicating whether the user travels, or the vehicle used by the user.
The outer contour of the off-screen image may represent the outer contour of a window, and that window may be related to the user's travel mode: for example, when the user is not traveling, the window may be a window with blinds, and when the user is traveling by airplane, the window may be an airplane window. The off-screen image can thus simulate the effect of the user viewing scenery through a window.
With reference to the first aspect, in one possible implementation, before the electronic device acquires the first travel mode, the method further includes: the electronic device acquires the user's journey information through a journey application; and the electronic device generates a user travel card according to the journey information, where the user travel card indicates the first travel mode. The electronic device acquiring the user's first travel mode then specifically includes: the electronic device acquires the user's first travel mode from the user travel card.
The travel card is a visual interface presented to the user to display journey information drawn from travel-class applications; the user can learn his or her itinerary at a glance from the travel card on the desktop, and the display method can obtain the user's travel mode directly from the travel card and quickly generate the off-screen image.
With reference to the first aspect, in a possible implementation, the method further includes: when the first time is not within the first travel time, in response to the screen-off instruction, the electronic device displays a second off-screen image, where the second off-screen image is an image generated by the electronic device according to a second travel mode, the second travel mode indicates whether the user travels during a second travel time, or the vehicle used by the user during the second travel time, and the second travel time is before the first travel time.
That is, when the travel time corresponding to the travel mode acquired by the electronic device does not match the current screen-off time, the electronic device may display the off-screen image generated according to the previous travel mode.
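A minimal sketch of this fallback, with invented names and a string standing in for each image:

```java
import java.time.LocalDateTime;

class OffScreenImageFallback {
    /** Outside the known travel window, fall back to the previous travel mode's image. */
    String pick(LocalDateTime now,
                LocalDateTime travelStart, LocalDateTime travelEnd,
                String imageForCurrentTravel, String imageForPreviousTravel) {
        boolean inWindow = !now.isBefore(travelStart) && now.isBefore(travelEnd);
        return inWindow ? imageForCurrentTravel : imageForPreviousTravel;
    }
}
```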
With reference to the first aspect, in one possible implementation, the electronic device acquiring one or more items of information among the first time, the weather at the first time, the position of the electronic device at the first time, or the altitude of the electronic device at the first time includes: the electronic device acquires the one or more items of information from data collected by hardware, from a first application, or from a device that has established a connection with the electronic device.
That is, the electronic device may acquire travel information such as weather, time, date, location, altitude, and travel mode through hardware, an application program, or another device.
With reference to the first aspect, in one possible implementation, the electronic device detecting that it enters the off-screen state at the first time specifically includes: the electronic device receives a second operation by the user on the power key at the first time and enters the off-screen state in response to the second operation; or, the electronic device receives no user operation within a preset time and enters the off-screen state at the first time.
That is, the electronic device may actively turn off the screen when no user operation is received within a certain time, or passively turn off the screen upon receiving a user operation on the power key.
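The two triggers can be sketched as follows; the timeout value and all names are assumptions:

```java
class ScreenOffController {
    static final long IDLE_TIMEOUT_MS = 30_000; // assumed "preset time"
    private long lastUserInputMs;

    /** Passive screen-off: the user presses the power key. */
    void onPowerKeyPressed() { enterOffScreenState("power key"); }

    /** Active screen-off: no user operation within the preset time. */
    void onTick(long nowMs) {
        if (nowMs - lastUserInputMs >= IDLE_TIMEOUT_MS) enterOffScreenState("idle timeout");
    }

    void onUserInput(long nowMs) { lastUserInputMs = nowMs; }

    private void enterOffScreenState(String reason) {
        System.out.println("screen off (" + reason + "): light only the AOD region");
    }
}
```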
With reference to the first aspect, in one possible implementation, before the electronic device detects that it enters the off-screen state at the first time, the method further includes: the electronic device displays a user interface provided by the Settings application; and the electronic device receives a third operation on an off-screen display option in the user interface, the third operation being used to trigger the electronic device to display the first off-screen image when the screen is off.
That is, the electronic device can enable the off-screen display function through the Settings application, and the function triggers the electronic device to display the off-screen image when the screen is off.
In a second aspect, the present application provides another display method, the method comprising: the electronic device receives a screen-off instruction within a first travel time, where the user's travel mode within the first travel time is a first travel mode, and the first travel mode indicates whether the user travels during the first travel time, or the vehicle used by the user during the first travel time; the electronic device turns off the screen and displays a first off-screen image, the first off-screen image indicating the first travel mode; the electronic device receives a screen-off instruction within a second travel time, where the user's travel mode within the second travel time is a second travel mode, and the second travel mode indicates whether the user travels during the second travel time, or the vehicle used by the user during the second travel time; and the electronic device turns off the screen and displays a second off-screen image, the second off-screen image indicating the second travel mode.
With the method provided in the second aspect, in the off-screen state the electronic device can display different off-screen images when the user is in different travel modes, the off-screen image indicating the user's current travel mode. The electronic device can thus provide the user with a dynamically changing off-screen image whose changes follow the user's travel, and the user can quickly learn his or her travel situation from the off-screen image.
In a third aspect, the present application provides an electronic device, comprising: a memory, one or more processors, a plurality of applications, and one or more programs; wherein the one or more programs are stored in the memory; wherein the one or more processors, when executing the one or more programs, cause the electronic device to perform the method as described in the first aspect or any implementation of the first aspect.
In a fourth aspect, the application provides a computer readable storage medium comprising instructions, characterized in that the instructions, when run on an electronic device, cause the electronic device to perform a method as described in the first aspect or any implementation of the first aspect.
In a fifth aspect, the application provides a computer program product which, when run on a computer, causes the computer to perform a method as described in the first aspect or any implementation of the first aspect.
By implementing the display method provided by the embodiments of the application, the electronic device can generate an off-screen image according to the user's travel information and display it while in the off-screen state, so that the user can quickly obtain travel information such as travel mode, position, time, date, weather, and altitude by viewing the off-screen image. This increases the interest of the off-screen display. Moreover, because the user's travel scene is simulated according to the user's actual travel, the image displayed while the screen is off is close to the user's life, which improves the practicality of the off-screen display.
Drawings
FIG. 1 is a user interface for a screen-off display of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic hardware structure of an electronic device according to an embodiment of the present application;
FIGS. 3A-3E illustrate some user interfaces provided by embodiments of the present application;
FIG. 4 is a diagram illustrating the layer stacking principle provided by an embodiment of the present application;
FIGS. 5A-5B illustrate further user interfaces provided by embodiments of the present application;
fig. 6 is a schematic software structure of an electronic device according to an embodiment of the present application;
fig. 7 is a flowchart of interaction between internal modules in a software structure of an electronic device according to an embodiment of the present application;
FIG. 8 is a schematic flow chart of a display method according to an embodiment of the present application;
FIG. 9 is a schematic flow chart of layer stacking according to an embodiment of the present application;
fig. 10 is a flow chart of another display method according to an embodiment of the application.
Detailed Description
The technical solutions of the embodiments of the present application will be described clearly and thoroughly below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, "A and/or B" may indicate the three cases where only A exists, both A and B exist, and only B exists. In addition, in the description of the embodiments of the present application, "plural" means two or more.
The terms "first," "second," and the like, are used below for descriptive purposes only and are not to be construed as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature, and in the description of embodiments of the application, unless otherwise indicated, the meaning of "a plurality" is two or more.
The term "User Interface (UI)" in the following embodiments of the present application is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. The user interface is a source code written in a specific computer language such as java, extensible markup language (extensible markup language, XML) and the like, and the interface source code is analyzed and rendered on the electronic equipment to finally be presented as content which can be identified by a user. A commonly used presentation form of the user interface is a graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be a visual interface element of text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, etc., displayed in a display of the electronic device.
To help a user quickly check important information such as the time and date while reducing the power consumption of the electronic device, the technique of off-screen display has been proposed.
Off-screen display (always on display, AOD) refers to lighting up only a partial area of the display screen of an electronic device, rather than the entire screen, and displaying important information such as the time, date, battery level, and notifications in that partial area. The user can thus quickly check this important information on the display screen while it is in the off-screen state.
FIG. 1 illustrates a user interface for the off-screen display of an electronic device. In the user interface shown in fig. 1, the dark area is the unlit area of the display screen, and the white areas are the lit areas. It can be seen that the electronic device can display the time, date, and other information in a partial area of the display screen while in the off-screen state. Thus, when the electronic device is in the lock-screen state, the user can view information such as the time and date without frequently unlocking the device.
However, as can be seen from fig. 1, this user interface displays only the time, date, and similar information; the displayed content is relatively monotonous and cannot meet users' growing demands.
It will be appreciated that the off-screen state may also be referred to as screen-off, screen-extinguished, and the like; embodiments of the present application are not limited in this regard.
The embodiments of the application provide a display method that can generate a dynamic or static off-screen image according to the user's travel information, such as travel mode, position, time, date, weather, and altitude; when the electronic device is in the off-screen state, the off-screen image is displayed in a partial area of the display screen of the electronic device. The user can therefore quickly obtain this travel information by viewing the off-screen image in the off-screen state, which increases the interest of the off-screen display.
The user's travel mode may include a non-travel mode, an airplane travel mode, a train travel mode, an automobile travel mode, a bus travel mode, a ferry travel mode, and the like, and may be determined from the user's location and/or the vehicle by which the user travels. The time may refer to the current time or period, or a historical time or period; the location may refer to the geographic location of the electronic device at that time; the weather may refer to the weather at that geographic location at that time; and the altitude may refer to the altitude of the electronic device at that location at that time. A user may refer to the person corresponding to an account logged in on the electronic device, and the user mentioned in the embodiments of the present application corresponds to the same account throughout.
The electronic device may determine travel information by one or more of:
1) Acquiring application information to determine the user's travel information. The electronic device may obtain the user's travel information from information generated by one or more applications, through interfaces provided by those applications. Where the travel information is the travel mode, the applications may be travel-class applications, map-class applications, the SMS application, and the like. For example, the electronic device determines when the user will be in the airplane travel mode by acquiring the airline ticket information the user booked in a travel-class application.
2) Determining the user's travel information from information collected by the electronic device's hardware, such as sensors and the positioning module. For example, when the electronic device determines from the positioning module that the user is at home, it may determine that the user's travel mode is the non-travel mode.
3) Determining the user's travel information from information actively entered by the user. Taking the travel mode as an example, the user can actively select a travel mode on the electronic device, or enter travel information into it. The user may enter the information each time the off-screen display is enabled, or re-enter it after a preset time has elapsed, so that the electronic device updates the off-screen image.
4) Acquiring information collected by other devices to determine the user's travel information. The electronic device can connect to other devices, such as a band or a smart watch, and obtain the information they collect in order to determine the user's travel information.
It can be appreciated that the ways in which the electronic device determines travel information are not limited to the four described above, and the embodiments of the present application place no limitation on them; a minimal sketch of aggregating such sources follows.
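In the sketch, the device queries the sources in a fixed order of trust; the provider interface is an assumption for illustration, not a real API:

```java
import java.util.Optional;

interface TravelInfoSource {
    Optional<String> travelMode(); // empty when this source knows nothing
}

class TravelInfoResolver {
    private final TravelInfoSource[] sources;

    // e.g. application data, device hardware, user input, then paired devices
    TravelInfoResolver(TravelInfoSource... sources) { this.sources = sources; }

    /** The first source that knows the travel mode wins; order encodes trust. */
    String resolve() {
        for (TravelInfoSource s : sources) {
            Optional<String> mode = s.travelMode();
            if (mode.isPresent()) return mode.get();
        }
        return "NONE"; // default to the non-travel mode
    }
}
```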
The electronic device may enter the off-screen state upon receiving the user's click operation on the power key, or when no user operation is received within a preset time. In the off-screen state only some pixels of the electronic device are lit, yet through the display screen the user can view the off-screen image generated by the electronic device according to the travel information, helping the user quickly obtain travel information such as travel mode, position, time, date, weather, and altitude.
Fig. 2 shows a schematic hardware configuration of the electronic device 100.
The electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular telephone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device; the embodiments of the application place no particular limitation on the specific type of the electronic device.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
In some embodiments, the processor 110 obtains travel information of the user such as travel mode, location, time, date, weather, altitude, etc., and generates a dynamic or static off-screen image according to the travel information. The description of acquiring the travel information and generating the screen-off image may be referred to later, and will not be repeated here.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data.
The charge management module 140 is configured to receive a charge input from a charger.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, demodulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the mobile communication module 150 and the wireless communication module 160 may be used to establish a communication connection with other devices and determine travel information of the user from information sent by the other devices. The other devices may be devices such as a bracelet, an intelligent watch, a mobile phone, and the like.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like.
In some embodiments, the display 194 may be used to display a user interface that includes an off-screen image, and the user interface displayed on the display 194 may be referred to in the following embodiments, and is not described herein.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through the reverse motion, so as to realize anti-shake. The gyro sensor 180B may also be used for navigating, somatosensory game scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
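For reference, the conversion from a pressure reading to altitude is the standard international barometric formula (Android exposes the same computation as SensorManager.getAltitude()); the sketch below uses the ISA sea-level pressure of 1013.25 hPa:

```java
class BarometricAltitude {
    static final double SEA_LEVEL_HPA = 1013.25; // ISA standard sea-level pressure

    /** Altitude in meters from a pressure reading in hPa. */
    static double altitudeMeters(double pressureHpa) {
        return 44330.0 * (1.0 - Math.pow(pressureHpa / SEA_LEVEL_HPA, 1.0 / 5.255));
    }

    public static void main(String[] args) {
        System.out.println(altitudeMeters(899.0)); // roughly 1000 m
    }
}
```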
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device 100 in various directions (typically along three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It can also be used to recognize the attitude of the electronic device and serves applications such as landscape/portrait switching and pedometers.
In some embodiments, the gyro sensor 180B and the acceleration sensor 180E may be used to detect whether the electronic device 100 is stationary. Specifically, when the electronic device 100 is in the off-screen state and has remained stationary for a preset time, the electronic device 100 may stop displaying the off-screen image. Alternatively, while the electronic device 100 is displaying the off-screen image, if the gyro sensor 180B or the acceleration sensor 180E detects that the electronic device 100 has left the stationary state, the displayed off-screen image may be updated to the lock-screen interface to prompt the user to unlock.
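A small sketch of that stationary check, fed by gyroscope and accelerometer magnitudes; every threshold here is an assumption:

```java
class StationaryDetector {
    static final double GYRO_EPS = 0.02;    // rad/s, assumed noise floor
    static final double ACCEL_EPS = 0.05;   // m/s^2 deviation from gravity, assumed
    static final long STILL_TIMEOUT_MS = 10 * 60_000; // assumed preset time

    private long stillSinceMs = -1;

    /** Returns true when the off-screen image should be hidden (long stationary). */
    boolean update(double gyroMagnitude, double accelDeviation, long nowMs) {
        boolean still = gyroMagnitude < GYRO_EPS && accelDeviation < ACCEL_EPS;
        if (!still) { stillSinceMs = -1; return false; }
        if (stillSinceMs < 0) stillSinceMs = nowMs;
        return nowMs - stillSinceMs >= STILL_TIMEOUT_MS;
    }
}
```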
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100. The electronic device 100 can detect that the user holds the electronic device 100 close to the ear by using the proximity light sensor 180G, so as to automatically extinguish the screen for the purpose of saving power. The proximity light sensor 180G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
In some embodiments, when the electronic device 100 is in the off-screen state, if the light level detected at the display screen 194 is less than a threshold, the electronic device 100 may refrain from displaying the off-screen image, reducing the power consumption of the electronic device 100. This is because a light level below the threshold, as detected by the proximity light sensor 180G or the ambient light sensor 180L, indicates that the display of the electronic device 100 may be covered by an object, in which case there may be no need to display the off-screen image.
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194.
In some embodiments, when the touch sensor 180K does not detect a touch operation of the user for a preset time, the electronic device 100 may generate an instruction for turning off the screen, and in response to the instruction, the electronic device 100 enters the off-screen state.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
In some embodiments, the key 190 may be used to trigger a screen off. Specifically, when the electronic apparatus 100 detects a user operation (e.g., a click operation) acting on the key 190, in response to the operation, the electronic apparatus 100 generates an instruction for turning off the screen, in response to which the electronic apparatus 100 enters the off-screen state.
The user interfaces according to embodiments of the present application are exemplarily described below in connection with fig. 3A-3E.
Fig. 3A illustrates the default user interface 11 that the Settings application provides after the electronic device 100 opens it. The user interface 11 may include options such as flight mode, WiFi, Bluetooth, hotspot, and mobile network. The electronic device 100 may detect the user's touch operations on the different function options in the user interface 11 and, in response, turn functions such as flight mode, WiFi, Bluetooth, hotspot, and mobile network on or off. The user interface 11 includes a "desktop, lock screen and off-screen" option 111, which may be used to set the desktop, lock screen, and off-screen display of the electronic device 100.
As shown in fig. 3A, the electronic device 100 may detect a user operation on the "desktop, lock screen and off-screen" option 111, and in response the electronic device 100 displays the user interface 21 shown in fig. 3B, which is the interface for setting the desktop, lock screen, and off-screen display of the electronic device 100.
As shown in fig. 3B, the user interface 21 includes an "off-screen display" option 211, a display mode option 212, and an off-screen style selection area 213. The "off-screen display" option 211 is used to turn the off-screen display function of the electronic device 100 on or off. The off-screen display function lights up a partial area of the display screen and, while the electronic device 100 is off-screen, displays important information such as the time, date, battery level, and notifications in that partial area. In addition, in the embodiments of the application, the off-screen display function can generate a dynamic or static off-screen image according to the user's travel mode, time, weather, position, altitude, and the like, and display that image in the partial area. When the electronic device 100 detects a user operation (e.g., a click operation) on the "off-screen display" option 211, the electronic device 100 turns the off-screen display function on or off in response. Illustratively, the off-screen display function is in the off state when the main color displayed by the switch 211A is white, and in the on state when the main color displayed by the switch 211A is gray; by default, the off-screen display function is initially off. The display mode option 212 may be used to set the duration of the off-screen display; for example, when the display mode of the off-screen display is set to all-day display, the electronic device 100 always displays the off-screen image while off-screen. The off-screen style selection area 213 is used to display one or more off-screen styles provided by the electronic device 100 and includes a first off-screen style option 213A corresponding to one off-screen style. When the electronic device 100 detects a user operation on the first off-screen style option 213A, the electronic device 100 selects the corresponding off-screen style as the style of its off-screen display; the display method provided by the embodiments of the application corresponds to the off-screen style indicated by the first off-screen style option 213A. In that case, when the electronic device 100 enters the off-screen state, a partial area of the display screen of the electronic device 100 may display the off-screen image generated by the electronic device 100 according to the user's travel information, such as travel mode, time, date, weather, location, and altitude.
Optionally, when the electronic device 100 detects an enabling operation on the "off-screen display" option 211, the electronic device 100 may by default select the off-screen style corresponding to the display method provided by the embodiments of the application. In other words, when the electronic device 100 turns on the off-screen display function, the electronic device 100 by default generates a dynamic or static off-screen image according to the user's travel mode, time, weather, position, altitude, and the like, and displays that image when the screen is off.
Optionally, an "off screen display" option may be included in the user interface 11, and the electronic device 100 may detect a user operation on the "off screen display" option, turning on or off the off screen display function. Alternatively, the electronic device 100 may provide icons for turning on or off the off-screen display function in a drop-down menu. The embodiment of the application does not limit the mode of opening the display function of the screen quenching.
Fig. 3C-3E schematically illustrate a user interface displayed when the electronic device 100 is off the screen after the electronic device 100 is in the airplane travel mode and the off-screen display function is turned on.
The electronic device 100 may determine that the trip mode of the user is an airplane trip mode through information of an application, information collected by hardware, information actively input by the user or information of other devices, and the like.
Illustratively, the electronic device 100 may determine when the electronic device 100 is in the airplane travel mode by the ticket information purchased by the user on the trip-class application.
For example, the electronic device 100 may enter the off-screen state after detecting a user operation (e.g., a click operation) on a power key.
As shown in fig. 3C, the user interface 31 includes a picture display area 311 and an information display area 312. The picture display area 311 displays a picture generated by the electronic device 100 according to information such as the user's travel mode, the time, and the weather; the information display area 312 displays information such as the time, date, battery level, and notifications; together, the picture and the information form the off-screen image generated by the electronic device 100. As can be seen from the picture display area 311 shown in fig. 3C, the area simulates the scene of a user on an airplane looking out through the airplane window, increasing the interest of the off-screen display. The outline of the picture display area 311 is the outline of an airplane window, and UI elements such as scenery, the sun, and clouds are displayed in the picture display area 311. The scenery may be a UI element the electronic device 100 selects according to the viewing angle from which scenery is observed from an airplane in the airplane travel mode, and the sun may be a time element the electronic device 100 selects when it determines from the current time that it is daytime. Similarly, when the electronic device 100 determines from the current time that it is nighttime, the sun may be replaced by the moon. The cloud may be a weather element the electronic device 100 selects when it determines that the weather at its geographic location is cloudy; similarly, when the electronic device 100 determines that the weather at its geographic location is light rain, the cloud may be replaced by raindrops.
It should be noted that the above UI elements, such as the scenery, sun, and clouds, may be located in different layers; after the electronic device 100 determines the elements in the different layers according to the travel mode, time, weather, and the like, the UI elements can finally be displayed on the display screen of the electronic device 100 by combining the layers. The description of the layers is given later and is not repeated here.
In addition, the picture displayed in the picture display area 311 may be generated by the electronic device 100 from travel information acquired at the current time or period, or from travel information acquired at a historical time or period. The reason is that a user's travel information generally does not change in real time or rapidly over time, and the electronic device 100 can acquire the travel information before the screen turns off; this avoids the delay in displaying the off-screen image that would occur if the electronic device 100 acquired the travel information and generated the image only after the screen had turned off. The acquisition of travel information may also fail; for example, weather cannot be obtained in real time after the user turns off the network on an airplane. In that case, the electronic device 100 may generate the off-screen image from the most recently acquired travel information, or continue to display the most recently generated off-screen image.
In the user interface shown in fig. 3C, the electronic device 100 may receive the user's touch operation on the picture display area 311, and in response to the operation, the electronic device 100 displays the user interface 31 shown in fig. 3D. At this time, the picture displayed in the picture display area 311 is updated from the content shown in fig. 3C to the content shown in fig. 3D.
As shown in fig. 3D, the picture displayed in the picture display area 311 simulates the image a user can see when the airplane window is closed: no scenery, sun, cloud, or similar elements are displayed in the picture display area 311. When the electronic device 100 detects the user's operation on the picture display area 311 again, the displayed picture is updated to the picture shown in the picture display area 311 in fig. 3E.
As can be seen from fig. 3C to 3E, the image displayed in the picture display area 311 changes with the user's click operations on the picture display area 311. A user operation on the display screen can thus simulate the user opening and closing the airplane window: elements such as scenery, the sun, and clouds are displayed when the window is open and hidden when the window is closed, which strengthens the interaction between the user and the electronic device 100 under the off-screen display function.
Alternatively, after displaying the user interface shown in fig. 3C, when the elapsed time reaches a preset time, the electronic device 100 may automatically update the content displayed in the picture display area 311 to the content shown in the picture display area 311 in fig. 3D. Or, the electronic device 100 may by default display the user interface shown in fig. 3D after entering the off-screen state, and display the user interface shown in fig. 3C after receiving a user operation on the picture display area 311.
As shown in fig. 3E, the picture displayed in the picture display area 311 again contains UI elements such as scenery, the sun, and clouds; unlike in fig. 3C, however, the scenery it contains may be different. That is, the electronic device 100 may continuously update the off-screen picture; for example, when the electronic device 100 receives a user operation on the picture display area 311, it updates the scenery in the off-screen picture. This increases the interest of the off-screen display and enriches its content.
Alternatively, the electronic device 100 may update the off-screen picture after it has been displayed for a preset time. In that case, the electronic device 100 may directly display the user interface shown in fig. 3E after displaying fig. 3C for the preset time.
As can be seen from fig. 3C to fig. 3E, when the electronic device 100 is in the off-screen state, it may display an off-screen image that simulates the user's travel scene, providing a personalized off-screen image for the user. In addition, the electronic device 100 may detect a user operation in the off-screen state and update the off-screen image in response, making the off-screen display more engaging and interactive.
Because the picture displayed by the electronic device 100 in the off-screen state changes with information such as the user's travel mode, the time and the weather, the off-screen image may be composed of multiple superimposed layers of UI elements, so that the UI elements in one layer are not affected by changes to the UI elements in other layers. For example, the off-screen image may contain four layers: an information layer, a foreground layer, an intermediate layer and a background layer. Different layers contain UI elements corresponding to different travel information. Fig. 4 illustrates the layer stacking principle of the off-screen image shown in fig. 3C.
As shown in fig. 4, the UI elements included in the information layer, the foreground layer, the intermediate layer and the background layer are shown in fig. 4 (a), (b), (c) and (d), respectively. The information layer may include text information related to the time, date, battery level, notifications and the like; for example, the information layer may display the time "08:08" and the date "August 25, Friday" together with battery and notification indicators, so that the user can intuitively obtain such information from the text. The foreground layer may contain the outer contour of the displayed picture, which may be a window contour related to the travel mode: for example, the contour of a shutter when the travel mode is the non-travel mode, an aircraft window contour in the airplane travel mode, and a train window contour in the train travel mode. The intermediate layer may contain weather-related UI elements; for example, in case of rain the layer contains a raindrop pattern, and in case of cloudiness it contains a cloud pattern. The background layer may contain time-related UI elements, which may be UI elements such as blue sky, starry sky, sun, moon, rainbow or meteor: in the daytime the layer may contain UI elements such as blue sky, sun and rainbow, and at night it may contain UI elements such as starry sky, moon and meteor. In addition, the scenery in the background layer can also be a UI element selected according to the travel information; in that case, owing to differences in travel mode, time, altitude or position, the content of the scenery or its viewing angle may differ. For example, when the aircraft has just taken off, the scenery observable through the window may be a runway; during high-altitude flight it may be a cloud layer; and during low-altitude flight it may be a top view of mountains and rivers.
In addition, the image shown in fig. 4 (e), that is, the image displayed in the user interface 31 shown in fig. 3C, can be obtained by superimposing the information layer, the foreground layer, the intermediate layer and the background layer on top of one another and merging them.
It will be appreciated that the off-screen images may be obtained from more or fewer layers, not limited to the four layers mentioned above, as embodiments of the present application are not limited in this respect.
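As a concrete illustration of the four-layer model described above, the following is a minimal plain-Java sketch; the class, enum and field names are illustrative assumptions, not the actual implementation.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical model of the four-layer off-screen image described above.
public class OffScreenImageModel {
    enum LayerType { BACKGROUND, MIDDLE, FOREGROUND, INFORMATION }

    static class Layer {
        final LayerType type;
        final String uiElements; // e.g. "starry sky, moon" for a night-time background
        Layer(LayerType type, String uiElements) {
            this.type = type;
            this.uiElements = uiElements;
        }
    }

    public static void main(String[] args) {
        // Bottom-to-top stacking order: background, middle, foreground, information.
        List<Layer> stack = Arrays.asList(
                new Layer(LayerType.BACKGROUND, "blue sky, sun, mountain scenery"),  // time-related
                new Layer(LayerType.MIDDLE, "clouds"),                               // weather-related
                new Layer(LayerType.FOREGROUND, "aircraft window contour"),          // travel-mode-related
                new Layer(LayerType.INFORMATION, "08:08 | Aug 25 | battery, notifications")); // text info
        // Superimposing the layers in this order yields the final off-screen image.
        for (Layer l : stack) {
            System.out.println(l.type + ": " + l.uiElements);
        }
    }
}
```

Because each kind of travel information maps to its own layer, replacing one layer's elements leaves the others untouched, which is the property the preceding paragraphs rely on.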
In some embodiments, the off-screen image displayed by the electronic device 100 may also contain UI elements related to the vehicle on which the user is traveling. For example, as shown in fig. 5A, in the user interface 31 displayed by the electronic device 100, the picture display region 311 may contain an airplane element 311A. The reason is that when the electronic device 100 determines that the user's current travel mode is the airplane travel mode, it can add a wing element to the off-screen image to enrich its content, so that the image more closely matches what the user observes when browsing the scenery through an airplane window in an actual airplane travel scene. Similarly, when the travel mode is the train travel mode, a train body or the like may be contained in the off-screen image.
In some embodiments, the user's travel progress may also be indicated in the off-screen image displayed by the electronic device 100. For example, as shown in fig. 5B, the user interface 31 displayed by the electronic device 100 may further include a travel progress bar 313 that indicates the progress of the user's trip. Specifically, if the electronic device 100 determines that the user is in the airplane travel mode from 8:00 to 11:00, then when the electronic device 100 detects the user's screen-off operation at 9:00, it calculates that one third of the airplane trip has elapsed, and the travel progress bar 313 may indicate that the user has spent one third of the riding time. In this way, the electronic device 100 can inform the user of the travel time while displaying the travel-related off-screen image, helping the user quickly understand his or her travel situation.
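The one-third figure in this example is simple arithmetic over the travel window; a minimal sketch of such a progress computation, with all names assumed, might look as follows.

```java
import java.time.Duration;
import java.time.LocalTime;

// Hypothetical computation behind a travel progress bar such as 313.
public class TravelProgress {
    // Fraction of the trip already elapsed at time 'now', clamped to [0, 1].
    static double fractionElapsed(LocalTime start, LocalTime end, LocalTime now) {
        long total = Duration.between(start, end).toMinutes();
        long done  = Duration.between(start, now).toMinutes();
        return Math.max(0.0, Math.min(1.0, (double) done / total));
    }

    public static void main(String[] args) {
        // Flight from 8:00 to 11:00; screen-off detected at 9:00 -> one third elapsed.
        System.out.println(fractionElapsed(
                LocalTime.of(8, 0), LocalTime.of(11, 0), LocalTime.of(9, 0))); // 0.333...
    }
}
```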
The electronic device may be a portable terminal device, such as a mobile phone, tablet computer or wearable device running iOS, Android, Microsoft or another operating system, or a non-portable terminal device, such as a laptop computer (Laptop) or desktop computer having a touch-sensitive surface or touch panel. The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture or a cloud architecture. The embodiment of the application takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
Fig. 6 is a software configuration block diagram of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: the application layer, the application framework layer, Android Runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in FIG. 6, the application package may include applications such as calendar applications, navigation applications, weather applications and travel applications, as well as an AOD module. The calendar application provides time and date information for the electronic device 100, the navigation application provides position information of the electronic device 100, the weather application provides weather information, and the travel application provides the user's trip information, which may include the vehicle the user travels on and the time of travel.
The AOD module may be part or all of a module included in a system application or a third-party application; for example, the AOD module may be the part of a settings application that handles the off-screen display. The AOD module comprises an off-screen display module, an information acquisition module, a user database, an image synthesis module and an interface display module.
The off-screen display module is configured to initialize the off-screen display. Initialization includes setting parameters related to the off-screen display, determining whether the conditions for invoking the off-screen display function are met, and so on. The conditions for invoking the off-screen display function include, for example, that the battery level of the electronic device 100 is greater than a threshold and that the electronic device 100 is in a stationary state; when the battery level of the electronic device 100 is less than the threshold, the off-screen display module determines that the conditions are not currently satisfied, and no off-screen image is displayed when the electronic device 100 enters the off-screen state. It will be appreciated that the embodiments of the present application do not limit the specific content of initializing the off-screen display. In a specific example, the off-screen display module may be embodied as a DozeService.
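A minimal sketch of the initialization check described above, assuming illustrative thresholds and method names (the actual DozeService logic is not specified here):

```java
// Hypothetical condition check performed when initializing the off-screen display.
public class OffScreenInitCheck {
    static final int BATTERY_THRESHOLD_PERCENT = 20; // assumed threshold, not from the text

    // Conditions named in the text: battery above a threshold and the device at rest.
    static boolean shouldShowOffScreenImage(int batteryPercent, boolean isStationary) {
        return batteryPercent > BATTERY_THRESHOLD_PERCENT && isStationary;
    }

    public static void main(String[] args) {
        System.out.println(shouldShowOffScreenImage(15, true)); // false: battery too low, no image
        System.out.println(shouldShowOffScreenImage(80, true)); // true: off-screen image displayed
    }
}
```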
The information acquisition module is used to acquire the travel information required by the off-screen display and to acquire UI elements from the user database according to that information. The travel information includes the user's travel mode, time, location, weather, and the like. The information acquisition module includes a time management module, a position management module, a weather management module and a travel management module. The time management module acquires time information, the position management module acquires position information, the weather management module acquires weather information, and the travel management module acquires or stores the user's trip information and determines the user's travel mode from it. In the embodiment of the present application, the information acquisition module may obtain travel information in ways including, but not limited to: 1) acquiring travel information from the information of an application; 2) acquiring travel information from information collected by the hardware of the electronic device; 3) acquiring travel information from information actively input by the user; 4) acquiring the user's travel information through another device. Fig. 6 illustrates the first way. Specifically, the time management module can acquire time information through a calendar application, the position management module can acquire position information through a navigation application, the weather management module can acquire weather information through a weather application, and the travel management module can determine the user's travel mode for the current or a specific time period from the trip information in a travel application. The user's travel mode may include a non-travel mode, an airplane travel mode, a train travel mode, an automobile travel mode, a bus travel mode and the like, and may be determined from the vehicle on which the user travels in the trip information. For example, when the trip information indicates that the vehicle on which the user travels is an airplane, the travel management module may determine that the user's travel mode is the airplane travel mode. The description of how the information acquisition module obtains travel information may refer to the related content on how the electronic device obtains travel information, which is not repeated here. In a specific example, the time management module may be embodied as TimeUtils, the position management module as PosUtils, and the weather management module as WeatherUtils.
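As an illustration of how the travel management module might map the vehicle in the trip information to a travel mode, here is a minimal sketch; the enum values and vehicle strings are assumptions.

```java
// Hypothetical vehicle-to-travel-mode mapping used by the travel management module.
public class TravelModeResolver {
    enum TravelMode { NOT_TRAVELING, AIRPLANE, TRAIN, CAR, BUS }

    static TravelMode fromVehicle(String vehicle) {
        if (vehicle == null) return TravelMode.NOT_TRAVELING; // no trip information available
        switch (vehicle) {
            case "airplane": return TravelMode.AIRPLANE;
            case "train":    return TravelMode.TRAIN;
            case "car":      return TravelMode.CAR;
            case "bus":      return TravelMode.BUS;
            default:         return TravelMode.NOT_TRAVELING;
        }
    }

    public static void main(String[] args) {
        // A ticket naming an airplane as the vehicle yields the airplane travel mode.
        System.out.println(fromVehicle("airplane")); // AIRPLANE
    }
}
```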
The user database stores UI elements corresponding to the user's travel information. A UI element is a visual interface element that can be displayed in a user interface. UI elements may be divided into weather elements, time elements, location elements, window elements, and the like. The weather elements represent the weather indicated by the weather information; the user database contains corresponding weather elements for different weather conditions. For example, when the weather is cloudy the weather elements may include clouds, and when it is rainy they may include UI elements such as raindrops and clouds. The time elements represent the time indicated by the time information, and the user database contains corresponding time elements for different times; for example, for daytime the time elements may include UI elements such as the sun, blue sky, white clouds and a rainbow, and for nighttime they may include UI elements such as the moon, stars and meteors. The location elements represent the location indicated by the location information and may comprise a building or scene associated with that location. The window elements represent whether the user is traveling and the vehicle used: for example, in the non-travel mode the window element may include a shutter, in the airplane travel mode an airplane window, and in the train travel mode a train window. The description of the UI elements stored in the user database may be referred to later and is not repeated here. In a specific example, the user database may be embodied as a UserSettingProvider.
The image synthesis module is used to synthesize the off-screen image from the UI elements acquired by the off-screen display module. These UI elements are located on different layers, and the image synthesis module can merge them into the off-screen image by merging the layers. In a specific example, the image synthesis module may be embodied as a LayerComposer.
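A minimal sketch of such layer merging, assuming an Android environment; this is not the actual LayerComposer implementation, and the helper name and composition order are assumptions.

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;

// Hypothetical layer merge in the spirit of the image synthesis module:
// draw the layer bitmaps bottom-to-top onto one output bitmap.
public class LayerMerge {
    static Bitmap compose(int width, int height, Bitmap... layersBottomToTop) {
        Bitmap out = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        Canvas canvas = new Canvas(out);
        for (Bitmap layer : layersBottomToTop) {
            if (layer != null) {               // a missing layer (e.g. no weather data) is skipped
                canvas.drawBitmap(layer, 0f, 0f, null);
            }
        }
        return out; // the merged off-screen image
    }
}
```

Here the background layer would be passed first and the information layer last, matching the stacking order of fig. 4.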
The interface display module is used for displaying the screen-off image synthesized by the image synthesis module in the user interface when the electronic device 100 enters the screen-off state. In one specific example, the interface display module may be embodied as a DozeUi.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 6, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs and handling window-related functions. The window manager can acquire the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and so on. In the embodiment of the application, the window manager can receive the screen-off instruction and call the screen-off display module to start the off-screen display. In a specific example, the window manager may include the following three modules: PhoneWindowManager, DreamManagerService and DreamController.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, such as management of the call status (connected, hung up, and so on).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar; it can be used to convey notification-type messages, which disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications of applications running in the background, or notifications that appear on the screen as a dialog window. For example, text information is prompted in the status bar, an alert sound is emitted, the electronic device vibrates, or an indicator light blinks.
Android Runtime includes the core library and virtual machines. Android Runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver and a sensor driver.
The following describes in detail the interaction flow between the internal modules in the process of the electronic device 100 turning off the screen for display in combination with the software of the electronic device 100 shown in fig. 6.
The off-screen display process of the electronic device 100 mainly involves the following modules in the software structure: the window manager, the screen-off display module, the travel management module, the time management module, the position management module, the weather management module, the user database, the image synthesis module and the interface display module.
Fig. 7 shows an interaction flow between internal modules during an off-screen display of the electronic device 100.
As shown in fig. 7, the process may include the steps of:
S101, the window manager receives a screen-off instruction.
The screen-off instruction may be an instruction for turning off the screen generated by the electronic device 100 in response to receiving the user's click on the power key, and may be used to trigger the electronic device 100 to enter the off-screen state. Alternatively, the screen-off instruction may be generated by the electronic device 100 when no user operation is received within a preset time. It can be understood that the screen-off instruction may also be generated when the electronic device 100 detects that the brightness of the display screen is less than a threshold; the embodiment of the present application does not limit the screen-off instruction.
S102, the window manager calls the screen-off display module to start the off-screen display.
In response to the screen-off instruction, the window manager calls the screen-off display module to start the off-screen display. Specifically, in this process, PhoneWindowManager calls DreamManagerService, and DreamManagerService in turn calls DreamController to create an off-screen event.
S103, the screen-off display module initializes the off-screen display.
Initializing the off-screen display includes setting parameters related to the off-screen display, determining whether the conditions for invoking the off-screen display function are met, and so on; the earlier description of initializing the off-screen display applies here and is not repeated.
S104, the screen-off display module calls the time management module to acquire time information.
The time information may be the current specific time, e.g. 8:08, or may indicate whether it is currently day or night. It can be understood that the time information may also refer to morning, afternoon, evening, early morning and so on; the embodiment of the present application does not limit the specific content of the time information.
S105, the screen-off display module calls the position management module to acquire position information.
The location information may be embodied as latitude and longitude coordinates, a geographic range, a place name and so on. The position management module may call GPS to obtain the current position information, obtain it through a navigation application, or predict the user's current position by combining the user's trip information with the time. For example, if the trip information indicates that the user travels from Chongqing to Beijing between 8:00 and 11:00 and the current time is 9:00, the position management module may predict that the user is currently located near Xi'an.
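The Xi'an example amounts to interpolating along the route by the elapsed fraction of the trip; a minimal sketch under that straight-line assumption (coordinates approximate):

```java
// Hypothetical position prediction from trip information and elapsed time.
public class PositionPredictor {
    static double[] predict(double[] from, double[] to, double fractionElapsed) {
        return new double[]{
                from[0] + (to[0] - from[0]) * fractionElapsed, // latitude
                from[1] + (to[1] - from[1]) * fractionElapsed  // longitude
        };
    }

    public static void main(String[] args) {
        double[] chongqing = {29.56, 106.55};
        double[] beijing   = {39.90, 116.40};
        // At 9:00, one third of the 8:00-11:00 trip has elapsed.
        double[] p = predict(chongqing, beijing, 1.0 / 3.0);
        System.out.printf("predicted position: %.2f, %.2f%n", p[0], p[1]);
    }
}
```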
S106, the screen-off display module calls the weather management module to acquire weather information.
Weather information may be used to indicate the weather, which may include: sunny, rainy, snowy, cloudy, foggy, sand dust and so on. The weather management module can acquire weather information through a weather application.
S107, the screen-off display module calls the travel management module to acquire a travel mode of the user.
The travel management module can acquire the user's trip information through a travel application or through information input by the user. For example, when the user purchases an airline ticket through a travel application, the travel management module may obtain the information in the ticket and determine the user's trip information, specifically including the time of travel and the vehicle used. When step S107 is executed at a time that falls within the travel time acquired by the travel management module, the screen-off display module determines that the user's travel mode is the airplane travel mode.
It will be appreciated that the execution order of steps S104 to S107 is not limited; they may be executed sequentially or simultaneously. In addition, among the time information, position information, weather information and travel mode indicated in steps S104 to S107, the screen-off display module may execute more or fewer steps to acquire more or less travel-related information. For example, the screen-off display module may skip step S105, in which case the off-screen image finally displayed by the electronic device 100 will not change with the geographic position of the electronic device 100.
S108, the screen-off display module acquires the UI element from the user database according to the time information, the position information, the weather information and the travel mode.
The user database stores a plurality of UI elements corresponding respectively to time, position, weather and travel mode. From these UI elements, the screen-off display module can select the time elements matching the acquired time information, the position elements matching the acquired position information, the weather elements matching the acquired weather information, and the window elements matching the acquired travel mode.
S109, the screen-off display module sends the UI element to the image synthesis module.
S110, the image synthesis module synthesizes the screen-off image according to the UI element.
The UI elements sent by the screen-off display module are located in different layers, and the image synthesis module can merge them into the screen-off image through a layer merging method.
S111, the image synthesis module calls the interface display module to display the screen-off image.
Specifically, the interface display module may display the screen-off image by lighting up a partial region of the display screen and using that region, while the other areas of the display screen remain unlit. That is, after receiving the screen-off instruction, the electronic device 100 may acquire travel information, generate a screen-off image from it, and display the screen-off image while in the off-screen state. In this way, when the electronic device 100 is in the off-screen state, the user can quickly learn about his or her travel situation through the screen-off image displayed on the display screen, and obtains a screen-off image that changes dynamically with travel information such as the time, position, weather and travel mode, which makes the off-screen display more engaging.
The following describes a flow of a display method according to an embodiment of the present application with reference to fig. 8.
As shown in fig. 8, the method includes:
S201, the electronic device 100 acquires travel information.
The travel information refers to information related to travel of the user, including: travel mode, time, date, weather, location, power, notification, etc. of the user.
The user's travel mode describes whether the user is traveling and the vehicle the user is using. The travel mode may include a non-travel mode, an airplane travel mode, a train travel mode, an automobile travel mode, a bus travel mode, and the like.
It is to be understood that the travel mode of the user is not limited to the above mentioned modes, and in the embodiment of the present application, the travel mode of the user may also include other classifications, for example, a taxi travel mode, a private car travel mode, and so on, which is not limited in the embodiment of the present application.
The electronic device 100 may obtain travel information according to one or more of the following:
1) Acquiring travel information by acquiring information of an application
The application may be an application in the electronic device 100. For example, when the application is a travel application, the electronic device 100 may determine the user's travel mode by acquiring the trip information in it; likewise, the electronic device 100 may acquire the time and date through a calendar application, the location through a navigation application, and the weather through a weather application.
2) Acquiring travel information from information collected by hardware of electronic device 100
The hardware may refer to sensors, positioning modules and the like. For example, when the electronic device determines from the positioning module that the user's location remains at home, it may determine that the user's travel mode is the non-travel mode.
3) Determining the user's travel information from information actively input by the user. Taking the travel mode as an example, the user may actively select a travel mode in the electronic device 100, or input travel information into the electronic device 100.
4) Determining the user's travel information from information collected by other devices. The electronic device 100 may determine the user's travel information by connecting to other devices, such as a band or a smart watch, and acquiring the information they collect.
In a specific implementation, the screen-off display module in the electronic device 100 may call the location management module to obtain location information, call the weather management module to obtain weather information, and call the trip management module to obtain a trip mode of the user.
It can be understood that the manner in which the electronic device determines travel information is not limited to the above four. For example, the electronic device 100 may calculate travel information from partial data; for instance, the electronic device 100 may predict its location during the trip from the travel progress and the travel route. The embodiment of the application does not limit the way travel information is acquired.
The trigger timing of the electronic device 100 to acquire the trip information may include the following cases:
1) The electronic device 100 receives the screen-off instruction
The screen-off instruction may be an instruction generated by the electronic device 100 in response to receiving the user's click on the power key, or an instruction for turning off the screen generated when the electronic device 100 receives no user operation within a preset time. In a specific implementation, the window manager in the electronic device 100 may receive the screen-off instruction and trigger the electronic device 100 to turn off the screen accordingly. The embodiment of the application does not limit the screen-off instruction. In this way, the electronic device 100 is triggered to acquire the user's travel information only when the screen is turned off, which prevents the electronic device 100 from repeatedly acquiring travel information at unnecessary times and reduces the waste of system resources.
2) Electronic device 100 receives information actively entered by a user
The electronic device 100 may provide an entry for the user to input information; after receiving information actively input by the user, the electronic device 100 may be triggered to acquire travel information from that input. That is, the electronic device 100 may determine when to acquire the user's travel information according to the user's operations.
3) The electronic device 100 periodically acquires travel information
In this way, the electronic device 100 does not need to acquire travel information at the moment the screen turns off; it can acquire the information beforehand and generate the screen-off image in advance accordingly. This speeds up the display of the screen-off image after the screen of the electronic device 100 turns off and prevents the display from being delayed by the image synthesis process after the screen-off instruction is received (a sketch of this periodic approach is given below).
It can be understood that the electronic device 100 may also be triggered to acquire travel information once the off-screen display function is turned on; the embodiment of the present application does not limit the trigger timing with which the electronic device 100 acquires travel information.
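For trigger timing 3) above, a minimal sketch of a periodic refresh, with the interval an assumed value rather than one taken from the text:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Hypothetical periodic acquisition of travel information, so that a fresh
// off-screen image is already composed before the screen turns off.
public class PeriodicTravelInfoRefresh {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(
                () -> System.out.println("refresh travel info and pre-compose the off-screen image"),
                0, 15, TimeUnit.MINUTES); // the 15-minute interval is an assumption
    }
}
```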
S202, the electronic device 100 generates a screen-off image according to the travel information.
Specifically, the electronic device 100 may acquire from the user database a plurality of UI elements for composing the screen-off image according to the travel information. The user database stores UI elements corresponding to the user's travel information, which may be divided into weather elements, time elements, location elements, window elements and the like. In a specific implementation, the off-screen display module in the electronic device 100 may obtain the UI elements from the user database according to the time information, location information, weather information and travel mode.
It will be appreciated that the UI elements used to compose the screen-off image may, in addition to being obtained from the user database inside the electronic device 100, be obtained from a server through the mobile communication module 150 and the wireless communication module 160. Alternatively, a UI element may be obtained from the gallery of the electronic device 100: the electronic device 100 may provide an option for the user to customize the screen-off image, and the user may select a picture from the gallery as a UI element for synthesizing the screen-off image, or further crop the picture, doodle on it, or adjust its display effect before using it as such a UI element.
UI elements of different types are located in different layers, and the electronic device 100 may combine the acquired UI elements in the different layers into a screen-off image by merging the layers. The electronic device 100 may use the image synthesis module to merge the UI elements into the screen-off image. Table 1 shows an example correspondence between UI elements, layers and travel information.
TABLE 1

Layer            | Travel information                                     | UI elements
Foreground layer | Travel mode: non-travel / airplane / train             | Shutter / airplane window / train window
Middle layer     | Weather: sunny, rainy, snowy, cloudy, foggy, sand dust | Weather elements (e.g. clouds, raindrops)
Background layer | Time: e.g. morning, daytime, night                     | Time elements (e.g. sun, moon)
As can be seen from Table 1, the layers in which the UI elements are located can be divided into three layers: a foreground layer, a middle layer and a background layer. The foreground layer contains the window elements corresponding to the travel modes: when the travel mode is the non-travel mode the window element may be a shutter, in the airplane travel mode an airplane window, and in the train travel mode a train window. The middle layer contains the weather elements corresponding to the weather, which may include UI elements for six weather conditions: sunny, rainy, snowy, cloudy, foggy and sand dust; for example, the weather element for a sunny day may be clouds. The background layer contains the time elements corresponding to the time. Specifically, the electronic device 100 may obtain the current specific time, for example 8:08, determine from the correspondence between UI elements and travel information that it is currently morning, and find the UI element corresponding to morning among the time elements.
For example, the electronic device 100 may further design the screen-off image according to the correspondence between layers and travel information shown in Table 2.
TABLE 2

Layer            | Capability 1 (dynamic AOD supported)                | Capability 2 (dynamic AOD not supported) | Determined by
Foreground layer | N static layers + N groups of dynamic switch layers | N static layers                          | Travel scene (geographic location, travel card)
Middle layer     | M weather-varying layers                            | M weather-varying layers                 | Weather
Background layer | L time-varying layers                               | L time-varying layers                    | System time
As can be seen from Table 2, the electronic device 100 may determine the screen-off image according to different capabilities, where a capability refers to the ability to display a dynamic screen-off image; for example, capability 1 may refer to supporting dynamic AOD, and capability 2 to not supporting dynamic AOD. Under both capability 1 and capability 2, the screen-off image may be divided into three layers: a foreground layer, a middle layer and a background layer.

The foreground layer includes N static layers, divided according to travel scenes. For example, when the travel scene is an airplane travel scene, the foreground layer may include one airplane window; when the travel scenes further include a train travel scene and a non-travel scene, the foreground layer may further include one train window and one shutter, respectively. The electronic device 100 may determine the user's travel scene, and thus the foreground layer, through the geographic location and the travel card. When there is no network and the electronic device 100 cannot acquire the current geographic location or the travel card, it may treat the user's travel scene as the non-travel scene or keep the most recently used scene.

The middle layer contains M layers that vary with the weather. For example, the weather may be divided into six types: sunny, rainy, snowy, cloudy, foggy and sand dust, each with its own layer; the layers for different weather types can be further subdivided by scene, for example, when the scenes include a train travel scene, an airplane travel scene and a non-travel scene, layers for the six weather types are determined separately for each scene. The electronic device 100 may determine the middle layer from the weather, and may omit this layer when the current weather conditions cannot be obtained.

The background layer may contain L layers that vary with time. For example, a day may be divided into five time periods: morning, noon, afternoon, evening and night, each period with its own background layer; these can likewise be further subdivided by travel scene, so that layers for the five time periods are determined separately for each travel scene. The electronic device 100 may determine the background layer from the system time; when the current time information cannot be acquired, a preset period, for example afternoon, may be used as the current time.

In addition, among the layers corresponding to capability 1 and capability 2, the foreground layer under capability 1 further includes N groups of dynamic switch layers, which can be switched according to a user operation, for example a click operation.
For example, when the travel scene is an airplane travel scene, the foreground layer may include two groups of sequential-frame images for opening and closing the airplane window; a click operation switches the window between open and closed, and the background layer may change after the window is reopened. The screen-off image corresponding to capability 1 may be the screen-off image that changes with the user's click operations shown in fig. 3C to 3E, and the screen-off image corresponding to capability 2 may be the screen-off image shown in fig. 3C or 3E.
It can be understood that Table 1 and Table 2 only show example correspondences between travel information and the UI elements in the layers, and do not limit the embodiments of the present application. In other embodiments, Table 1 and Table 2 may include more or fewer layers; the correspondence between UI elements and layers and between UI elements and travel information is not limited to that in Table 1; and the terms foreground layer, middle layer and background layer are only used to express the top-to-bottom relationship of the layers and do not otherwise limit them.
Illustratively, fig. 4 shows the process by which the electronic device 100 merges UI elements in different layers into a screen-off image. The layers included in the screen-off image may further include the information layer shown in fig. 4 (a), which may contain text information related to the time, date, notifications, battery level and the like. Fig. 4 (e) shows the screen-off image finally synthesized by the electronic device 100, which includes the text information of the information layer in fig. 4 (a), the window element of the foreground layer in fig. 4 (b), the weather element of the intermediate layer in fig. 4 (c), and the time element of the background layer in fig. 4 (d).
It should be noted that, because the electronic device 100 determines the UI elements corresponding to different travel information in different layers, UI elements of different types do not affect one another, which helps the electronic device 100 quickly update the screen-off image related to the user's travel information. For example, during the off-screen display, when the electronic device 100 detects that the weather changes, it may change only the weather elements in the middle layer, without re-selecting the UI elements of all the layers to compose the screen-off image.
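A minimal sketch of that selective update, with layer names and element strings assumed:

```java
import java.util.EnumMap;
import java.util.Map;

// Hypothetical selective update: only the middle (weather) layer is replaced
// when the weather changes; the other layers are reused unchanged.
public class SelectiveLayerUpdate {
    enum LayerType { BACKGROUND, MIDDLE, FOREGROUND, INFORMATION }

    static void onWeatherChanged(Map<LayerType, String> layers, String newWeatherElement) {
        layers.put(LayerType.MIDDLE, newWeatherElement); // only this layer changes
        // ...the layers would then be re-merged to refresh the off-screen image...
    }

    public static void main(String[] args) {
        Map<LayerType, String> layers = new EnumMap<>(LayerType.class);
        layers.put(LayerType.BACKGROUND, "blue sky, sun");
        layers.put(LayerType.MIDDLE, "clouds");
        layers.put(LayerType.FOREGROUND, "aircraft window");
        layers.put(LayerType.INFORMATION, "08:08, Aug 25");
        onWeatherChanged(layers, "raindrops, clouds");     // weather turned rainy
        System.out.println(layers.get(LayerType.MIDDLE));  // raindrops, clouds
    }
}
```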
S203, the electronic device 100 displays the screen-off image in the off-screen state.
After determining the screen-off image according to the travel information, the electronic device 100 may display the screen-off image in the off-screen state. In a specific implementation, the electronic device 100 may invoke the interface display module to display the screen-off image. As shown in fig. 3C, the image shown in the user interface 31 is a screen-off image generated by the electronic device 100.
In addition, the screen-off image may be a dynamic or a static image: when the screen-off image is dynamic, the UI elements constituting it are composed of sequential frames; when it is static, they are composed of static images.
Whether the electronic device 100 displays a dynamic or a static off-screen image may be determined in the following two ways:
1) Electronic device 100 detects user selection of a dynamic or static off-screen image
That is, the electronic device 100 may provide a control for choosing a dynamic or static off-screen image, and detect the user's operation of that control to determine whether the user selects the dynamic or the static image.
2) The electronic device 100 determines to display a dynamic or static off-screen image based on its own capabilities
The capability may refer to the hardware capability, battery level and the like of the electronic device 100; for example, when the CPU computing capability of the electronic device 100 is weak, or its battery level is low, the electronic device 100 may display only static off-screen images.
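Combining the two ways above, a minimal sketch of the dynamic-versus-static decision, with the battery threshold an assumed value:

```java
// Hypothetical decision between a dynamic (sequential-frame) and a static off-screen image.
public class DynamicOrStatic {
    static boolean useDynamicImage(boolean userPrefersDynamic,
                                   boolean supportsDynamicAod,
                                   int batteryPercent) {
        // Fall back to a static image when the device cannot support dynamic AOD
        // or the battery is low, regardless of the user's preference.
        return userPrefersDynamic && supportsDynamicAod && batteryPercent > 20;
    }

    public static void main(String[] args) {
        System.out.println(useDynamicImage(true, true, 80));  // true  -> sequential frames
        System.out.println(useDynamicImage(true, false, 80)); // false -> static image
    }
}
```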
In some embodiments, while the electronic device 100 displays the off-screen image, it may also detect a user operation (e.g. a click operation) on the off-screen image, in response to which the electronic device 100 alters the off-screen image. As shown in fig. 3C, when the electronic device 100 detects a touch operation performed by the user on the image display area 311, it displays the image shown in the image display area 311 in fig. 3D, which simulates the scene after the window is closed. Further, when the electronic device 100 again detects a touch operation by the user on the image display area 311, it displays the image shown in the image display area 311 in fig. 3E, which simulates the scene after the window is opened again. Thus, even in the off-screen state, the electronic device 100 can still receive user operations to change the off-screen image and simulate the opening and closing of the window, making the off-screen display more engaging and enhancing the interactivity between the user and the electronic device 100.
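A minimal sketch of this tap-driven toggle between the states of fig. 3C to 3E; the state handling and return strings are assumptions.

```java
// Hypothetical toggle of the simulated window on each tap in the picture display region.
public class WindowToggle {
    private boolean windowOpen = true; // fig. 3C: the window starts open

    // Called on each user tap on the picture display region.
    String onTap() {
        windowOpen = !windowOpen;
        // Open: scenery, sun and cloud elements are shown; closed: they are hidden.
        return windowOpen ? "show scenery/sun/cloud layers (possibly with a new scene)"
                          : "hide scenery/sun/cloud layers";
    }

    public static void main(String[] args) {
        WindowToggle t = new WindowToggle();
        System.out.println(t.onTap()); // first tap closes the window (fig. 3D)
        System.out.println(t.onTap()); // second tap reopens it with an updated scene (fig. 3E)
    }
}
```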
It can be understood that the display method provided by the embodiment of the application can be applied not only to the user's travel scenes but also to other scenes, such as the user's work scenes; that is, different off-screen images may be determined according to the user's different work scenes. In a work scenario, the electronic device 100 may determine the user's work mode, which may include a weekend rest mode, a company office mode, a business-trip office mode, a vacation mode and the like, and then determine different UI elements according to the work mode to generate the off-screen image. In this way, the off-screen image can reflect the user's working state, providing a personalized off-screen image for the user.
Fig. 9 is an exemplary flow chart of layer stacking according to an embodiment of the present application.
As shown in fig. 9, the stacking of layers mainly involves the following five modules: the image synthesis module, the time management module, the position management module, the weather management module and the interface display module. The image synthesis module is configured to synthesize the screen-off image from the different layers; the time management module, the position management module and the weather management module are respectively configured to obtain time, position and weather information; and the interface display module is configured to display the screen-off image. Details about these five modules can be found in the related content of fig. 6 and are not repeated here.
S301, the image synthesis module acquires a travel mode.
Specifically, the image synthesis module may acquire a travel mode through a travel card of the user. Here, the travel card of the user refers to an interface image that is automatically generated by the electronic device 100 for the user according to the travel information provided in the travel class application and used for displaying the travel of the user. Thus, the user can acquire the travel information of the user through the travel card without opening the travel application.
S302, the image synthesis module determines a foreground layer of the screen-off image according to the travel mode.
Specifically, the image synthesis module may obtain a UI element related to the travel mode and use it as part of the screen-off image to be displayed; for example, when the travel mode is the airplane travel mode, the image synthesis module may obtain an airplane picture and splice it into the screen-off image. In addition, the image synthesis module may assemble the foreground layer of the screen-off image according to the travel mode; the foreground layer may refer to the layer shown in fig. 4 (b), which is used to superimpose the outer contour corresponding to the travel mode. For example, when the travel mode is the airplane travel mode, the image synthesis module may select an airplane window in the foreground layer of the screen-off image for splicing into the screen-off image.
S303, the image synthesis module calls the time management module to acquire time information, and determines a background layer of the screen-off image according to the time information.
Specifically, after the image synthesis module obtains the time information, it may obtain the UI element corresponding to the time information, that is, a time element, and use the time element as part of the screen-off image to be displayed. For example, when the time is daytime, the image synthesis module may splice the sun into the screen-off image as an element of the background layer. The background layer may refer to the layer shown in fig. 4 (d), and the UI element corresponding to the time information may be the sun shown in fig. 4 (d).
S304, the image synthesis module calls the position management module to acquire position information, and determines a background layer of the screen-off image according to the position information.
Specifically, after the image synthesis module acquires the position information, it may acquire the UI element corresponding to the position information, that is, a position element, and use the position element as part of the screen-off image to be displayed. For example, when the position information indicates that the user is currently located in a metropolitan area, the image synthesis module may splice pictures related to that area into the screen-off image as elements of the background layer. The background layer may refer to the layer shown in fig. 4 (d), and the UI element corresponding to the position information may be the scenery picture shown in fig. 4 (d).
S305, the image synthesis module calls the weather management module to acquire weather information, and determines the middle layer of the screen-off image according to the weather information.
Specifically, after the image synthesis module obtains the weather information, it may obtain the UI element corresponding to the weather information, that is, a weather element, and use the weather element as part of the screen-off image to be displayed. For example, when the weather is cloudy, the image synthesis module may splice clouds into the screen-off image as elements of the middle layer. The middle layer may refer to the layer shown in fig. 4 (c), and the UI element corresponding to the weather information may be the clouds shown in fig. 4 (c).
S306, the image synthesis module superimposes the image layers to obtain a screen-off image.
The image synthesis module can combine the UI elements determined for the foreground layer, the middle layer and the background layer to obtain the screen-off image. Further, the screen-off image may also include an information layer used to display text information such as the time and date; in that case the image synthesis module also merges the information layer when synthesizing the screen-off image. The screen-off image finally obtained by the image synthesis module may refer to the image shown in fig. 4 (e), and the information layer may refer to the layer shown in fig. 4 (a).
S307, the image synthesis module sends the screen-off image to the interface display module.
S308, the interface display module displays the screen-off image.
When the electronic device 100 is in the off-screen state, the interface display module may display the synthesized screen-off image. In this way, the user can obtain information such as the travel mode, time, position and weather through the screen-off image, which provides an engaging off-screen experience for the user.
Fig. 10 schematically shows a flow chart of another display method.
As shown in fig. 10, the method includes:
S401, the electronic device 100 acquires the user's travel mode and travel time.
The user's travel mode indicates whether the user travels at the travel time and, if so, the vehicle used at the travel time; specifically, the travel mode may include a non-travel mode, an airplane travel mode, a train travel mode, a self-driving travel mode, a taxi travel mode, a bus travel mode, and the like.
Specifically, the electronic device 100 may acquire the travel mode and travel time of the user through the user travel card. Here, the travel card of the user refers to an interface image that is automatically generated by the electronic device 100 for the user according to the travel information provided in the travel class application and used for displaying the travel of the user. Thus, the user can acquire the travel information of the user through the travel card without opening the travel application.
The travel card records the user's travel mode and travel time. The electronic device 100 may be triggered to acquire the travel mode and travel time from the travel card upon detecting the user's screen-off operation.
In addition, the electronic device 100 may trigger execution of step S401 after receiving the screen-off instruction.
S402, the electronic device 100 determines whether the screen-off time matches the travel time.
After the electronic device 100 detects the user's screen-off operation, it determines whether the screen-off time, that is, the current time at which the screen-off operation is detected, falls within the travel time. Alternatively, the screen-off time may refer to a time at which the electronic device 100 has determined in advance that it may turn off the screen. Specifically, the user's screen-off operation may refer to the user's operation on the power key.
That is, the screen-off time may refer to the time when the electronic device 100 receives the screen-off instruction, or to a time preset by the electronic device 100. This is because the electronic device 100 may turn off the screen passively according to the user's operation on the power key, or actively when it receives no user operation within a preset time. Accordingly, the electronic device 100 may execute the following steps S403 to S406, that is, complete the synthesis of the screen-off image, after receiving the user's screen-off operation; or it may determine in advance the time at which it may turn off the screen and execute steps S403 to S406 before that time.
In the embodiment of the application, the screen-off time can also refer to the first time.
If the electronic device 100 determines that the off-screen time matches the travel time, step S403 is executed, otherwise step S407 is executed.
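The matching test of step S402 is a simple time-window check; a minimal sketch using the 8:00-11:00 example from the text:

```java
import java.time.LocalTime;

// Hypothetical check: does the screen-off time fall inside the travel time window?
public class TravelTimeMatch {
    static boolean matches(LocalTime offScreenTime, LocalTime start, LocalTime end) {
        return !offScreenTime.isBefore(start) && !offScreenTime.isAfter(end);
    }

    public static void main(String[] args) {
        LocalTime start = LocalTime.of(8, 0), end = LocalTime.of(11, 0);
        System.out.println(matches(LocalTime.of(9, 0), start, end));  // true  -> proceed to S403
        System.out.println(matches(LocalTime.of(12, 0), start, end)); // false -> proceed to S407
    }
}
```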
S403, the electronic device 100 acquires travel information such as the time, position, weather and altitude.
The electronic device 100 may acquire travel information through the information of an application, information collected by hardware, information actively input by the user, or information collected by other devices. The manner in which the electronic device 100 obtains travel information has been described above and is not repeated here.
S404, the electronic device 100 determines the UI element according to travel information such as time, position, weather, altitude, travel mode and the like.
Specifically, the electronic device 100 may include a database containing a large number of UI elements, from which the electronic device 100 may obtain the UI elements corresponding to travel information such as the time, position, weather, altitude and travel mode through the correspondence between travel information and UI elements; for this correspondence, refer to Table 1. Alternatively, the electronic device 100 may upload the travel information to a server and acquire the corresponding UI elements from the server.
S405, the electronic device 100 synthesizes the screen-off image from the UI elements.
The electronic device 100 may use a layer-stacking method: the UI elements are placed on different layers, and the layers are stacked together to compose the screen-off image. For details of how the electronic device 100 synthesizes the screen-off image, see FIG. 4 and the related description of step S202, which are not repeated here.
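Layer stacking itself is straightforward to sketch with the standard android.graphics API: draw each layer bottom-to-top onto one mutable bitmap, so that later layers cover earlier ones. Image dimensions and layer ordering are assumptions of the sketch.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas

// Step S405, sketched: compose the screen-off image by stacking layer bitmaps.
// The list is assumed to be ordered bottom layer first, top layer last.
fun composeScreenOffImage(layers: List<Bitmap>, width: Int, height: Int): Bitmap {
    val result = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(result)
    for (layer in layers) {
        canvas.drawBitmap(layer, 0f, 0f, null) // each layer covers the ones below
    }
    return result
}
```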
S406, the electronic device 100 determines whether the synthesis is successful.
If the electronic device 100 successfully synthesizes the screen-off image, step S407 is performed; otherwise, the method returns to step S403.

That is, after attempting to synthesize the screen-off image, the electronic device 100 may determine whether the synthesis succeeded. If it succeeded, the screen-off image is displayed; if not, the travel information is acquired again and the screen-off image is re-synthesized, until synthesis succeeds.
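The S406 check then becomes a retry loop. The sketch below bounds the retries for safety, whereas the text above loops until synthesis succeeds; both function parameters are hypothetical stand-ins for steps S403 and S405.

```kotlin
// Steps S403 to S406 as a retry loop: on failure, re-acquire the travel
// information and re-synthesize; on success, hand the image to step S407.
fun <I, R : Any> retryUntilSynthesized(
    maxAttempts: Int,
    acquire: () -> I,        // stand-in for S403
    synthesize: (I) -> R?    // stand-in for S405; null signals failure
): R? {
    repeat(maxAttempts) {
        synthesize(acquire())?.let { return it }
    }
    return null              // caller may fall back to the previous image
}
```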
S407, the electronic device 100 displays the screen-off image.
Here, the screen-off image displayed by the electronic device 100 refers to the most recently synthesized screen-off image. Specifically, when the electronic device 100 determines in step S402 that the screen-off time matches the travel time, the displayed screen-off image is the one generated from the travel information acquired in steps S403 to S405; when the electronic device 100 determines in step S402 that the screen-off time does not match the travel time, the displayed screen-off image is one generated from previously acquired travel information. In the embodiments of this application, the screen-off image displayed when the current screen-off time matches the travel time may be referred to as the first screen-off image, and the screen-off image displayed when it does not match may be referred to as the second screen-off image.
In addition, when the screen-off image includes time information displayed in real time, displaying the screen-off image further includes overlaying the time information onto the screen-off image and updating it continuously while the image is displayed, ensuring that the screen-off image displayed by the electronic device 100 always shows the user the current time.
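The real-time clock overlay can be pictured as redrawing only the time text over the otherwise static composition; coordinates, text styling, and the refresh cadence are assumptions of this sketch.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import java.time.LocalTime
import java.time.format.DateTimeFormatter

// Overlay the current time onto a copy of the static screen-off image; call
// periodically (e.g. once a minute) so the displayed time stays up to date.
fun overlayCurrentTime(base: Bitmap): Bitmap {
    val frame = base.copy(Bitmap.Config.ARGB_8888, /* isMutable = */ true)
    val paint = Paint().apply {
        color = Color.WHITE
        textSize = 48f
        isAntiAlias = true
    }
    val now = LocalTime.now().format(DateTimeFormatter.ofPattern("HH:mm"))
    Canvas(frame).drawText(now, 40f, 80f, paint)
    return frame
}
```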
The embodiments of the present application may be arbitrarily combined to achieve different technical effects.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid-state disk (SSD)).
Those of ordinary skill in the art will appreciate that all or part of the flows of the above method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the flows of the above method embodiments. The aforementioned storage medium includes a ROM, a random access memory (RAM), a magnetic disk, an optical disk, or the like.
In summary, the foregoing descriptions are merely exemplary embodiments of the present invention and are not intended to limit its protection scope. Any modification, equivalent replacement, or improvement made according to the disclosure of the present invention shall fall within the protection scope of the present invention.

Claims (11)

1. A display method, the method comprising:
an electronic device receives a screen-off instruction at a first time;
the electronic device acquires a first travel mode, wherein the first travel mode indicates that a user travels by airplane during a first travel time;
and when the first time is within the first travel time, the electronic device acquires the following items of information: the first time, the weather at the first time, the position of the electronic device at the first time, and the altitude of the electronic device at the first time, and generates a first screen-off image according to the plurality of items of information and the first travel mode, wherein the first screen-off image displays a plurality of interface elements stacked one above another and a flight progress bar located outside the plurality of interface elements, the flight progress bar indicating the progress of the user's journey by airplane; the plurality of interface elements comprise an interface element stacked on the uppermost layer and indicating the first travel mode, an interface element stacked on the middle layer and indicating the weather, and an interface element stacked on the lowermost layer and indicating the first time, the position, and the altitude; the interface element stacked on the uppermost layer comprises an open airplane window; and the plurality of interface elements are selected from a database according to the plurality of items of information and the first travel mode and comprise pictures;
in response to the screen-off instruction, the electronic device turns off the screen and displays the first screen-off image;
the electronic device receives a first operation acting on the first screen-off image and, in response to the first operation, updates the first screen-off image, wherein the updated first screen-off image displays a closed airplane window, does not display the interface element indicating the weather, and displays the interface element indicating the first time, the position, and the altitude.
2. The method of claim 1, wherein the electronic device displaying the first screen-off image specifically comprises:
the electronic device lights up a partial area of the display screen and displays the first screen-off image in the partial area.
3. The method of claim 1, wherein the first screen-off image is composed of a first interface element and a second interface element located on different layers, and the electronic device generating the first screen-off image according to the plurality of items of information and the first travel mode specifically comprises:
the electronic device acquires a first interface element indicating the plurality of items of information and a second interface element indicating the first travel mode;
and the electronic device merges the first interface element and the second interface element to obtain the first screen-off image.
4. The method of claim 3, wherein the electronic device pre-stores a plurality of interface elements, the plurality of interface elements including the first interface element and the second interface element.
5. The method of claim 1, wherein before the electronic device acquires the first travel mode, the method further comprises:
the electronic device acquires travel information of the user through a travel application;
the electronic device generates a user travel card according to the travel information, wherein the user travel card indicates the first travel mode;
and the electronic device acquiring the first travel mode specifically comprises:
the electronic device acquires the first travel mode of the user from the user travel card.
6. The method according to claim 1, wherein the method further comprises:
when the first time is not within the first travel time, in response to the screen-off instruction, the electronic device displays a second screen-off image, wherein the second screen-off image is an image generated by the electronic device according to a second travel mode, the second travel mode indicates whether the user travels during a second travel time, or a vehicle used by the user during the second travel time, and the second travel time is before the first travel time.
7. The method of claim 1, wherein the electronic device acquiring one or more of the first time, the weather at the first time, the position of the electronic device at the first time, or the altitude of the electronic device at the first time specifically comprises:
the electronic device acquires one or more items of the above information from data collected by hardware, from a first application, or from a device that has established a connection with the electronic device.
8. The method of claim 1, wherein the electronic device entering a screen-off state at the first time specifically comprises:
the electronic device receives a second operation by the user on the power key at the first time and enters the screen-off state in response to the second operation; or the electronic device receives no user operation within a preset period and enters the screen-off state at the first time.
9. The method of any of claims 1-8, wherein before the electronic device enters the screen-off state at the first time, the method further comprises:
the electronic device displays a user interface provided by a settings application;
the electronic device receives a third operation acting on a screen-off display option in the user interface, wherein the third operation is used to trigger the electronic device to display the first screen-off image when the screen is off.
10. An electronic device, comprising: a memory, one or more processors, a plurality of applications, and one or more programs; wherein the one or more programs are stored in the memory; wherein the one or more processors, when executing the one or more programs, cause the electronic device to implement the method of any of claims 1-9.
11. A computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1 to 9.

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106557292A (en) * 2016-11-11 2017-04-05 珠海市魅族科技有限公司 Method for information display and device
CN106911853A (en) * 2017-02-28 2017-06-30 广州三星通信技术研究有限公司 For the display control method and equipment of electric terminal
CN110149442A (en) * 2019-04-10 2019-08-20 华为技术有限公司 A kind of control method and terminal device for putting out screen display
WO2021115481A1 (en) * 2019-12-13 2021-06-17 深圳市万普拉斯科技有限公司 Terminal control method and device, terminal and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106557292A (en) * 2016-11-11 2017-04-05 珠海市魅族科技有限公司 Method for information display and device
CN106911853A (en) * 2017-02-28 2017-06-30 广州三星通信技术研究有限公司 For the display control method and equipment of electric terminal
CN110149442A (en) * 2019-04-10 2019-08-20 华为技术有限公司 A kind of control method and terminal device for putting out screen display
CN113411445A (en) * 2019-04-10 2021-09-17 华为技术有限公司 Control method for screen-off display and terminal equipment
WO2021115481A1 (en) * 2019-12-13 2021-06-17 深圳市万普拉斯科技有限公司 Terminal control method and device, terminal and storage medium

Also Published As

Publication number Publication date
CN113973153A (en) 2022-01-25

Similar Documents

Publication Publication Date Title
US11892299B2 (en) Information prompt method and electronic device
EP3072008B1 (en) Head-mounted display device and method of changing light transmittance of the same
CN109672776B (en) Method and terminal for displaying dynamic image
KR20140049850A (en) Method for operating a mobile terminal
CN114003324A (en) Method for combining multiple applications and simultaneously starting multiple applications and electronic equipment
CN115348350B (en) Information display method and electronic equipment
CN113761427A (en) Method for generating card in self-adaptive mode, terminal device and server
CN113050841A (en) Method, electronic equipment and system for displaying multiple windows
CN112612386B (en) Mobile terminal and display method of application card thereof
CN116152122B (en) Image processing method and electronic device
CN111105474B (en) Font drawing method, font drawing device, computer device and computer readable storage medium
US20170337727A1 (en) Digital surface rendering
CN113157357A (en) Page display method, device, terminal and storage medium
CN112905280B (en) Page display method, device, equipment and storage medium
CN114513574A (en) Interface display method, electronic device and storage medium
CN113973153B (en) Display method, graphical interface and related device
CN114003827A (en) Weather information display method and device and electronic equipment
CN116055629B (en) Method for identifying terminal state, electronic equipment, storage medium and chip
KR20190053489A (en) Method for controlling mobile terminal supplying virtual travel survey service using pictorial map based on virtual reality
CN115686700A (en) Rendering method and electronic equipment
CN113963086B (en) Screen-off display method and electronic equipment
CN116688494B (en) Method and electronic device for generating game prediction frame
WO2023125795A1 (en) Display method, user interface, and electronic device
CN114816622B (en) Scene picture display method and device, electronic equipment and storage medium
US20230114178A1 (en) Image display method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant