CN114115772B - Method and device for off-screen display

Info

Publication number: CN114115772B
Application number: CN202110827202.3A
Authority: CN (China)
Other languages: Chinese (zh)
Other versions: CN114115772A
Inventor: 王彦恒
Current and original assignee: Honor Device Co., Ltd.
Legal status: Active
Prior art keywords: target, screen, user, scene, display
Application filed by Honor Device Co., Ltd.; priority to CN202110827202.3A
Publication of CN114115772A; application granted; publication of CN114115772B


Classifications

    • G06F3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F3/0412: Digitisers, e.g. for touch screens or touch pads, structurally integrated in a display
    • G06F3/147: Digital output to display device using display panels
    • G06F9/449: Object-oriented method invocation or resolution
    • G06F9/451: Execution arrangements for user interfaces
    • Y02D30/70: Reducing energy consumption in wireless communication networks

Abstract

The application provides a method and a device for off-screen display. The method is applied to an electronic device with a display screen and includes the following steps: acquiring a first instruction of a user, where the first instruction indicates a target always-on display (AOD) resource package for off-screen display, the target AOD resource package includes a target application package (APK), and the target APK is used for processing data of the user; calling the target APK according to the target AOD resource package; and when the electronic device is off-screen, acquiring the data in the target APK and performing off-screen display on the display screen. According to this technical scheme, the user can obtain required information without lighting up and unlocking the electronic device, which improves the user experience.

Description

Method and device for off-screen display
Technical Field
The application relates to the field of terminals, in particular to a method and a device for off-screen display.
Background
Off-screen display (always on display, AOD), also called always-on display, means that after an electronic device (e.g., a mobile phone or a tablet computer) turns off its screen, part of the screen can be lit up to display information such as a clock, a date, and notifications, making the device easier to use and improving the user experience.
Currently, if a user needs to view information from an electronic device in a specific scene, the user needs to unlock the electronic device and enter a specific application (App) to obtain the required information. For example, a user who is running may want exercise information; the user then needs to unlock the electronic device and obtain step count, calorie consumption, or other exercise information from a sports application. However, while running it is inconvenient for the user to perform a series of operations such as unlocking the electronic device and entering the sports application, resulting in a poor user experience.
Disclosure of Invention
The application provides a method and a device for off-screen display, which can solve the problem that obtaining required information involves complex operations when the electronic device is off-screen.
In a first aspect, a method for off-screen display is provided. The method is applied to an electronic device with a display screen and includes: acquiring a first instruction of a user, where the first instruction indicates a target AOD resource package for off-screen display, the target AOD resource package includes a target application package (APK), and the target APK is used for processing data of the user; calling the target APK according to the target AOD resource package; and when the electronic device is off-screen, acquiring the data in the target APK and performing off-screen display on the display screen.
Based on this technical scheme, a user can select a target theme as required, where the target theme corresponds to a target AOD resource package for off-screen display; according to the target AOD resource package selected by the user, a display interface corresponding to the target AOD resource package is displayed while the screen is off. The user can therefore obtain the required information without lighting up and unlocking the electronic device, which improves the user experience.
The off-screen display method in the embodiments of the application can be applied to an organic light-emitting diode (OLED) display screen, in which individual pixels can emit light; off-screen display may mean that only part of the OLED display screen is lit, while areas corresponding to black pixels remain unlit.
With reference to the first aspect, in certain implementation manners of the first aspect, the target AOD resource package includes a scene description file, where the scene description file includes a calling method of the target APK, and calling the target APK according to the target AOD resource package includes:
parsing the scene description file to obtain the calling method of the target APK; and calling the target APK according to the calling method.
In one possible implementation manner, the scene description file can be parsed by an extensible markup language (Extensible Markup Language, XML) parsing tool to obtain the calling method of the target APK; the target APK is then called according to its calling method.
Based on the technical scheme of the application, the target AOD resource package can also comprise a scene description file, wherein the scene description file comprises a calling method of the target APK; the calling method of the target APK can be obtained by analyzing the scene description file; calling the target APK according to a calling method of the target APK, so that the target APK obtains user data and performs data processing; when the electronic equipment is off-screen, acquiring data in the target APK and performing off-screen display on the display screen, so that user experience is improved.
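As an illustration of the parse-and-call step above, the following is a minimal sketch in Java. It assumes a hypothetical scene description layout: the tag and attribute names (scene, apk, package, entryClass, entryMethod) are inventions for this example, since the patent does not specify the file schema.

```java
import java.io.File;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.w3c.dom.Element;

/**
 * Minimal sketch: parse a scene description file to recover how to call the
 * target APK. The tag and attribute names are illustrative assumptions, not
 * the actual schema used by the patent.
 */
public final class SceneDescriptionParser {

    /** Calling method recovered from the scene description file. */
    public static final class ApkCallInfo {
        public final String packageName;
        public final String entryClass;
        public final String entryMethod;

        ApkCallInfo(String packageName, String entryClass, String entryMethod) {
            this.packageName = packageName;
            this.entryClass = entryClass;
            this.entryMethod = entryMethod;
        }
    }

    public static ApkCallInfo parse(File sceneXml) throws Exception {
        // Assumed layout:
        // <scene name="running">
        //   <apk package="com.example.aod.running"
        //        entryClass=".RunningAodService"
        //        entryMethod="start"/>
        // </scene>
        DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(sceneXml);
        Element apk = (Element) doc.getElementsByTagName("apk").item(0);
        if (apk == null) {
            throw new IllegalStateException("scene description has no <apk> element");
        }
        return new ApkCallInfo(
                apk.getAttribute("package"),
                apk.getAttribute("entryClass"),
                apk.getAttribute("entryMethod"));
    }
}
```

The off-screen display service could then hand the recovered ApkCallInfo to whatever platform mechanism loads and starts the target APK.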
With reference to the first aspect, in certain implementation manners of the first aspect, the target APK is further configured to obtain, by using a wearable device of the user, data of the user.
In one possible implementation manner, taking an AOD resource package for the running scene as an example of the target AOD resource package, the target APK may obtain data such as the user's heart rate and step count through the user's wearable device.
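A sketch of how the target APK might cache such data as it arrives from a paired wearable follows. The WearableLink interface and its callbacks are assumptions for this example; the patent does not specify the transport (e.g., Bluetooth) or the API.

```java
/**
 * Illustrative sketch only: how a target APK might cache heart-rate and step
 * data pushed from a paired wearable. WearableLink and its callbacks are
 * assumptions, not a real platform API.
 */
public final class RunningDataCollector {

    /** Hypothetical link to the paired wearable device. */
    public interface WearableLink {
        void setListener(Listener listener);

        interface Listener {
            void onHeartRate(int beatsPerMinute);
            void onStepCount(long totalSteps);
        }
    }

    private volatile int latestHeartRate;
    private volatile long latestSteps;

    public RunningDataCollector(WearableLink link) {
        link.setListener(new WearableLink.Listener() {
            @Override public void onHeartRate(int bpm) { latestHeartRate = bpm; }
            @Override public void onStepCount(long steps) { latestSteps = steps; }
        });
    }

    // Snapshot values read by the off-screen display service while the screen is off.
    public int heartRate() { return latestHeartRate; }
    public long steps()    { return latestSteps; }
}
```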
With reference to the first aspect, in certain implementation manners of the first aspect, the target APK is further configured to instruct the user to configure a target parameter of the electronic device off-screen display.
In one possible implementation, if the target AOD resource package selected by the user requires parameter settings, the target application package may further guide the user to configure the target parameters.
For example, if the target AOD resource package selected by the user is an AOD resource package of the running scene, the user may set the target distance in the interface displayed in the running scene.
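For instance, target-parameter handling under the running theme might look like the following sketch, where the field name and the 5 km default are illustrative assumptions.

```java
/**
 * Sketch of target-parameter handling under the running theme. The field
 * name and the 5 km default are illustrative assumptions.
 */
public final class RunningSceneConfig {

    /** Target distance in metres, set by the user in the scene setup interface. */
    private double targetDistanceMetres = 5_000.0;

    public void setTargetDistanceMetres(double metres) {
        if (metres <= 0) {
            throw new IllegalArgumentException("target distance must be positive");
        }
        targetDistanceMetres = metres;
    }

    /** Progress in [0, 1], suitable for a progress ring on the off-screen interface. */
    public double progress(double distanceRunMetres) {
        return Math.min(1.0, distanceRunMetres / targetDistanceMetres);
    }
}
```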
With reference to the first aspect, in certain implementation manners of the first aspect, the target AOD resource package further includes a preview image, where the preview image is used to display a preview interface of the electronic device for off-screen.
It should be understood that, in embodiments of the present application, a target theme may correspond to a target AOD resource package.
In one possible implementation, when a user selects a target theme, the off-screen display controller may invoke the target AOD resource package corresponding to the target theme; the target AOD resource package may include a target theme description file, a target theme preview image file, and a target application package file corresponding to the target theme. The target theme description file may be used to describe the name and classification of the target theme and to call the target application package file corresponding to the target theme; the target theme preview image file is used for displaying the off-screen preview interface corresponding to the target theme; and the target application package may be used to guide the user in setting target parameters under the target theme and to process the user's data under the target theme.
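The three-part package structure described above can be modeled as in the sketch below; the concrete file names (scene.xml, preview.png, scene.apk) are assumptions, since the text only requires that a description file, a preview image, and an APK exist.

```java
import java.io.File;

/**
 * Sketch of the three-part AOD resource package described above. The concrete
 * file names are assumptions made for illustration.
 */
public final class AodResourcePackage {

    public final File sceneDescription; // theme name, classification, APK calling method
    public final File previewImage;     // off-screen preview interface for the theme
    public final File apk;              // guides parameter setup and processes user data

    public AodResourcePackage(File packageDir) {
        sceneDescription = new File(packageDir, "scene.xml");
        previewImage     = new File(packageDir, "preview.png");
        apk              = new File(packageDir, "scene.apk");
    }
}
```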
With reference to the first aspect, in certain implementation manners of the first aspect, the method further includes:
and displaying a theme list, wherein the theme list is used for determining a target theme of the electronic equipment off-screen display by the user, and the target theme corresponds to the target AOD resource package.
With reference to the first aspect, in certain implementation manners of the first aspect, the theme list includes a theme classification, where the theme classification is used to indicate the category corresponding to each theme in the theme list.
In one possible implementation, the scene description file includes a first parameter, where the first parameter is used to indicate whether the target application package is running when the electronic device is on-screen.
According to this technical scheme, for themes whose target application package needs to keep running, the first parameter can be set in the scene description file to indicate that the target application package continues to run while the screen of the electronic device is on. For example, when the target theme is a running theme, the electronic device can continuously acquire and process the user's data in the running scene; when the electronic device turns off its screen, it acquires data from the target APK corresponding to the running theme and displays the user's running data.
It should be understood that, in general, the target application package runs while the electronic device is off-screen; however, for off-screen themes that need to run continuously, the first parameter can also be used to determine that the target application package runs while the screen is on and caches the corresponding data.
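A sketch of this decision follows, assuming the first parameter is surfaced as a boolean; the name keepRunningOnScreenOn is an invention for this example.

```java
/**
 * Sketch of the first-parameter check described above. Exposing the first
 * parameter as the boolean keepRunningOnScreenOn is an assumption made for
 * illustration.
 */
public final class ScenePolicy {

    private final boolean keepRunningOnScreenOn;

    public ScenePolicy(boolean keepRunningOnScreenOn) {
        this.keepRunningOnScreenOn = keepRunningOnScreenOn;
    }

    /** Decide whether the target APK should run for the given screen state. */
    public boolean shouldRun(boolean screenIsOn) {
        // Normally the target APK runs only while the screen is off. Themes
        // that must collect data continuously (e.g. running) set the first
        // parameter so the APK also runs, and caches data, while the screen is on.
        return !screenIsOn || keepRunningOnScreenOn;
    }
}
```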
In one possible implementation, the target AOD resource package corresponding to the off-screen scenario may be data configured in the electronic device when the electronic device leaves the factory.
In one possible implementation, the data in the target AOD resource package corresponding to the off-screen scenario may also be obtained from an open database. When the AOD resource package is obtained from an open database, the data needs to conform to a standard format; that is, the AOD resource package includes at least a scene description file, a scene preview image file, and an APK file.
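A minimal sketch of such a standard-format check follows; identifying the three required files by suffix is an assumption made for illustration.

```java
import java.io.File;

/**
 * Sketch of the standard-format check described above for AOD resource
 * packages fetched from an open database: the package must contain at least
 * a scene description file, a scene preview image file, and an APK file.
 */
public final class AodPackageValidator {

    public static boolean isWellFormed(File packageDir) {
        return hasFileWithSuffix(packageDir, ".xml")   // scene description file
            && hasFileWithSuffix(packageDir, ".png")   // scene preview image file
            && hasFileWithSuffix(packageDir, ".apk");  // APK file
    }

    private static boolean hasFileWithSuffix(File dir, String suffix) {
        File[] files = dir.listFiles();
        if (files == null) {
            return false; // not a directory, or unreadable
        }
        for (File f : files) {
            if (f.isFile() && f.getName().endsWith(suffix)) {
                return true;
            }
        }
        return false;
    }
}
```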
In a second aspect, there is provided an apparatus for off-screen display, comprising units for performing any one of the methods of the first aspect.
Alternatively, the apparatus may be a terminal device. The apparatus may include an input unit and a processing unit.
In one possible implementation, when the apparatus is a terminal device, the processing unit may be a processor, and the input unit may be a communication interface; the terminal device may further comprise a memory for storing computer program code which, when executed by the processor, causes the terminal device to perform any of the methods of the first aspect.
In a third aspect, there is provided a computer readable storage medium storing computer program code which, when run by an off-screen displayed apparatus, causes the apparatus to perform any one of the methods of the first aspect.
In a fourth aspect, there is provided a computer program product comprising: computer program code which, when run by an off-screen displayed apparatus, causes the apparatus to perform any of the methods of the first aspect.
Drawings
FIG. 1 is a schematic diagram of a hardware system suitable for use in the apparatus of the present application;
FIG. 2 is a schematic diagram of a software system suitable for use with the apparatus of the present application;
FIG. 3 is a schematic diagram of a system architecture according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a method for off-screen display according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a method for invoking a target application package provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of a method for off-screen display in a running scenario according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a selection off-screen scene interface according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a selection off-screen scene interface provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of a selection off-screen scene interface provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of a selection off-screen scene interface provided by an embodiment of the present application;
FIG. 11 is a schematic diagram of an off-screen display interface according to an embodiment of the present application;
FIG. 12 is a schematic diagram of an off-screen display interface according to an embodiment of the present application;
FIG. 13 is a schematic diagram of an off-screen display interface according to an embodiment of the present application;
FIG. 14 is a schematic view of an off-screen display device according to the present application;
FIG. 15 is a schematic diagram of an off-screen display device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
In the description of embodiments of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or as implicitly indicating the number of technical features indicated, or other limitations.
Fig. 1 shows a hardware system suitable for the device of the application.
The apparatus 100 may be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, an in-vehicle electronic device, an augmented reality (augmented reality, AR) device, a Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (personal digital assistant, PDA), a projector, etc., and the embodiments of the present application do not limit the specific type of the apparatus 100.
The apparatus 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The configuration shown in fig. 1 does not constitute a specific limitation on the apparatus 100. In other embodiments of the application, the apparatus 100 may include more or fewer components than those shown in FIG. 1, or the apparatus 100 may include a combination of some of the components shown in FIG. 1, or the apparatus 100 may include sub-components of some of the components shown in FIG. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. For example, the processor 110 may include at least one of the following processing units: application processors (application processor, AP), modem processors, graphics processors (graphics processing unit, GPU), image signal processors (image signal processor, ISP), controllers, video codecs, digital signal processors (digital signal processor, DSP), baseband processors, neural-Network Processors (NPU). The different processing units may be separate devices or integrated devices.
The controller can generate operation control signals according to instruction operation codes and timing signals to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. For example, the processor 110 may include at least one of the following interfaces: inter-integrated circuit (inter-integrated circuit, I2C) interfaces, inter-integrated circuit audio (inter-integrated circuit sound, I2S) interfaces, pulse code modulation (pulse code modulation, PCM) interfaces, universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interfaces, mobile industry processor interfaces (mobile industry processor interface, MIPI), general-purpose input/output (GPIO) interfaces, SIM interfaces, and USB interfaces.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may couple the touch sensor 180K through an I2C interface, causing the processor 110 to communicate with the touch sensor 180K through an I2C bus interface, implementing the touch functionality of the device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display 194 and camera 193. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing function of apparatus 100. Processor 110 and display 194 communicate via a DSI interface to implement the display functions of apparatus 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal interface as well as a data signal interface. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, and the sensor module 180. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, or a MIPI interface.
The USB interface 130 is an interface conforming to the USB standard specification, and may be, for example, a Mini (Mini) USB interface, a Micro (Micro) USB interface, or a C-type USB (USB Type C) interface. The USB interface 130 may be used to connect a charger to charge the device 100, to transfer data between the device 100 and a peripheral device, and to connect a headset to play audio through the headset. USB interface 130 may also be used to connect other devices 100, such as AR equipment.
The connection relationships between the modules shown in fig. 1 are merely illustrative, and do not constitute a limitation on the connection relationships between the modules of the apparatus 100. Alternatively, the modules of the apparatus 100 may be combined by using a plurality of connection manners in the foregoing embodiments.
The charge management module 140 is used to receive power from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive the current of the wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive electromagnetic waves (current path shown in dashed lines) through the wireless charging coil of the device 100. The charging management module 140 may also provide power to the device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle times, and battery state of health (e.g., leakage, impedance). Alternatively, the power management module 141 may be provided in the processor 110, or the power management module 141 and the charge management module 140 may be provided in the same device.
The wireless communication function of the apparatus 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the apparatus 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication applied on the device 100, such as at least one of the following: a second generation (2nd generation, 2G) mobile communication solution, a third generation (3rd generation, 3G) mobile communication solution, a fourth generation (4th generation, 4G) mobile communication solution, and a fifth generation (5th generation, 5G) mobile communication solution. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering and amplifying the received electromagnetic waves, and then transmit the electromagnetic waves to a modem processor for demodulation. The mobile communication module 150 may further amplify the signal modulated by the modem processor, and the amplified signal is converted into electromagnetic waves by the antenna 1 and radiated. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through audio devices (e.g., speaker 170A, receiver 170B) or displays images or video through display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
Similar to the mobile communication module 150, the wireless communication module 160 may also provide wireless communication solutions applied on the device 100, such as at least one of the following: wireless local area networks (wireless local area networks, WLAN), bluetooth (BT), bluetooth low energy (bluetooth low energy, BLE), ultra Wide Band (UWB), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (IR) technologies. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate and amplify it, and convert the signal into electromagnetic waves to radiate via the antenna 2.
In some embodiments, antenna 1 of apparatus 100 is coupled to mobile communication module 150 and antenna 2 of apparatus 100 is coupled to wireless communication module 160 such that apparatus 100 may communicate with networks and other electronic devices via wireless communication techniques. The wireless communication technology may include at least one of the following communication technologies: global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, IR technologies. The GNSS may include at least one of the following positioning techniques: global satellite positioning system (global positioning system, GPS), global navigation satellite system (global navigation satellite system, GLONASS), beidou satellite navigation system (beidou navigation satellite system, BDS), quasi zenith satellite system (quasi-zenith satellite system, QZSS), satellite based augmentation system (satellite based augmentation systems, SBAS).
The device 100 may implement display functions through a GPU, a display screen 194, and an application processor. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 may be used to display images or video. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini light-emitting diode (Mini LED), a Micro light-emitting diode (Micro LED), a Micro OLED (Micro OLED), or a quantum dot LED (quantum dot light emitting diodes, QLED). In some embodiments, the apparatus 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The apparatus 100 may implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. The ISP can carry out algorithm optimization on noise, brightness and color of the image, and can optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into a standard Red Green Blue (RGB), YUV, etc. format image signal. In some embodiments, the apparatus 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the apparatus 100 selects a frequency bin, a digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The apparatus 100 may support one or more video codecs. In this way, the apparatus 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, and MPEG4.
The NPU is a processor modeled on the structure of biological neural networks; for example, it borrows the transmission mode between human brain neurons to rapidly process input information, and it can also continuously learn on its own. Intelligent-awareness and other functions of the device 100 may be implemented by the NPU, for example: image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external memory card, such as a Secure Digital (SD) card, to implement the memory capability of the expansion device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. Wherein the storage program area may store application programs required for at least one function (e.g., a sound playing function and an image playing function) of the operating system. The storage data area may store data (e.g., audio data and phonebooks) created during use of the device 100. Further, the internal memory 121 may include a high-speed random access memory, and may also include a nonvolatile memory such as: at least one disk storage device, a flash memory device, and a universal flash memory (universal flash storage, UFS), etc. The processor 110 performs various processing methods of the apparatus 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The device 100 may implement audio functions, such as music playing and recording, through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like.
The audio module 170 is used to convert digital audio information into an analog audio signal output, and may also be used to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a loudspeaker, is used to convert audio electrical signals into sound signals. The device 100 may be used to listen to music or make hands-free calls through the speaker 170A.
A receiver 170B, also referred to as an earpiece, converts the audio electrical signal into a sound signal. When a user uses the device 100 to answer a telephone call or voice message, the user can answer the voice by placing the receiver 170B close to the ear.
The microphone 170C, also referred to as a mic or mouthpiece, is used to convert sound signals into electrical signals. When making a call or sending voice information, a user may input a sound signal to the microphone 170C by speaking close to the microphone 170C. The apparatus 100 may be provided with at least one microphone 170C. In other embodiments, the apparatus 100 may be provided with two microphones 170C to implement a noise reduction function. In other embodiments, the apparatus 100 may be provided with three, four, or more microphones 170C to implement functions such as identifying the source of a sound and directional recording. The processor 110 may process the electrical signal output by the microphone 170C; for example, the audio module 170 and the wireless communication module 160 may be coupled through a PCM interface; after the microphone 170C converts the ambient sound into an electrical signal (such as a PCM signal), the electrical signal is transmitted to the processor 110 through the PCM interface; the processor 110 performs volume analysis and frequency analysis on the electrical signal to determine the volume and frequency of the ambient sound.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a cellular telecommunications industry association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A may be of various types, such as a resistive pressure sensor, an inductive pressure sensor, or a capacitive pressure sensor. The capacitive pressure sensor may be a device comprising at least two parallel plates with conductive material, and when a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the device 100 determines the strength of the pressure based on the change in capacitance. When a touch operation acts on the display screen 194, the apparatus 100 detects the touch operation according to the pressure sensor 180A. The device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location, but at different touch operation strengths, may correspond to different operation instructions. For example: executing an instruction for checking the short message when the touch operation with the touch operation intensity smaller than the first pressure threshold acts on the short message application icon; and executing the instruction of newly creating the short message when the touch operation with the touch operation intensity being larger than or equal to the first pressure threshold acts on the short message application icon.
The gyro sensor 180B may be used to determine a motion gesture of the apparatus 100. In some embodiments, the angular velocity of device 100 about three axes (i.e., the x-axis, the y-axis, and the z-axis) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the angle of the shake of the apparatus 100, calculates the distance to be compensated for by the lens module according to the angle, and allows the lens to counteract the shake of the apparatus 100 by the reverse motion, thereby realizing anti-shake. The gyro sensor 180B can also be used for scenes such as navigation and motion sensing games.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the device 100 calculates altitude from barometric pressure values measured by the barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. In some embodiments, when the device 100 is a flip phone, the device 100 may detect the opening and closing of its flip cover according to the magnetic sensor 180D. Based on the detected open or closed state of the holster or flip cover, the device 100 can enable features such as automatically unlocking when flipped open.
The acceleration sensor 180E can detect the magnitude of acceleration of the device 100 in various directions (typically the x-axis, y-axis, and z-axis). The magnitude and direction of gravity can be detected when the device 100 is stationary. The acceleration sensor 180E may also be used to recognize the gesture of the apparatus 100 as an input parameter for applications such as landscape switching and pedometer.
The distance sensor 180F is used to measure a distance. The device 100 may measure distance by infrared or laser. In some embodiments, for example, in a shooting scene, the apparatus 100 may range using the distance sensor 180F to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, for example, a photodiode. The LED may be an infrared LED. The device 100 emits infrared light outwards through the LED. The device 100 uses a photodiode to detect infrared reflected light from nearby objects. When reflected light is detected, the apparatus 100 may determine that an object is present nearby. When no reflected light is detected, the apparatus 100 may determine that there is no object nearby. The device 100 can use the proximity light sensor 180G to detect whether the user is holding the device 100 close to the ear for talking, so as to automatically extinguish the screen for power saving. The proximity light sensor 180G may also be used for automatic unlocking and automatic screen locking in holster mode or pocket mode.
The ambient light sensor 180L is used to sense ambient light level. The device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The device 100 can utilize the collected fingerprint characteristics to realize the functions of unlocking, accessing an application lock, photographing, answering an incoming call and the like.
The temperature sensor 180J is for detecting temperature. In some embodiments, the apparatus 100 performs a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by temperature sensor 180J exceeds a threshold, apparatus 100 performs a reduction in performance of a processor located in the vicinity of temperature sensor 180J in order to reduce power consumption to implement thermal protection. In other embodiments, when the temperature is below another threshold, the device 100 heats the battery 142 to avoid low temperatures causing the device 100 to shut down abnormally. In other embodiments, when the temperature is below a further threshold, the device 100 performs boosting of the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperatures.
The touch sensor 180K is also referred to as a touch panel. The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a touch-controlled screen. The touch sensor 180K is used for detecting a touch operation acting on or near it. The touch sensor 180K may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the device 100 at a location different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the bone mass that vibrates when a person speaks. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure pulse signal. In some embodiments, the bone conduction sensor 180M may also be provided in an earphone to form a bone conduction earphone. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass obtained by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse heart rate information based on the blood pressure pulse signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key and volume keys. The keys 190 may be mechanical keys or touch keys. The device 100 may receive a key input signal and implement the function associated with the key input signal.
The motor 191 may generate vibration. The motor 191 may be used for incoming call alerting as well as for touch feedback. The motor 191 may generate different vibration feedback effects for touch operations acting on different applications. The motor 191 may also produce different vibration feedback effects for touch operations acting on different areas of the display screen 194. Different application scenarios (e.g., time alert, receipt message, alarm clock, and game) may correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, which may be used to indicate a change in state of charge and charge, or may be used to indicate a message, missed call, and notification.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195 to make contact with the apparatus 100, or may be removed from the SIM card interface 195 to make separation from the apparatus 100. The device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The same SIM card interface 195 may simultaneously insert multiple cards, which may be of the same type or of different types. The SIM card interface 195 may also be compatible with external memory cards. The device 100 interacts with the network through the SIM card to perform functions such as talking and data communication. In some embodiments, the device 100 employs an embedded SIM (eSIM) card, which may be embedded in the device 100 and not separable from the device 100.
The hardware system of the apparatus 100 is described in detail above; the software system of the apparatus 100 is described below. The software system may employ a layered architecture, an event-driven architecture, a microkernel architecture, a micro-service architecture, or a cloud architecture; the embodiments of the present application take the layered architecture as an example to describe the software system of the apparatus 100.
As shown in fig. 2, the software system using the layered architecture is divided into several layers, each of which has a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the software system may be divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime (Android runtime) and system libraries, and the kernel layer.
The application layer may include camera, gallery, calendar, conversation, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer may include some predefined functions.
For example, the application framework layer includes a window manager, a content provider, a view system, a telephony manager, a resource manager, and a notification manager.
The window manager is used for managing window programs. The window manager may obtain the size of the display screen, determine whether there is a status bar, lock the screen, and capture the screen.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, and phonebooks.
The view system includes visual controls, such as controls to display text and controls to display pictures. The view system may be used to build applications. The display interface may be composed of one or more views, for example, a display interface including a text notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide communication functions of the device 100, such as management of call status (on or off).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, and video files.
The notification manager enables an application to display notification information in the status bar; it can be used to convey notification-type messages that automatically disappear after a short stay without user interaction. For example, the notification manager is used for download-completion notifications and message alerts. The notification manager may also manage notifications that appear in the system top status bar in the form of charts or scroll-bar text, such as notifications of applications running in the background, as well as notifications that appear on the screen in the form of dialog windows, for example prompting a text message in the status bar, sounding a prompt tone, vibrating the electronic device, and flashing the indicator light.
The Android runtime includes a core library and a virtual machine, and is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part comprises the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing functions such as management of object life cycle, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules, such as: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., open graphics library (open graphics library for embedded systems, openGL ES) for embedded systems) and 2D graphics engines (e.g., skia graphics library (skia graphics library, SGL)).
The surface manager is used to manage the display subsystem and provides a fusion of the 2D and 3D layers for the plurality of applications.
The media library supports playback and recording of multiple audio formats, playback and recording of multiple video formats, and still image files. The media library may support a variety of audio video coding formats such as MPEG4, h.264, moving picture experts group audio layer 3 (moving picture experts group audio layer III, MP 3), advanced audio coding (advanced audio coding, AAC), adaptive multi-rate (AMR), joint picture experts group (joint photographic experts group, JPG), and portable network graphics (portable network graphics, PNG).
Three-dimensional graphics processing libraries may be used to implement three-dimensional graphics drawing, image rendering, compositing, and layer processing.
The two-dimensional graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer may include a display driver, a camera driver, an audio driver, a sensor driver, and the like.
The workflow of the software system and hardware system of the apparatus 100 is illustrated in connection with displaying a photo scene.
When a user performs a touch operation on the touch sensor 180K, a corresponding hardware interrupt is sent to the kernel layer, which processes the touch operation into a raw input event, for example, information including touch coordinates and a time stamp of the touch operation. The original input event is stored in the kernel layer, and the application framework layer acquires the original input event from the kernel layer, identifies a control corresponding to the original input event, and notifies an Application (APP) corresponding to the control. For example, the touch operation is a click operation, the APP corresponding to the control is a camera APP, and after the camera APP is awakened by the click operation, the camera APP may call the camera driver of the kernel layer through the API, and the camera driver controls the camera 193 to shoot.
Off-screen display means that after the screen of the electronic device is turned off, a part of the screen area can still be lit up to display information such as a clock, a date, and notifications, which is convenient for the user and improves user experience. Currently, if a user needs to view information on an electronic device in a specific scene, the user must unlock the device and enter a specific application (App) to obtain the required information. For example, when a user is running and wants to acquire exercise information, the user needs to unlock the electronic device and obtain step number information, calorie consumption information, or other exercise information from an exercise App. However, while running it is inconvenient for the user to perform a series of operations such as unlocking the device and entering an exercise App, resulting in a poor user experience.
Therefore, an embodiment of the present application provides a method for off-screen display, in which a user can select a target theme for the off-screen state according to his or her requirements, and the target theme corresponds to a target AOD resource package. The target AOD resource package includes a target application package (application package, APK), and the target APK can be used to acquire and process the user's data. When the screen of the electronic device is turned off, the data in the target APK can be obtained and displayed on the display screen, so that the user can obtain the required information without lighting up and unlocking the electronic device, thereby improving user experience.
The method for off-screen display provided by the present application will be described in detail with reference to fig. 3 to 13.
Fig. 3 is a schematic diagram of an architecture of a method for off-screen display according to an embodiment of the present application. As shown in fig. 3, the hardware may include a power key 201 and a display screen 202. The power key 201 is used for connecting or disconnecting the power of the terminal device; when the terminal device detects an operation of the user clicking the power key 201, it may start the off-screen display service 203 in response to that operation. The display screen 202 may be used to display an interface of the terminal device. The off-screen display service 203 is used to display information to the user on the display screen when the terminal device is in an off-screen scene. The extensible markup language (Extensible Markup Language, XML) parsing tool 204 is used to obtain an XML file and read it into memory; for example, as shown in fig. 3, the XML parsing tool may parse the application package (application package, APK) calling method in the scene resource description file and send the acquired calling method to the off-screen display service 203, which can then call the APK so that the APK acquires and processes data. The user interface (user interface, UI) tool service 205 is used to render the data transmitted by the APK, and the data display service 206 is used to display that data.
In one example, when the terminal device detects an operation of the user clicking the power key 201, the off-screen display service 203 runs. The off-screen display service 203 may send a parsing instruction to the XML parsing tool 204, instructing it to parse the information in the current AOD resource package. The XML parsing tool 204 parses the APK calling method from the scene description file in the AOD resource package and sends it to the off-screen display service 203. After acquiring the APK calling method, the off-screen display service 203 can call the APK to process data; the APK processes the acquired user data and, through the UI tool service 205, renders and displays it with the data display service 206. For example, the APK may start running after the user completes the scene setting for the off-screen display, acquiring and processing user data in the background.
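As a rough illustration of what the XML parsing tool might do, the following sketch uses Android's standard XmlPullParser to read the engine calling method out of a scene description file; the tag names follow the description.xml format described later in this document, and the method name and file handling are assumptions, not the actual implementation:

import android.util.Xml;
import java.io.File;
import java.io.FileInputStream;
import org.xmlpull.v1.XmlPullParser;

// Hypothetical sketch: extract <packagename>, <classname>, and <method> from description.xml.
static String[] parseCallingMethod(File descriptionFile) throws Exception {
    XmlPullParser parser = Xml.newPullParser();
    parser.setInput(new FileInputStream(descriptionFile), "UTF-8");
    String packageName = null, className = null, methodName = null;
    for (int event = parser.getEventType();
            event != XmlPullParser.END_DOCUMENT;
            event = parser.next()) {
        if (event == XmlPullParser.START_TAG) {
            String tag = parser.getName();
            if ("packagename".equals(tag)) packageName = parser.nextText();
            else if ("classname".equals(tag)) className = parser.nextText();
            else if ("method".equals(tag)) methodName = parser.nextText();
        }
    }
    return new String[] {packageName, className, methodName};
}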
For example, taking a running scene as the off-screen scene of the terminal device, the APK is used to obtain data, and the data processing performed after the APK obtains the data may be as shown in fig. 6.
Implementation 1: the user has a wearable device, and the wearable device includes a locator (e.g., a global positioning system).
For example, the terminal device can acquire heart rate data, step number information and position information of the user through the wearable device; the heart rate data, the step number information and the position information can be transmitted to the exercise health application program, and the APK can acquire the data of the user through the exercise health application program.
In one possible implementation, the APK may obtain the heart rate data, step number information, and position information from the wearable device directly, without going through the exercise health application.
Implementation 2: the user has a wearable device, but the wearable device does not include a global positioning system.
For example, the terminal device may obtain heart rate data and step number information of the user through the wearable device, and may obtain location information of the user through a locator (for example, a global positioning system) included in the terminal device; the heart rate data, the step number information and the position information can be transmitted to the exercise health application program, and the APK can acquire the data of the user through the exercise health application program.
In one possible implementation, the APK may obtain the heart rate data and step number information from the wearable device directly, without going through the exercise health application.
Implementation 3: the user does not have a wearable device.
For example, the location information of the user may be obtained through a locator (e.g., a global positioning system) included in the terminal device; the AOD resource package may include a pedometer algorithm, and the step number information of the user can be determined from the user's location information; the location information and step number information are then sent to the APK. It should be understood that the running scene is used here only as an example to describe how the APK acquires data; the present application is not limited in this respect.
It should also be understood that, in the embodiment of the present application, an APK is newly added in the AOD resource package, together with an interaction flow between the off-screen display service and the APK. As shown in fig. 3, the off-screen display service 203 may invoke the APK; the APK may pass the processed data results back to the off-screen display service 203; and the off-screen display service 203, through the UI tool service 205, renders and displays the acquired data results with the data display service 206. In the embodiment of the application, one off-screen scene may correspond to one APK. The user can select a target theme according to his or her own needs, and the electronic device displays the interface corresponding to the target theme when the screen is off, so that the user can obtain the required information without lighting up and unlocking the electronic device.
Fig. 4 is a schematic diagram of a method for off-screen display according to an embodiment of the present application. The method 300 shown in fig. 4 may include steps S310 to S350, which are described in detail below, respectively.
Step S310, displaying an off-screen scene list (an example of a display theme list).
For example, a theme list may be displayed; the theme list is used by the user to determine the target theme of the electronic device's off-screen display, and the target theme corresponds to a target AOD resource package. The theme of the off-screen display may also be referred to as an off-screen scene.
In one example, the electronic device may preset off-screen scenes and manage their classification. The off-screen scenes may include scene classifications such as a sports classification, a travel classification, and other classifications. The sports classification may include running, mountain climbing, swimming, elliptical machine, and the like; the travel classification may include weather, navigation, or driving scenes.
For example, an off-screen scene may be displayed on a screen of an electronic device as shown in fig. 7 to 10.
It should be understood that the off-screen scenes can be set according to the user's needs; the off-screen scenes above are merely examples, and the present application is not limited thereto.
In one possible implementation, as shown in fig. 7, the user may click on the off-screen display setting option, and the electronic device initiates the off-screen display setting option in response to the user's click operation.
In one example, the user may also start the off-screen scene setting option via a voice instruction. For example, the user triggers the voice function by uttering a preset voice; after the voice function is triggered, the user can issue a voice instruction such as "off-screen scene setting", thereby starting the off-screen scene setting option in the electronic device.
Step S320, selecting a target scene (an example of a target theme).
It should be understood that in embodiments of the present application, a target scene may refer to a target theme of an off-screen display.
For example, the user may select a target scene from the off-screen scene list according to his or her own needs; the target scene is the off-screen scene that the user currently requires.
It should be noted that, through the target scene, the user can obtain the information required currently under the condition that the electronic device is off-screen.
In one possible implementation, in response to the user's click operation, the electronic device may display the interface shown in fig. 8 after starting the off-screen display setting option; the interface displays the classifications of the off-screen scenes, for example a sports classification and a travel classification. If the user wants to select running as the target scene, the user can click the sports classification on the off-screen scene interface; in response to that click, the electronic device may display the interface shown in fig. 9, where the user clicks the running option to set running as the target scene of the electronic device's off-screen display.
In one possible implementation, in response to the user's click operation, the electronic device may display a preview off-screen display interface as shown in fig. 10 after starting the off-screen display setting option. The interface shown in fig. 10 includes the classifications of the off-screen scenes and the scenes in each classification; for example, the sports classification includes running and mountain climbing, and the travel classification includes weather and navigation. The user may directly click on the interface shown in fig. 10 to set running as the target scene of the electronic device's off-screen display.
It should be understood that sports and travel classifications are used above only as examples, and the off-screen scenes may include other classifications; likewise, running and mountain climbing merely exemplify scenes in the sports classification, and weather and navigation merely exemplify scenes in the travel classification. The sports classification and the travel classification may include other scenes, which the present application does not limit in any way.
In one example, the user may also select the target scene of the electronic device's off-screen display via a voice instruction. For example, the user triggers the voice function by uttering a preset voice; after the voice function is triggered, the user can issue a voice instruction such as "set the running scene as the off-screen scene", thereby selecting the target scene. The present application does not limit the preset voice.
Step S330, matching the engine application package of the target scene (an example of the target application package).
For example, the off-screen display controller may match the desired engine according to the target scene selected by the user.
It should be appreciated that different target scenes may correspond to different engine packages; an engine package is used to process the data corresponding to its target scene, for example by calling the data in different scenes through a data interface and performing data processing.
It should also be appreciated that an engine corresponds to an engine application package (application package, APK), i.e., an engine APK. For application code to run on a device, it must be compiled and packaged into a file that the system can identify; APK is the file format that the system can identify and run.
In one possible implementation, if the target scene selected by the user needs to be set with parameters, the engine APK may further guide the user to perform relevant parameter configuration of the target scene.
For example, if the user selected target scene is running, the engine may direct the user to set target parameters.
As shown in fig. 11, a user may set a target distance in an interface of a running scene off-screen display; for example, the user may input 10 km, and set the target distance of the running to 10 km.
In one example, if the target scene selected by the user is weather, the engine APK may guide the user to configure the city name of the travel city.
In one example, if the target scene selected by the user is navigation, the engine APK may guide the user to configure the start position and the end position of navigation.
Step S340, data processing of the target scene.
Illustratively, after the user configures the relevant parameters of the target scene, the engine APK obtains the relevant data of the target scene and performs data processing. The specific flow of data processing can be seen in the following fig. 5.
Step S350, performing off-screen display of the target scene.
For example, the engine APK may process related data of the user in the current target scene, render the graphics, and display the processed data off-screen.
In one possible implementation, when the user selects a target scene, the off-screen display controller may invoke the scene resource package of the target scene (the scene resource package is an example of a target AOD resource package, and the target scene an example of a target theme). The scene resource package may include a scene description file, a scene preview file, and an engine APK (an example of a target application package). The scene description file is used to describe the name and classification of the target scene and to call the engine application package file corresponding to the target scene; the scene preview file is used to display the off-screen preview interface corresponding to the target scene; and the engine application package is used to guide the user in setting target parameters for the target scene and to process the user's data in that scene.
For example, when running is selected as the target scene, the scene resource package corresponding to running can be called; the engine APK of the running scene can then be called through the description file in that package, so that the user's running data is processed and graphics are rendered in the off-screen scene to perform the off-screen display of the running scene.
In one example, when the user selects outdoor running as the target scene, the data of the user's current outdoor run is acquired and processed, and the relevant data is displayed off-screen on the user's electronic device. For example, as shown in fig. 12, in the running scene the electronic device may display on the off screen contents including: target (distance), current (distance), heart rate, advice, exercise, calories, average pace, average heart rate, number of steps, cumulative climb, and the like.
Illustratively, in the embodiment of the present application, one target scene may correspond to one scene resource package. The scene resource package may be an archive file whose suffix format is HNA (Honor Always on Display, HNA) and whose compression rate is zero (i.e., stored without compression); it may include a scene description file (description.xml), a scene preview image file (preview.jpg), and the engine application package (application package, APK) file corresponding to the scene.
By way of example, the scene resource package corresponding to a running scene may include a scene description file (description.xml), a scene preview file (preview.jpg), and a running engine file (running.apk).
For example, a scene description (description.xml) file may include content of the following form:
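The XML body itself is not reproduced in this text; the following is a minimal reconstruction assembled from the ten tags explained below, where the root element and all concrete values are hypothetical:

<scene>
    <title>Running</title>
    <title-cn>跑步</title-cn>
    <category>Sports</category>
    <persistence>true</persistence>
    <author>example-author</author>
    <designer>example-designer</designer>
    <version>1.0</version>
    <packagename>com.example.engine.running</packagename>
    <classname>com.example.engine.running.RunningEngine</classname>
    <method>startEngine</method>
</scene>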
The file may contain the following 10 tags: <title> represents the scene name; <title-cn> represents the Chinese scene name; <category> represents the scene classification, used to classify and display the scene on the off-screen scene page; <persistence> identifies whether the engine APK needs to run continuously while the screen is on — true means it runs continuously, i.e., the engine APK is running in both the off-screen and on-screen states, while false means it does not, i.e., the engine APK must be stopped when the screen is on; <author> represents the author of the scene; <designer> represents the scene designer; <version> represents the version number; <packagename> represents the engine package name; <classname> represents the engine class name; and <method> represents the engine method.
For example, if the off-screen target scene selected by the user is weather, the engine APK corresponding to the weather scene does not need to keep running when the screen is on; it can be cleared and restarted the next time the screen is turned off, in which case <persistence> may be set to false. If the off-screen target scene selected by the user is running, the engine APK corresponding to the running scene needs to keep running while the screen is on, so as to keep counting the mileage the user covers; in that case <persistence> may be set to true.
It should be appreciated that the above example describes the content included in the scene description file; the three tags for the author, the scene designer, and the version number are optional and may be omitted from the scene description file.
Illustratively, the scene preview file may be in jpg format; or other image formats may be employed.
Illustratively, the running engine application package (running.apk) file may be used for data processing and display rendering: the off-screen display calls the engine to process the data, and the data processed by the engine is then displayed.
For example, the off-screen display may call the engine using a method of the following form:
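The calling code is likewise not reproduced here; the following is a minimal, hypothetical sketch of how the off-screen display service might invoke the engine by reflection from the three values in description.xml (the no-argument method signature and the variable names are assumptions):

import android.content.Context;
import java.lang.reflect.Method;

// Hypothetical sketch: call the engine identified by <packagename>, <classname>, <method>.
static void callEngine(Context context, String packageName, String className,
        String methodName) throws Exception {
    // Create a context for the engine APK so that its code can be loaded.
    Context engineContext = context.createPackageContext(packageName,
            Context.CONTEXT_INCLUDE_CODE | Context.CONTEXT_IGNORE_SECURITY);
    // Assumes <classname> holds the fully qualified class name.
    Class<?> engineClass = engineContext.getClassLoader().loadClass(className);
    Object engine = engineClass.getDeclaredConstructor().newInstance();
    Method method = engineClass.getMethod(methodName);
    method.invoke(engine);
}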
The engine package name in the calling method corresponds to <packagename> in the scene description (description.xml) file, the engine class name corresponds to <classname>, and the engine method corresponds to <method>.
In one possible implementation, the electronic device may use not only the preset off-screen scenes but also off-screen display scene resource packages designed by third-party designers. Such a scene resource package must contain at least the scene description file (description.xml), the scene preview image (preview.jpg) file, and the corresponding engine APK file mentioned above; on that basis, the elements in the scene resource package can be extended, so that the engine APK provided in the package can properly process the data corresponding to the scene.
In other words, the scene resource package of an off-screen scene in the electronic device may be data configured at the factory, or it may be obtained from an open database. In the latter case, the data must be unified according to the standard format of the scene resource package, i.e., the package must include at least the scene description file, the scene preview image file, and the engine APK file corresponding to the scene.
It should be understood that in the embodiment of the present application, the data format in the scene resource package is normalized; however, the manner of acquiring the data in the scene resource package is not limited in any way.
In the embodiment of the present application, when the second engine APK starts to operate, the first engine APK needs to be stopped.
For example, suppose that while the screen is on, the user sets the off-screen scene to running; after the screen is turned off, the display interface of the running scene is shown. The user then wakes the screen again and sets the off-screen scene to swimming; when the electronic device next performs the off-screen display, the engine APK of the running scene is stopped and the engine APK of the swimming scene is started. In addition, the running program updates the bound engine package name included in the scene resource package from the running engine to the swimming engine.
It should also be understood that, in general, the engine APK does not run while the screen of the electronic device is on, and is started after each screen-off. When the engine APK runs, the current program is compared with the program that ran last time; if the program has changed, the engine APK stops the previous program and starts the current one.
Illustratively, the currently running engine APK needs to be stopped each time before the off-screen display calls an engine APK; the stopping method may be as follows:
// Obtain the system ActivityManager and kill the background process of the engine APK
// identified by its package name (requires the KILL_BACKGROUND_PROCESSES permission):
ActivityManager manager = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
manager.killBackgroundProcesses(currentEngineName);
for example, fig. 5 is a schematic diagram of a method for calling an engine application package according to an embodiment of the present application. The method 400 shown in fig. 5 may include steps S410 to S460, which are described in detail below, respectively.
Step S410, stopping the first engine APK (an example of the second APK).
Illustratively, the target scene selected by the user at a historical moment was running, and according to the user's current needs, weather is selected as the target scene at the current moment; it is therefore necessary to stop the first engine application package corresponding to running.
Step S420, obtains the package name of the second engine APK (an example of the first APK).
Step S430, obtaining the engine class name of the second engine APK.
Step S440, obtaining an engine method of the second engine APK.
It should also be understood that, when the target scene is switched from a first scene to a second scene, the first engine APK corresponding to the first scene must be stopped and the second engine APK corresponding to the second scene started; acquiring the engine package name, engine class name, and engine method of the second engine APK ensures that the second engine APK can be run.
Step S450, starting the engine processor.
Illustratively, the second engine APK may be started by the engine processor after obtaining the engine package name, the engine class name, and the engine method of the second engine APK.
Step S460, updating the package name of the bound engine APK.
For example, after the first engine APK stops and the second engine APK starts, the running program updates the package name of the engine APK included in the scene resource package from the name of the first engine APK to the name of the second engine APK.
For example, if the target scene at the historical moment is running and the target scene at the current moment is swimming, the engine file in the scene resource package may be updated from the running application package (running.apk) to the swimming application package (swimming.apk).
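Putting steps S410 to S460 together, a minimal sketch of the switch flow might look as follows; parseCallingMethod and callEngine are the illustrative helpers sketched earlier, and the binding update in S460 is shown only as a comment because the patent does not specify its mechanism:

import android.app.ActivityManager;
import android.content.Context;
import java.io.File;

// Hypothetical sketch of switching from the old engine APK to the new one (S410–S460).
static void switchEngine(Context context, String oldEnginePackage, File newDescriptionXml)
        throws Exception {
    // S410: stop the currently running (first) engine APK.
    ActivityManager manager =
            (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
    manager.killBackgroundProcesses(oldEnginePackage);
    // S420–S440: read the new engine's package name, class name, and method.
    String[] call = parseCallingMethod(newDescriptionXml);
    // S450: start the new (second) engine via the engine processor.
    callEngine(context, call[0], call[1], call[2]);
    // S460: update the bound engine package name, e.g. currentEngineName = call[0];
}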
In one example, a user selects a first scene as the target scene while the screen of the electronic device is on. When the screen is off, the scene resource package corresponding to the first scene is called, and the user's data in the first scene is processed and displayed off-screen. When the user wakes the screen of the electronic device, the scene resource package corresponding to the first scene determines whether to stop running the engine APK according to the value of <persistence> in description.xml; thus, when the screen is turned off again in the first scene, the electronic device can continue processing the data of the first scene and display it off-screen again.
For example, a user selects running as the target scene while the screen is on. When the electronic device turns its screen off for the first time and the user has run 3 km, the off-screen display shows a current distance of 3 km. The user may wake the screen while still running; if the user has run 4 km by the time the screen turns off a second time, the display interface shows a current distance of 4 km. In this case the engine APK must stay in the running state while the screen is on in order to track the mileage, so <persistence> in description.xml needs to be set to true.
It should be understood that, in general, the engine APK operates while the screen of the electronic device is off; but for scenes that require continuous operation, the engine APK may be configured to keep running while the screen is on and to buffer the corresponding data. For example, the off-screen display interface corresponding to the running scene can be shown while the user is running; if the user briefly lights up the screen and keeps running, the engine APK can continue processing the user's running data instead of restarting the calculation when the screen turns off again.
According to the method for off-screen display provided in this embodiment, the user can select the target scene of the off-screen display as required, and the electronic device performs the off-screen display according to the selected target scene when its screen is off, so that the user can obtain the required information without lighting up and unlocking the electronic device, which can improve user experience.
The method for off-screen display according to the embodiment of the present application will be described in detail below with reference to fig. 6, taking running as an example of the target scene.
Fig. 6 is a schematic diagram of a method for off-screen display in a running scene according to an embodiment of the present application. The method 500 shown in fig. 6 may include steps S501 to S511, which are described in detail below, respectively.
Step S501, the user selects a running scene as a target scene (one example of a target theme).
For example, the user may select running as the off-screen scene on the settings interface of the electronic device while the screen is on.
For example, fig. 7 shows the settings interface of the electronic device, where a plurality of setting options may be displayed, such as wireless local area network, off-screen display, Bluetooth, battery, and the like.
It should be understood that the display of the settings interface may include other more settings options, as the application is not limited in this regard.
As shown in fig. 7, the user may click on the off-screen display setting option, and in response to the user's click operation the electronic device starts the off-screen display setting option and may then display the interface shown in fig. 8. Scene classifications of the off-screen scenes can be displayed in this interface; for example, the classifications may include a sports classification, a travel classification, and other classifications. The user may click the sports option in the interface of fig. 8; in response to the user's click operation, the interface shown in fig. 9 may be displayed, where the sports classification may include running, mountain climbing, and other scenes. The user may click the running option in the interface of fig. 9, setting running as the target scene.
Step S502, matching the engine APK of the running scene (an example of the target APK).
For example, the off-screen display controller may automatically match a desired engine according to a target scene selected by a user.
For example, when the user selects running as the target scene, the off-screen display controller may call the scene resource package corresponding to the running scene; the scene resource package may include a scene description file, a scene preview image file, and the engine application package file corresponding to the target scene. The scene description file is used to describe the name and classification of the running scene and to call the engine application package file corresponding to the running scene; the scene preview image file is used to display the off-screen preview interface corresponding to the running scene; and the engine application package is used to guide the user in setting target parameters for the running scene and to process the user's data in the running scene. By calling the scene resource package of the running scene, the engine APK corresponding to the running scene can be matched.
Step S503, the user sets the target distance.
Illustratively, the user may set a target distance in an interface of an off-screen display in a running scene as shown in fig. 11; for example, the user may input 10 km and set the target distance of the running to 10 km.
Step S504, application preview.
Illustratively, after setting the target distance as shown in fig. 11, the user may click the "apply" option; in response to the user's click operation, the electronic device may generate the preview display interface shown in fig. 12.
Step S505, heart rate acquisition.
For example, the electronic device may obtain heart rate data of the user in the running scenario by connecting with the wearable device.
It should be understood that step S505 described above is an optional step; in the case where the electronic device is not connected to the wearable device, heart rate acquisition may not be performed.
Step S506, pacing instruction.
In one example, the user may be prompted to speed up or slow down by comparing the user's current heart rate with a medically recommended running heart rate.
In one example, if heart rate data of the user cannot be obtained, no pacing instruction may be made.
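A minimal sketch of such a pacing rule is shown below; the 120–160 bpm band is an arbitrary placeholder, not a medical recommendation or a value from this filing:

// Hypothetical pacing sketch: compare the current heart rate with an assumed target band.
static String paceInstruction(Integer heartRate) {
    if (heartRate == null) {
        return null;  // no heart rate data available: give no pacing instruction
    }
    int low = 120, high = 160;  // assumed target running heart-rate band (bpm)
    if (heartRate < low)  return "speed up";
    if (heartRate > high) return "slow down";
    return "keep pace";
}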
Step S507, position acquisition.
The position acquisition may be performed, for example, by a positioning element in the electronic device.
Step S508, mileage calculation.
In one example, the running engine may obtain the current location through the system interface and perform an integral calculation to obtain the distance.
ds = sqrt[(X(i) − X(i−1))² + (Y(i) − Y(i−1))² + (Z(i) − Z(i−1))²];
s = s + ds;
where sqrt denotes the square-root function; (X(i), Y(i), Z(i)) denotes the coordinates at time i; ds denotes the distance from time i−1 to time i, the time difference between the two moments being less than 1 second; and s denotes the distance of the current movement.
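To make the recurrence concrete, the following is a minimal sketch of the incremental mileage calculation in Java; the coordinate source, units (meters), and method names are assumptions for illustration, not part of the patent:

// Hypothetical sketch of the mileage integration described above.
// Assumes Cartesian coordinates in meters, sampled at most one second apart.
static double s = 0.0;          // distance of the current movement
static double[] prev = null;    // coordinates (X, Y, Z) at time i-1

static void onPositionSample(double x, double y, double z) {
    if (prev != null) {
        double dx = x - prev[0], dy = y - prev[1], dz = z - prev[2];
        double ds = Math.sqrt(dx * dx + dy * dy + dz * dz);  // distance from i-1 to i
        s = s + ds;                                          // accumulate total distance
    }
    prev = new double[] {x, y, z};
}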
Step S509, calculating the target achievement condition.
For example, the target achievement situation may include the remaining amount toward the target or the completion percentage.
Step S510, energy consumption calculation.
For example, the energy expenditure may be computationally determined using a calorie calculation formula.
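The patent does not specify which calorie formula is used. As one hedged illustration, a commonly used approximation multiplies a MET (metabolic equivalent) value by body weight and duration; the MET value and method signature below are assumptions, not taken from this filing:

// Hypothetical sketch using the common MET-based approximation:
// calories (kcal) ≈ MET × body weight (kg) × duration (h), since 1 MET ≈ 1 kcal/kg/h.
static double caloriesBurned(double met, double weightKg, double durationHours) {
    return met * weightKg * durationHours;  // e.g., running is often rated around 9.8 MET
}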
Step S511, graphic rendering.
For example, after the data processing is completed, the engine application package may perform graphics rendering according to the processed data and perform off-screen display.
In one possible implementation, the target scene selected by the user may be a weather scene, and the engine APK may access a weather APK to query the weather state in real time and display it off-screen. For example, a snowing animation can be played in the off-screen display in snowy weather, and a raining animation in rainy weather; temperature information, pollution indexes, travel advice, and the like can also be shown in the off-screen display.
In one possible implementation manner, the target scene selected by the user may be a navigation scene, and the navigation engine APK may perform off-screen display on the real-time navigation data; for example, turn right after 100 meters as shown in fig. 13.
According to the method for off-screen display provided in this embodiment, the user can select the target scene to be displayed during the off-screen state as required, and the electronic device performs the off-screen display according to the selected target scene when its screen is off, so that the user can obtain the required information without lighting up and unlocking the electronic device, which can improve user experience.
The method for off-screen display provided by the embodiment of the application is described in detail above with reference to fig. 1 to 13; an embodiment of the device of the present application will be described in detail with reference to fig. 14 and 15. It should be understood that the apparatus in the embodiments of the present application may perform the methods of the foregoing embodiments of the present application, that is, specific working procedures of the following various products may refer to corresponding procedures in the foregoing method embodiments.
Fig. 14 is a schematic structural diagram of a device for off-screen display according to the present application. The device 600 has a display screen, the device 600 comprising an acquisition unit 610 and a processing unit 620.
The acquiring unit 610 is configured to acquire a first instruction of a user, where the first instruction is used to indicate a target off-screen display AOD resource package, the target AOD resource package includes a target application package APK, and the target APK is used to process the user's data. The processing unit 620 is configured to invoke the target APK according to the target AOD resource package, and, when the screen of the electronic device is off, to acquire the data in the target APK and perform off-screen display on the display screen.
Optionally, as an embodiment, the target AOD resource package includes a scene description file, where the scene description file includes a calling method of the target APK; the processing unit 620 is specifically configured to parse the scene description file to obtain the calling method of the target APK, and to call the target APK according to that calling method.
Optionally, as an embodiment, the target APK is further configured to obtain data of the user through a wearable device of the user.
Optionally, as an embodiment, the target APK is further configured to instruct the user to configure a target parameter of the off-screen display of the electronic device.
Optionally, as an embodiment, the target APK is a first APK, and the processing unit 620 is further configured to:
stop calling a second APK, where the second APK is different from the first APK.
Optionally, as an embodiment, the target AOD resource package further includes a preview image, where the preview image is used to display a preview interface of the electronic device off-screen.
Optionally, as an embodiment, the processing unit 620 is further configured to:
display a theme list, where the theme list is used by the user to determine the target theme of the electronic device's off-screen display, and the target theme corresponds to the target AOD resource package.
Optionally, as an embodiment, the theme list includes a theme classification, where the theme classification is used to indicate the category corresponding to each theme in the theme list.
The above-described apparatus 600 is embodied in the form of a functional unit. The term "unit" herein may be implemented in software and/or hardware, without specific limitation.
For example, a "unit" may be a software program, a hardware circuit or a combination of both that implements the functions described above. The hardware circuitry may include application specific integrated circuits (application specific integrated circuit, ASICs), electronic circuits, processors (e.g., shared, proprietary, or group processors, etc.) and memory for executing one or more software or firmware programs, merged logic circuits, and/or other suitable components that support the described functions.
Thus, the elements of the examples described in the embodiments of the present application can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Fig. 15 shows a schematic structural diagram of an electronic device provided by the present application. The dashed line in fig. 15 indicates that the unit or the module is optional. The electronic device 700 may be used to implement the methods described in the method embodiments described above.
The electronic device 700 includes one or more processors 701, and the one or more processors 701 may support the electronic device 700 in implementing the methods in the method embodiments. The processor 701 may be a general-purpose processor or a special-purpose processor. For example, the processor 701 may be a central processing unit (central processing unit, CPU), digital signal processor (digital signal processor, DSP), application specific integrated circuit (application specific integrated circuit, ASIC), field programmable gate array (field programmable gate array, FPGA), or other programmable logic device such as discrete gates, transistor logic, or discrete hardware components.
The processor 701 may be used to control the electronic device 700, execute a software program, and process data of the software program. The electronic device 700 may further comprise a communication unit 705 for enabling input (reception) and output (transmission) of signals.
For example, the electronic device 700 may be a terminal device, the communication unit 705 may be a transceiver of the terminal device, or the communication unit 705 may be a transceiver circuit of the terminal device.
The electronic device 700 may include one or more memories 702 having a program 704 stored thereon, the program 704 being executable by the processor 701 to generate instructions 703 such that the processor 701 performs the methods described in the method embodiments above in accordance with the instructions 703.
Optionally, the memory 702 may also have data stored therein. Alternatively, processor 701 may also read data stored in memory 702, which may be stored at the same memory address as program 704, or which may be stored at a different memory address than program 704.
Alternatively, the processor 701 and the memory 702 may be provided separately or may be integrated together; for example, integrated on a System On Chip (SOC) of the terminal device.
Illustratively, the memory 702 may be used to store the related program 704 of the method for off-screen display provided in the embodiment of the present application, and the processor 701 may be used to invoke that program when the terminal device performs off-screen display, so as to execute the method of the embodiment of the present application: acquiring a first instruction of a user, where the first instruction is used to indicate a target off-screen display AOD resource package, the target AOD resource package includes a target application package APK, and the target APK is used to process the user's data; calling the target APK according to the target AOD resource package; and, when the screen of the electronic device is off, acquiring the data in the target APK and performing off-screen display on the display screen.
The application also provides a computer program product which, when executed by a processor 701, implements the method according to any of the method embodiments of the application.
The computer program product may be stored in the memory 702, for example, the program 704, and the program 704 is finally converted into an executable object file capable of being executed by the processor 701 through preprocessing, compiling, assembling, and linking.
The application also provides a computer readable storage medium having stored thereon a computer program which when executed by a computer implements the method according to any of the method embodiments of the application. The computer program may be a high-level language program or an executable object program.
Optionally, the computer readable storage medium is, for example, memory 702. The memory 702 may be volatile memory or nonvolatile memory, or the memory 702 may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an electrically Erasable EPROM (EEPROM), or a flash memory. The volatile memory may be random access memory (random access memory, RAM) which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous DRAM (SLDRAM), and direct memory bus RAM (DR RAM).
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working processes and technical effects of the apparatus and device described above may refer to corresponding processes and technical effects in the foregoing method embodiments, which are not described in detail herein.
In the several embodiments provided by the present application, the disclosed systems, devices, and methods may be implemented in other manners. For example, some features of the method embodiments described above may be omitted, or not performed. The above-described apparatus embodiments are merely illustrative, the division of units is merely a logical function division, and there may be additional divisions in actual implementation, and multiple units or components may be combined or integrated into another system. In addition, the coupling between the elements or the coupling between the elements may be direct or indirect, including electrical, mechanical, or other forms of connection.
It should be understood that, in various embodiments of the present application, the size of the sequence number of each process does not mean that the execution sequence of each process should be determined by its functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
In addition, the terms "system" and "network" are often used interchangeably herein. The term "and/or" herein is merely one association relationship describing the associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship.
In summary, the foregoing description is only a preferred embodiment of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (8)

1. A method of off-screen display, the method being applied to an electronic device having a display screen, comprising:
acquiring a first instruction of a user, wherein the first instruction is used for indicating a target off-screen display AOD resource package, the target off-screen display AOD resource package comprises a target application package APK and a scene description file, the target application package APK is used for processing data of the user, the scene description file comprises a calling method of the target application package APK, the scene description file corresponds to a target scene, and if the target scene needs parameter configuration, the target application package APK is also used for guiding the user to configure target parameters of the target scene;
Detecting configuration operation of the user on the target parameters in the target scene;
detecting a first operation of a power key in the electronic equipment, wherein the first operation is used for indicating to extinguish a display screen of the electronic equipment;
in response to the first operation, turning off the display screen of the electronic device, parsing the scene description file in the target off-screen display AOD resource package, and obtaining a calling method of the target application package APK;
calling the target application package APK according to a calling method of the target application package APK, wherein the target application package APK acquires the target parameters and the data of the user, and performs data processing on the data of the user;
and acquiring the data processed in the target application package APK, and performing off-screen display on a display screen of the electronic equipment.
2. The method of claim 1, wherein the target application package APK is further configured to obtain data of the user through a wearable device of the user.
3. The method according to claim 1 or 2, wherein the target application package APK is a first application package APK, and before invoking the target application package APK according to the target off-screen display AOD resource package, the method further comprises:
stopping calling a second application package APK, wherein the second application package APK is different from the first application package APK.
4. The method of claim 1 or 2, wherein the target off-screen display AOD resource package further comprises a preview image, and the preview image is used for displaying a preview interface of the off-screen display of the electronic device.
5. The method of claim 1 or 2, further comprising:
displaying a theme list, wherein the theme list is used for the user to determine a target theme of the off-screen display of the electronic device, and the target theme corresponds to the target off-screen display AOD resource package.
6. The method of claim 5, wherein the theme list includes a theme classification, the theme classification indicating a category corresponding to a theme in the theme list.
7. An apparatus for off-screen display, the apparatus comprising a processor and a memory, the memory for storing a computer program, the processor for calling and running the computer program from the memory, causing the apparatus to perform the method of any one of claims 1 to 6.
8. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program, which when executed by a processor causes the processor to perform the method of any of claims 1 to 6.
CN202110827202.3A 2021-07-21 2021-07-21 Method and device for off-screen display Active CN114115772B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110827202.3A CN114115772B (en) 2021-07-21 2021-07-21 Method and device for off-screen display

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110827202.3A CN114115772B (en) 2021-07-21 2021-07-21 Method and device for off-screen display

Publications (2)

Publication Number Publication Date
CN114115772A CN114115772A (en) 2022-03-01
CN114115772B true CN114115772B (en) 2023-08-11

Family

ID=80359496

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110827202.3A Active CN114115772B (en) 2021-07-21 2021-07-21 Method and device for off-screen display

Country Status (1)

Country Link
CN (1) CN114115772B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110221898A (en) * 2019-06-19 2019-09-10 北京小米移动软件有限公司 Display methods, device, equipment and the storage medium of breath screen picture
WO2021000804A1 (en) * 2019-06-29 2021-01-07 华为技术有限公司 Display method and apparatus in locked state

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102575844B1 (en) * 2016-04-05 2023-09-08 삼성전자주식회사 Electronic device for displaying screen and method for controlling thereof
CN110087292A (en) * 2019-04-28 2019-08-02 努比亚技术有限公司 Intelligent wearable device, energy-saving control method and computer readable storage medium
CN110489199A (en) * 2019-08-23 2019-11-22 深圳传音控股股份有限公司 Breath screen display method, apparatus, terminal and storage medium
CN113138816A (en) * 2020-01-19 2021-07-20 华为技术有限公司 Message screen display theme display method and mobile device
CN111580908A (en) * 2020-04-30 2020-08-25 江苏紫米电子技术有限公司 Display method, device, equipment and storage medium
CN112181560A (en) * 2020-09-24 2021-01-05 Oppo(重庆)智能科技有限公司 Navigation interface display method and device, electronic equipment and readable storage medium
CN112764624B (en) * 2021-01-26 2022-09-09 维沃移动通信有限公司 Information screen display method and device
CN112783392A (en) * 2021-01-29 2021-05-11 展讯通信(上海)有限公司 Information screen display method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yang Jie, "Section 3: Wearable Devices," in Visual Interaction Design, Jiangsu Phoenix Fine Arts Publishing House, 2018, pp. 26-27. *

Also Published As

Publication number Publication date
CN114115772A (en) 2022-03-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant