CN114115772A - Screen-off display method and device - Google Patents

Screen-off display method and device

Info

Publication number
CN114115772A
Authority
CN
China
Prior art keywords
target
screen
apk
user
scene
Prior art date
Legal status
Granted
Application number
CN202110827202.3A
Other languages
Chinese (zh)
Other versions
CN114115772B (en)
Inventor
王彦恒 (Wang Yanheng)
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202110827202.3A
Publication of CN114115772A
Application granted
Publication of CN114115772B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
        • G06 - COMPUTING; CALCULATING OR COUNTING
            • G06F - ELECTRIC DIGITAL DATA PROCESSING
                • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
                            • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
                                • G06F 3/0412 - Digitisers structurally integrated in a display
                    • G06F 3/14 - Digital output to display device; cooperation and interconnection of the display device with other functional units
                        • G06F 3/147 - Digital output to display device using display panels
                • G06F 9/00 - Arrangements for program control, e.g. control units
                    • G06F 9/06 - Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
                        • G06F 9/44 - Arrangements for executing specific programs
                            • G06F 9/448 - Execution paradigms, e.g. implementations of programming paradigms
                                • G06F 9/4488 - Object-oriented
                                    • G06F 9/449 - Object-oriented method invocation or resolution
                            • G06F 9/451 - Execution arrangements for user interfaces
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
        • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
            • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
                • Y02D 30/00 - Reducing energy consumption in communication networks
                    • Y02D 30/70 - Reducing energy consumption in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a screen-off display method and a screen-off display device. The method is applied to an electronic device having a display screen and comprises the following steps: acquiring a first instruction of a user, where the first instruction indicates a target always-on-display (AOD) resource package for screen-off display, the target AOD resource package comprises a target application package (APK), and the target APK is used for processing data of the user; calling the target APK according to the target AOD resource package; and when the screen of the electronic device is off, acquiring the data in the target APK and displaying the data on the display screen in the screen-off state. With this technical solution, the user can obtain the required information without lighting up and unlocking the electronic device, which improves user experience.

Description

Screen-off display method and device
Technical Field
The application relates to the field of terminals, in particular to a screen-off display method and device.
Background
Always on display (AOD), also known as screen-off display, means that after the screen of an electronic device (e.g., a mobile phone or a tablet computer) is turned off, part of the screen can remain lit to display information such as the clock, the date, and notifications, which makes it convenient for the user to operate the device and thereby improves user experience.
Currently, if a user needs to view information on an electronic device in a specific scene, the user has to unlock the electronic device and open a specific application (App) to obtain the required information. For example, a user who is running may want to view exercise information; the user then has to unlock the electronic device and read the step count, calorie consumption, or other exercise information from a sports application. Performing such a series of operations, unlocking the electronic device and entering the sports application, is inconvenient while running, which results in a poor user experience.
Disclosure of Invention
The application provides a screen-off display method and device, which can solve the problem of cumbersome operations when a user obtains required information while the screen of the electronic device is off.
In a first aspect, a screen-off display method is provided. The method is applied to an electronic device having a display screen and comprises: acquiring a first instruction of a user, where the first instruction indicates a target always-on-display (AOD) resource package for screen-off display, the target AOD resource package comprises a target application package (APK), and the target APK is used for processing data of the user; calling the target APK according to the target AOD resource package; and when the screen of the electronic device is off, acquiring the data in the target APK and displaying the data on the display screen in the screen-off state.
Based on this technical solution, a user can select a target theme as required, and the target theme corresponds to the target AOD resource package used for screen-off display; when the screen is off, the display interface corresponding to the target AOD resource package selected by the user is shown. The user can therefore obtain the required information without lighting up and unlocking the electronic device, which improves user experience.
The screen-off display method in the embodiments of the application can be applied to an organic light-emitting diode (OLED) display screen, in which each pixel can emit light independently; during screen-off display, only part of the OLED display screen is lit, and the areas corresponding to black pixels remain unlit.
With reference to the first aspect, in certain implementations of the first aspect, calling the target APK according to the target AOD resource package includes:
parsing the scene description file to obtain the calling method of the target APK; and calling the target APK according to the calling method.
In a possible implementation, the scene description file may be parsed by an Extensible Markup Language (XML) parsing tool to obtain the calling method of the target APK; the target APK is then called according to that calling method.
Based on this technical solution, the target AOD resource package may further comprise a scene description file, and the scene description file contains the calling method of the target APK. The calling method can be obtained by parsing the scene description file, and the target APK is then called according to it, so that the target APK acquires the user's data and processes it. When the screen of the electronic device is off, the data in the target APK is acquired and shown on the display screen in the screen-off state, which improves user experience. A sketch of such a scene description file follows.
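For illustration only, a scene description file of this kind might look like the following sketch. The element and attribute names (scene, apk-entry, class, method) and the package name are assumptions; the application does not disclose a concrete schema.

```xml
<!-- Hypothetical scene description file inside a target AOD resource package.
     All element and attribute names here are illustrative assumptions. -->
<scene name="running" category="sports">
    <!-- The calling method of the target APK: which class and method the
         screen-off display service should invoke after parsing this file -->
    <apk-entry
        package="com.example.aod.running"
        class="com.example.aod.running.RunningDataService"
        method="start"/>
</scene>
```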
With reference to the first aspect, in certain implementations of the first aspect, the target APK is further configured to obtain data of the user through a wearable device of the user.
In a possible implementation, taking the AOD resource package of a running scene as an example of the target AOD resource package, the target APK may acquire data such as the user's heart rate and step count through the user's wearable device.
With reference to the first aspect, in certain implementations of the first aspect, the target APK is further configured to guide the user in configuring a target parameter of the screen-off display of the electronic device.
In one possible implementation, if the target AOD resource package selected by the user requires parameter setting, the target application package may further guide the user in configuring the target parameters.
For example, if the target AOD resource package selected by the user is the AOD resource package of a running scene, the user may set a target distance in the screen-off display interface of the running scene.
With reference to the first aspect, in certain implementations of the first aspect, the target AOD resource package further includes a preview image, and the preview image is used to display the screen-off preview interface of the electronic device.
It should be understood that in the embodiments of the present application, one target theme corresponds to one target AOD resource package.
In a possible implementation, when the user selects a target theme, the screen-off display controller may invoke the target AOD resource package corresponding to the target theme. The target AOD resource package may comprise a target theme description file, a target theme preview image file, and a target application package file corresponding to the target theme. The target theme description file can be used to describe the name and classification of the target theme and to call the target application package file corresponding to the target theme; the target theme preview image file is used to display the screen-off preview interface corresponding to the target theme; and the target application package can be used to guide the user in setting target parameters under the target theme and to process the user's data under the target theme. A possible layout of such a resource package is sketched after this paragraph.
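As an illustration only, the contents of a target AOD resource package could be organized as follows; the file names are hypothetical, and the application only requires that the three kinds of files be present:

```
running_theme.aod              # hypothetical target AOD resource package
├── scene.xml                  # target theme description file: theme name,
│                              # classification, and the APK calling method
├── preview.png                # target theme preview image for the
│                              # screen-off preview interface
└── running.apk                # target APK: guides target-parameter setup
                               # and processes the user's data
```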
With reference to the first aspect, in certain implementations of the first aspect, the method further includes:
displaying a theme list, where the theme list is used by the user to determine a target theme for the screen-off display of the electronic device, and the target theme corresponds to the target AOD resource package.
With reference to the first aspect, in certain implementations of the first aspect, the theme list includes theme classifications, and a theme classification indicates the category corresponding to a theme in the theme list.
In a possible implementation, the scene description file includes a first parameter, and the first parameter indicates whether the target application package continues to run while the electronic device is in the screen-on state.
Based on this technical solution, for target application packages that need to run continuously, the first parameter can be set in the scene description file to indicate that the target application package continues running while the screen of the electronic device is on. For example, when the target theme is a running theme, the electronic device can continuously acquire and process the user's data in the running scene; when the screen is turned off, the user's running data is acquired from the target APK corresponding to the running theme and displayed.
It should be understood that the target application package normally runs when the screen of the electronic device is off; for screen-off themes that require continuous operation, however, the configured first parameter can cause the target application package to run while the screen is on and buffer the corresponding data, as sketched below.
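Continuing the hypothetical schema sketched earlier, the first parameter could be a single flag in the scene description file; the attribute name is an assumption:

```xml
<!-- Hypothetical first parameter: when true, the target APK keeps running
     and buffers data while the screen is on, instead of starting only
     when the screen turns off. -->
<param name="runWhenScreenOn" value="true"/>
```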
In a possible implementation, the target AOD resource package corresponding to a screen-off scene may be data configured in the electronic device at the factory.
In a possible implementation, the target AOD resource package corresponding to a screen-off scene may also be obtained from an open database. When an AOD resource package is acquired from an open database, its data needs to follow a unified standard format; that is, the AOD resource package contains at least a scene description file, a scene preview image file, and an APK file.
In a second aspect, an apparatus for screen-off display is provided, comprising units for performing any of the methods of the first aspect.
Alternatively, the apparatus may be a terminal device. The apparatus may include an input unit and a processing unit.
In one possible implementation, when the apparatus is a terminal device, the processing unit may be a processor, and the input unit may be a communication interface; the terminal device may further comprise a memory for storing computer program code which, when executed by the processor, causes the terminal device to perform any of the methods of the first aspect.
In a third aspect, there is provided a computer readable storage medium having stored thereon computer program code which, when run by an apparatus for off-screen display, causes the apparatus to perform any of the methods of the first aspect.
In a fourth aspect, there is provided a computer program product comprising: computer program code which, when run by an apparatus for off-screen display, causes the apparatus to perform any of the methods of the first aspect.
Drawings
FIG. 1 is a schematic diagram of a hardware system suitable for the apparatus of the present application;
FIG. 2 is a schematic diagram of a software system suitable for the apparatus of the present application;
FIG. 3 is a system architecture diagram according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a screen-off display method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a method for calling a target application package according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a screen-off display method in a running scene according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an interface for selecting a screen-off scene according to an embodiment of the present application;
FIG. 8 is a schematic diagram of an interface for selecting a screen-off scene according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an interface for selecting a screen-off scene according to an embodiment of the present application;
FIG. 10 is a schematic diagram of an interface for selecting a screen-off scene according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a screen-off display interface according to an embodiment of the present application;
FIG. 12 is a schematic diagram of a screen-off display interface according to an embodiment of the present application;
FIG. 13 is a schematic diagram of a screen-off display interface according to an embodiment of the present application;
FIG. 14 is a schematic diagram of a screen-off display apparatus provided by the present application;
FIG. 15 is a schematic diagram of a screen-off display apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
In the description of the embodiments of the present application, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying the number of technical features indicated, or as other limitations.
Fig. 1 shows a hardware system suitable for use in the apparatus of the present application.
The apparatus 100 may be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, an in-vehicle electronic device, an Augmented Reality (AR) device, a Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), a projector, and the like, and the embodiment of the present application does not limit the specific type of the apparatus 100.
The apparatus 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The configuration shown in fig. 1 is not intended to specifically limit the apparatus 100. In other embodiments of the present application, the apparatus 100 may include more or fewer components than those shown in FIG. 1, or the apparatus 100 may include a combination of some of the components shown in FIG. 1, or the apparatus 100 may include sub-components of some of the components shown in FIG. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. For example, the processor 110 may include at least one of the following processing units: an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and a neural Network Processor (NPU). The different processing units may be independent devices or integrated devices.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The cache may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the cache, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves system efficiency.
In some embodiments, processor 110 may include one or more interfaces. For example, the processor 110 may include at least one of the following interfaces: an inter-integrated circuit (I2C) interface, an inter-integrated circuit audio source (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, and a USB interface.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194 and camera 193. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of apparatus 100. The processor 110 and the display screen 194 communicate via the DSI interface to implement the display function of the device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal interface and may also be configured as a data signal interface. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, and the sensor module 180. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, or a MIPI interface.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini (Mini) USB interface, a Micro (Micro) USB interface, or a USB Type C (USB Type C) interface, for example. The USB interface 130 may be used to connect a charger to charge the apparatus 100, to transmit data between the apparatus 100 and a peripheral device, and to connect an earphone to play audio through the earphone. The USB interface 130 may also be used to connect other apparatuses 100, such as AR devices.
The connection relationship between the modules shown in fig. 1 is merely illustrative and does not limit the connection relationship between the modules of the apparatus 100. Alternatively, the modules of the apparatus 100 may also adopt a combination of the connection manners in the above embodiments.
The charge management module 140 is used to receive power from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive the current of the wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive electromagnetic waves through a wireless charging coil of the device 100 (current path shown as dashed line). The charging management module 140 may also supply power to the device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle number, and battery state of health (e.g., leakage, impedance). Alternatively, the power management module 141 may be disposed in the processor 110, or the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the apparatus 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide wireless communication solutions applied on the apparatus 100, such as at least one of the following: a second-generation (2G) mobile communication solution, a third-generation (3G) mobile communication solution, a fourth-generation (4G) mobile communication solution, and a fifth-generation (5G) mobile communication solution. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low-noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify signals modulated by the modem processor; the amplified signals are converted into electromagnetic waves and radiated through the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate a low-frequency baseband signal to be transmitted into a medium- or high-frequency signal. The demodulator is used to demodulate a received electromagnetic-wave signal into a low-frequency baseband signal, which it then passes to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs sound signals through an audio device (e.g., the speaker 170A and the receiver 170B) or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
Similar to the mobile communication module 150, the wireless communication module 160 may also provide a wireless communication solution applied on the device 100, such as at least one of the following: wireless Local Area Networks (WLANs), Bluetooth (BT), Bluetooth Low Energy (BLE), Ultra Wide Band (UWB), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR) technologies. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive, frequency modulate and amplify the signal to be transmitted from the processor 110, which is converted to electromagnetic waves via the antenna 2 for radiation.
In some embodiments, antenna 1 of apparatus 100 and mobile communication module 150 are coupled and antenna 2 of apparatus 100 and wireless communication module 160 are coupled such that apparatus 100 may communicate with networks and other electronic devices via wireless communication techniques. The wireless communication technology may include at least one of the following communication technologies: global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time division code division multiple access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, IR technologies. The GNSS may include at least one of the following positioning techniques: global Positioning System (GPS), global navigation satellite system (GLONASS), beidou satellite navigation system (BDS), quasi-zenith satellite system (QZSS), Satellite Based Augmentation System (SBAS).
The device 100 may implement display functionality through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 may be used to display images or video. The display screen 194 includes a display panel. The display panel may adopt a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini light-emitting diode (Mini LED), a Micro light-emitting diode (Micro LED), a Micro OLED (Micro OLED), or a quantum dot light-emitting diode (QLED). In some embodiments, the apparatus 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The device 100 may implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can perform algorithm optimization on the noise, brightness and color of the image, and can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into a standard Red Green Blue (RGB), YUV, or the like format image signal. In some embodiments, device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; it can process digital image signals as well as other digital signals. For example, when the apparatus 100 selects a frequency point, the digital signal processor can perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The apparatus 100 may support one or more video codecs. In this way, the apparatus 100 can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, and MPEG 4.
The NPU is a processor that draws on the structure of biological neural networks; for example, it can rapidly process input information by drawing on the transfer mode between human brain neurons, and it can also continuously self-learn. The NPU can implement intelligent-cognition functions of the apparatus 100, such as image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external memory card, such as a secure digital (SD) card, to expand the storage capability of the apparatus 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement data storage functions, for example, saving files such as music and video in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. Wherein the storage program area may store an operating system, an application program required for at least one function (e.g., a sound playing function and an image playing function). The storage data area may store data (e.g., audio data and a phonebook) created during use of the device 100. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a nonvolatile memory such as: at least one magnetic disk storage device, a flash memory device, and a universal flash memory (UFS), and the like. The processor 110 performs various processing methods of the apparatus 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The apparatus 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and may also be used to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a loudspeaker, converts audio electrical signals into sound signals. The apparatus 100 can be used to listen to music or to a hands-free call through the speaker 170A.
The receiver 170B, also called an earpiece, is used to convert the electrical audio signal into a sound signal. When the user uses the device 100 to receive a call or voice information, the voice can be received by placing the receiver 170B close to the ear.
The microphone 170C, also called a mic or mike, is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking close to it. The apparatus 100 may be provided with at least one microphone 170C. In other embodiments, the apparatus 100 may be provided with two microphones 170C to implement a noise-reduction function. In other embodiments, the apparatus 100 may be provided with three, four, or more microphones 170C to additionally identify the source of a sound and implement directional recording. The processor 110 may process the electrical signal output by the microphone 170C. For example, the audio module 170 and the wireless communication module 160 may be coupled via a PCM interface; the microphone 170C converts ambient sound into an electrical signal (e.g., a PCM signal) and transmits it to the processor 110 via the PCM interface, and the processor 110 performs volume analysis and frequency analysis on the electrical signal to determine the volume and frequency of the ambient sound.
The headphone interface 170D is used to connect wired headphones. The headphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A may be of a wide variety, and may be, for example, a resistive pressure sensor, an inductive pressure sensor, or a capacitive pressure sensor. The capacitive pressure sensor may be a sensor that includes at least two parallel plates having conductive material, and when a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the apparatus 100 determines the strength of the pressure based on the change in capacitance. When a touch operation is applied to the display screen 194, the device 100 detects the touch operation from the pressure sensor 180A. The apparatus 100 may also calculate the position of the touch from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message; and when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the apparatus 100. In some embodiments, the angular velocity of device 100 about three axes (i.e., the x-axis, y-axis, and z-axis) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the device 100, calculates the distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the device 100 by a reverse movement, thereby achieving anti-shake. The gyro sensor 180B can also be used in scenes such as navigation and motion sensing games.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the device 100 calculates altitude from barometric pressure values measured by the barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The apparatus 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. In some embodiments, when the apparatus 100 is a flip phone, the apparatus 100 may detect the opening and closing of the flip according to the magnetic sensor 180D; based on the detected open or closed state of the holster or of the flip, features such as automatic unlocking upon flipping open can be set.
Acceleration sensor 180E may detect the magnitude of acceleration of device 100 in various directions, typically the x-axis, y-axis, and z-axis. The magnitude and direction of gravity can be detected when the device 100 is at rest. The acceleration sensor 180E may also be used to recognize the attitude of the device 100 as an input parameter for applications such as landscape and portrait screen switching and pedometers.
The distance sensor 180F is used to measure a distance. The device 100 may measure distance by infrared or laser. In some embodiments, for example in a shooting scene, the device 100 may utilize the range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a photodetector, for example, a photodiode. The LED may be an infrared LED. The device 100 emits infrared light outward through the LED. The apparatus 100 uses a photodiode to detect infrared reflected light from nearby objects. When reflected light is detected, the apparatus 100 may determine that an object is present nearby. When no reflected light is detected, the apparatus 100 can determine that there is no object nearby. The device 100 can detect whether the user holds the device 100 close to the ear or not by using the proximity light sensor 180G, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used for automatic unlocking and automatic screen locking in a holster mode or a pocket mode.
The ambient light sensor 180L is used to sense the ambient light level. The device 100 may adaptively adjust the brightness of the display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the device 100 is in a pocket to prevent inadvertent contact.
The fingerprint sensor 180H is used to collect a fingerprint. The device 100 can utilize the collected fingerprint characteristics to achieve the functions of unlocking, accessing an application lock, taking a picture, answering an incoming call, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the apparatus 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the apparatus 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the device 100 heats the battery 142 when the temperature is below another threshold to avoid a low temperature causing the device 100 to shut down abnormally. In other embodiments, when the temperature is below a further threshold, the apparatus 100 performs a boost on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also referred to as a touch device. The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a touch panel. The touch sensor 180K is used to detect a touch operation applied on or near it. The touch sensor 180K may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the apparatus 100 at a location different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power key and volume keys. The keys 190 may be mechanical keys or touch keys. The apparatus 100 can receive key input signals and implement the functions related to the key input signals.
The motor 191 may generate vibrations. The motor 191 may be used for incoming call prompts as well as for touch feedback. The motor 191 may generate different vibration feedback effects for touch operations applied to different applications. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenarios (e.g., time reminders, received messages, alarms, and games) may correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a change in charge status and charge level, or may be used to indicate a message, missed call, and notification.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be inserted into the SIM card interface 195 to make contact with the apparatus 100, or pulled out of the SIM card interface 195 to be separated from the apparatus 100. The apparatus 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. Multiple cards can be inserted into the same SIM card interface 195 at the same time, and the cards may be of the same or different types. The SIM card interface 195 is also compatible with external memory cards. The apparatus 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the apparatus 100 uses an embedded SIM (eSIM) card, which can be embedded in the apparatus 100 and cannot be separated from it.
The hardware system of the apparatus 100 is described in detail above, and the software system of the apparatus 100 is described below. The software system may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture or a cloud architecture, and the software system of the apparatus 100 is exemplarily described in the embodiment of the present application by taking the layered architecture as an example.
As shown in fig. 2, a software system adopting the layered architecture is divided into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the software system may be divided into four layers: from top to bottom, an application layer, an application framework layer, Android runtime and system libraries, and a kernel layer.
The application layer may include applications such as camera, gallery, calendar, phone, map, navigation, WLAN, Bluetooth, music, video, and short message.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application of the application layer. The application framework layer may include some predefined functions.
For example, the application framework layers include a window manager, a content provider, a view system, a phone manager, a resource manager, and a notification manager.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, and capture the screen.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and answered, browsing history and bookmarks, and phone books.
The view system includes visual controls such as controls to display text and controls to display pictures. The view system may be used to build applications. The display interface may be composed of one or more views, for example, a display interface including a short message notification icon, which may include a view displaying text and a view displaying pictures.
The phone manager is used to provide the communication functions of the apparatus 100, for example, management of the call status (connected or hung up).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, and video files.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction; for example, the notification manager is used for download-completion notifications and message reminders. The notification manager may also manage notifications that appear in the status bar at the top of the system in the form of charts or scroll-bar text, such as notifications for applications running in the background, as well as notifications that appear on the screen in the form of dialog windows, for example prompting a text message in the status bar, sounding a prompt tone, vibrating the electronic device, or flashing an indicator light.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the function interfaces that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, three-dimensional graphics processing libraries (e.g., the open graphics library for embedded systems, OpenGL ES), and 2D graphics engines (e.g., the Skia Graphics Library, SGL).
The surface manager is used for managing the display subsystem and providing fusion of the 2D layer and the 3D layer for a plurality of application programs.
The media library supports playback and recording of multiple audio and video formats as well as still image files. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, moving picture experts group audio layer 3 (MP3), advanced audio coding (AAC), adaptive multi-rate (AMR), joint photographic experts group (JPG), and portable network graphics (PNG).
The three-dimensional graphics processing library may be used to implement three-dimensional graphics drawing, image rendering, compositing, and layer processing.
The two-dimensional graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer can comprise driving modules such as a display driver, a camera driver, an audio driver and a sensor driver.
The following illustrates the workflow of the software system and the hardware system of the apparatus 100 in connection with displaying a photographing scene.
When a user performs a touch operation on the touch sensor 180K, a corresponding hardware interrupt is sent to the kernel layer, and the kernel layer processes the touch operation into a raw input event that includes information such as the touch coordinates and the timestamp of the touch operation. The raw input event is stored in the kernel layer; the application framework layer acquires the raw input event from the kernel layer, identifies the control corresponding to the event, and notifies the application (APP) corresponding to that control. For example, if the touch operation is a click and the APP corresponding to the control is the camera APP, then after the camera APP is woken up by the click, it can call the camera driver of the kernel layer through the API, and the camera driver controls the camera 193 to shoot.
The screen-off display means that after the electronic device turns off its screen, a partial area of the screen can still be lit to display information such as a clock, a date and notifications, which facilitates the user's operation and improves the user experience. Currently, if a user needs to view information in a specific scene, the user has to unlock the electronic device and enter a specific application (App) to obtain the required information. For example, when a user is running and wishes to obtain exercise information, the user needs to unlock the electronic device and obtain step-count information, calorie-consumption information or other exercise information from a sports App. However, while running it is inconvenient for the user to perform a series of operations such as unlocking the electronic device and entering the sports App, so the user experience is poor.
In view of this, an embodiment of the present application provides a screen-off display method, in which a user may select, according to a requirement, a target theme to be displayed while the screen is off, and the target theme may correspond to a target AOD (always on display) resource package; the target AOD resource package comprises a target application package (APK), and the target APK can be used to acquire and process the user's data; when the electronic device's screen is off, the data in the target APK can be acquired and displayed on the display screen in the screen-off state, so that the user can obtain the required information without lighting up and unlocking the electronic device, thereby improving the user experience.
The screen-off display method provided by the present application is described in detail below with reference to fig. 3 to 13.
Fig. 3 is a schematic structural diagram of a screen-off display method according to an embodiment of the present application. As shown in fig. 3, the hardware may include a power key 201 and a display screen 202. The power key 201 is used for switching the power of the terminal device on or off; when the terminal device detects the user's operation of clicking the power key 201, in response to that operation the terminal device may start the screen-off display service 203. The display screen 202 may be used to display the interface of the terminal device. The screen-off display service 203 is used for displaying information to the user on the display screen when the terminal device is in the screen-off state. An extensible markup language (XML) parsing tool 204 is used for acquiring XML files and reading them into memory; for example, as shown in fig. 3, the XML parsing tool may parse the application package (APK) calling method in the scene resource description file and send the obtained information about the APK calling method to the screen-off display service 203; the screen-off display service 203 can then call the APK, and the APK acquires data and performs data processing. A user interface (UI) tool service 205 is used for rendering the data transmitted by the APK; the data display service 206 is used for displaying the data transmitted by the APK.
In one example, after the terminal device detects that the user clicks the power key 201, the screen-off display service 203 runs; the screen-off display service 203 may send a parsing instruction to the XML parsing tool 204, where the parsing instruction may be used to instruct the XML parsing tool 204 to parse the information in the current AOD resource package; the XML parsing tool 204 may parse the APK calling method from the scene description file in the AOD resource package and, after parsing, send the APK calling method to the screen-off display service 203; after acquiring the APK calling method, the screen-off display service 203 calls the APK to perform data processing, and the APK processes the acquired user data and performs rendering and display through the UI tool service 205 and the data display service 206. For example, the APK may start running after the user completes the scene setting for the screen-off display, and acquire and process user data in the background.
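As an illustration only, such a parsing step could be sketched on Android with the XmlPullParser API as follows, using the <packagename>, <classname> and <method> tags of the scene description file introduced later in this document; the class names are assumptions, not part of the embodiment described above:

import org.xmlpull.v1.XmlPullParser;
import org.xmlpull.v1.XmlPullParserException;
import org.xmlpull.v1.XmlPullParserFactory;
import java.io.IOException;
import java.io.InputStream;

/** Holds the APK calling method parsed from description.xml. */
final class EngineCallInfo {
    String packageName; // value of <packagename>
    String className;   // value of <classname>
    String methodName;  // value of <method>
}

final class SceneDescriptionParser {
    /** Reads description.xml and extracts the engine APK calling method. */
    static EngineCallInfo parse(InputStream in)
            throws XmlPullParserException, IOException {
        XmlPullParser parser = XmlPullParserFactory.newInstance().newPullParser();
        parser.setInput(in, "UTF-8");
        EngineCallInfo info = new EngineCallInfo();
        for (int event = parser.getEventType();
                event != XmlPullParser.END_DOCUMENT;
                event = parser.next()) {
            if (event != XmlPullParser.START_TAG) continue;
            switch (parser.getName()) {
                case "packagename": info.packageName = parser.nextText(); break;
                case "classname":   info.className   = parser.nextText(); break;
                case "method":      info.methodName  = parser.nextText(); break;
                default: break; // <title>, <category>, <persistence>, ... handled elsewhere
            }
        }
        return info;
    }
}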
For example, taking the screen-off scene of the terminal device being a running scene as an example, the manner in which the APK acquires data is illustrated below; the data processing method after the APK acquires the data may be seen in the subsequent fig. 6.
Implementation mode one: the user has a wearable device, and the wearable device includes a locator (e.g., a global positioning system).
For example, the terminal device may acquire the user's heart rate data, step-count information and location information through the wearable device; the heart rate data, step-count information and location information can be transmitted to the exercise health application, and the APK can acquire the user's data through the exercise health application.
In one possible implementation, the APK may also acquire the heart rate data, step-count information and location information from the wearable device directly, rather than through the exercise health application.
Implementation mode two: the user has a wearable device, but the wearable device does not include a global positioning system.
For example, the terminal device may acquire the user's heart rate data and step-count information through the wearable device, and may acquire the user's location information through a locator (e.g., a global positioning system) included in the terminal device; the heart rate data, step-count information and location information can be transmitted to the exercise health application, and the APK can acquire the user's data through the exercise health application.
In one possible implementation, the APK may also acquire the heart rate data and step-count information from the wearable device directly, rather than through the exercise health application.
Implementation mode three: the user does not have a wearable device.
For example, the user's location information may be acquired through a locator (e.g., a global positioning system) included in the terminal device; the AOD resource package may include a pedometer algorithm, by which the user's step-count information can be determined from the user's location information; the user's location information and step-count information are then sent to the APK. It should be understood that the above, taking a running scene as the screen-off scene, describes by way of example the manner in which the APK acquires data; this does not constitute a limitation of the present application.
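The pedometer algorithm itself is not disclosed in this document; purely as an illustration, one simple assumption-laden approach is to accumulate the distance between successive position fixes and divide by an assumed average stride length, as sketched below (all names and the stride constant are illustrative):

/** Illustrative pedometer: estimates the step count from successive position fixes. */
final class PositionPedometer {
    private static final double AVERAGE_STRIDE_METERS = 0.7; // assumed stride length
    private double accumulatedMeters;
    private double lastX, lastY, lastZ;
    private boolean hasFix;

    /** Feeds one position fix; returns the current step-count estimate. */
    long onPosition(double x, double y, double z) {
        if (hasFix) {
            double dx = x - lastX, dy = y - lastY, dz = z - lastZ;
            accumulatedMeters += Math.sqrt(dx * dx + dy * dy + dz * dz);
        }
        lastX = x; lastY = y; lastZ = z;
        hasFix = true;
        return Math.round(accumulatedMeters / AVERAGE_STRIDE_METERS);
    }
}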
It should also be understood that, in the embodiment of the present application, an APK is newly added to the AOD resource package, and an interaction flow between the screen-off display service and the APK is provided; as shown in fig. 3, the screen-off display service 203 may call the APK, the APK may pass the processed data results to the screen-off display service 203, and the screen-off display service 203 renders and displays the obtained data results through the UI tool service 205 and the data display service 206. In the embodiment of the application, one screen-off scene may correspond to one APK; the user can select a target theme as required, and the electronic device displays the display interface corresponding to the target theme when the screen is off, so that the user can obtain the required information without lighting up and unlocking the screen of the electronic device.
Fig. 4 is a schematic diagram of a screen-off display method according to an embodiment of the present application. The method 300 shown in fig. 4 may include steps S310 to S350, which are described in detail below.
Step S310, displaying a screen-off scene list (one example of displaying a theme list).
For example, a theme list may be displayed, where the theme list may be used for the user to determine a target theme for the electronic device's screen-off display, the target theme corresponding to a target AOD resource package; the theme of the screen-off display may refer to a screen-off scene.
In one example, the electronic device may prefabricate and manage the screen-off scenes; the screen-off scenes may include scene classifications; the scene classifications may include a sport classification, a travel classification and other classifications; the sport classification may include running, mountain climbing, swimming, elliptical machine, etc.; the travel classification may include weather, navigation or driving scenes.
For example, a screen-off scene may be displayed on the screen of the electronic device as illustrated in fig. 7 to 10.
It should be understood that the screen-off scenes can be set according to the needs of the user; the scenes above are examples, and the present application does not limit the screen-off scenes in any way.
In one possible implementation, as shown in fig. 7, the user may click the screen-off display setting option, and in response to the user's click operation, the electronic device starts the screen-off display setting option.
In one example, the user may also start the screen-off scene setting option via a voice instruction. For example, the user triggers the voice function by uttering a preset voice; after the voice function is triggered, the user can issue a voice instruction of 'screen-off scene setting', thereby starting the screen-off scene setting option in the electronic device; the present application does not limit the preset voice.
Step S320, selecting a target scene (one example of a target theme).
It should be understood that the target scene may refer to the target theme of the screen-off display in the embodiments of the present application.
Illustratively, the user may select a target scene from the screen-off scene list according to the user's own needs; the target scene refers to the screen-off scene that the user currently requires.
It should be noted that, through the target scene, the user can obtain the currently required information while the electronic device's screen is off.
In one possible implementation, in response to the user's click operation, the electronic device may display the interface shown in fig. 8 after starting the screen-off display setting option; the classifications of the screen-off scenes can be displayed in this interface, for example, a sport classification and a travel classification; if the user wants to select running as the target scene, the user can click the sport classification on the screen-off scene interface; in response to the user's click on the sport classification, the electronic device may display the interface shown in fig. 9; the user clicks the running option in the interface shown in fig. 9 to set running as the target scene of the electronic device's screen-off display.
In one possible implementation, in response to the user's click operation, after the electronic device starts the screen-off display setting option, a preview screen-off display interface as shown in fig. 10 may be displayed; the interface shown in fig. 10 includes the classifications of the screen-off scenes and the scenes included in each classification; for example, the sport classification includes running and mountain climbing, and the travel classification includes weather and navigation; the user may click the running option directly on the interface shown in fig. 10 to set running as the target scene of the electronic device's screen-off display.
It should be understood that the sport classification and the travel classification are taken as examples, and the screen-off scenes may include other classifications; in addition, running and mountain climbing are taken as examples of scenes in the sport classification, and weather and navigation as examples of scenes in the travel classification; the sport classification and the travel classification may further include other scenes, which is not limited in this application.
In one example, the user may also select the target scene of the electronic device's screen-off display through a voice instruction. For example, the user triggers the voice function by uttering a preset voice; after the voice function is triggered, the user can issue a voice instruction of 'set the running scene as the screen-off scene', thereby selecting the target scene; the present application does not limit the preset voice.
Step S330, matching an engine application package (one example of a target application package) of the target scene.
Illustratively, the screen-off display controller may match the required engine according to the target scene selected by the user.
It should be understood that different target scenes may correspond to different engine packages; the engine package is used for processing the data corresponding to the target scene; for example, the data in different scenes are called through a data interface and processed.
It should also be understood that an engine may correspond to an engine application package (APK), i.e., an engine APK; before the code of an application can be executed on an electronic device running the system, it must be compiled and then packaged into a file that the system can recognize; APK refers to such a file format that the system can recognize and execute.
In a possible implementation manner, if the target scene selected by the user requires parameter setting, the engine APK may further guide the user through the relevant parameter configuration of the target scene.
For example, if the target scene selected by the user is running, the engine may guide the user to set the target parameters.
As shown in fig. 11, the user may set a target distance in the interface of the running scene's screen-off display; for example, the user can input 10 km, setting the target distance of the current run to 10 km.
In one example, if the target scene selected by the user is weather, the engine APK may guide the user to configure a city name of a travel city.
In one example, if the target scene selected by the user is a navigation, the engine APK may guide the user to configure the start position and the end position of the navigation.
Step S340, data processing of the target scene.
Illustratively, after the user configures the relevant parameters of the target scene, the engine APK acquires the relevant data of the target scene and performs data processing. The specific flow of data processing can be seen in fig. 5 below.
Step S350, screen-off display of the target scene.
For example, the engine APK may process the user's relevant data in the current target scene and render the graphics, and the processed data is displayed in the screen-off state.
In one possible implementation, when the user selects a target scene, the screen-off display controller may call a scene resource package (one example of a target AOD resource package) of the target scene (one example of a target theme); the scene resource package may include a scene description file, a scene preview image file and an engine APK (one example of a target application package); the scene description file is used for describing the name and classification of the target scene and for calling the engine application package file corresponding to the target scene; the scene preview image file is used for displaying the screen-off preview interface corresponding to the target scene; the engine application package is used for guiding the user to set the target parameters of the target scene and for processing the user's data in the target scene.
For example, when the user selects running as the target scene, the scene resource package corresponding to running can be called; the engine APK of the running scene can be called through the description file in that scene resource package, so that the user's running data is processed in the screen-off scene and the image is rendered for the screen-off display of the running scene.
In one example, when the user selects outdoor running as the target scene, the data related to the outdoor run is acquired and processed, and the related data is displayed on the user's electronic device in the screen-off state. For example, as shown in fig. 12, in the running scene the content displayed by the electronic device in the screen-off display may include: target (distance), current (distance), heart rate, advice, exercise, calories, average pace, average heart rate, step count, and cumulative climb.
For example, in the embodiment of the present application, one target scene may correspond to one scene resource package; the scene resource package may be an archive file without compression, with the suffix format of Honor Always on Display (HNA); the scene resource package may include a scene description file (description.xml), a scene preview picture file (preview.jpg) and the engine application package (APK) file corresponding to the scene.
Exemplarily, the scene resource package corresponding to a running scene is taken as an example for illustration; it may include a scene description (description.xml) file, a scene preview (preview.jpg) file and a running engine (running.apk) file.
For example, the scene description (description.xml) file may include the following content:
[Drawing: contents of the scene description (description.xml) file]
The file contains 10 tags in total: <title> represents the scene name; <title-cn> represents the Chinese name of the scene; <category> represents the scene classification, used for displaying the scenes by category on the screen-off scene page; <persistence> identifies whether the engine APK needs to keep running while the screen is on, where true indicates that the engine APK runs continuously (i.e., in both the screen-off and screen-on states) and false indicates that the engine APK is stopped when the screen is on; <author> represents the scene author; <designer> represents the scene designer; <version> represents the version number; <packagename> represents the engine package name; <classname> represents the engine class name; <method> represents the engine method.
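For ease of understanding, a description.xml reconstructed from the ten tags above might look as follows; the content of the file itself appears only in the drawing, so the root element name and all values here are illustrative assumptions:

<?xml version="1.0" encoding="UTF-8"?>
<description>
    <title>Running</title>
    <title-cn>跑步</title-cn>
    <category>Sport</category>
    <persistence>true</persistence>
    <author>example-author</author>
    <designer>example-designer</designer>
    <version>1.0</version>
    <packagename>com.example.running.engine</packagename>
    <classname>com.example.running.engine.RunningEngine</classname>
    <method>startEngine</method>
</description>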
For example, if the screen-off target scene selected by the user is weather, the engine APK corresponding to the weather scene does not need to keep running when the screen is on; the engine APK may be cleared and restarted at the next screen-off, in which case <persistence> may be set to false. If the screen-off target scene selected by the user is running, the engine APK corresponding to the running scene needs to keep running when the screen is on in order to calculate the mileage travelled by the user while running, in which case <persistence> may be set to true.
It should be understood that the above example describes the content included in the scene description file; the three tags of author, designer and version number may be omitted from the scene description file.
Illustratively, the scene preview image file may be in a jpg format; or other image formats may be used.
Illustratively, the running engine application package (running.apk) file may be used for data processing and display rendering; the screen-off display calls the engine for data processing and then displays the data processed by the engine.
For example, the code by which the screen-off display calls the engine may include the following:
[Drawing: code for calling the engine APK from the screen-off display]
The engine package name in the calling method may correspond to <packagename> in the scene description (description.xml) file, the engine class name corresponds to <classname> in the scene description (description.xml) file, and the engine method corresponds to <method> in the scene description (description.xml) file.
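Since the calling code itself appears only in the drawing, the following is a minimal sketch of how such a call might be implemented on Android, assuming the engine class is loaded from the installed engine APK and that its entry point is a public static method taking a Context; all class and method names are illustrative:

import android.content.Context;
import java.lang.reflect.Method;

final class EngineInvoker {
    /**
     * Calls the engine APK using the values parsed from description.xml:
     * packageName maps to <packagename>, className to <classname>,
     * and methodName to <method>.
     */
    static void callEngine(Context context, String packageName,
                           String className, String methodName) throws Exception {
        // Create a context for the engine APK's package so that its code can be loaded.
        Context engineContext = context.createPackageContext(
                packageName,
                Context.CONTEXT_INCLUDE_CODE | Context.CONTEXT_IGNORE_SECURITY);
        // Assumes <classname> holds the fully qualified class name.
        Class<?> engineClass = engineContext.getClassLoader().loadClass(className);
        // Assumes the engine entry point is a public static method taking a Context.
        Method engineMethod = engineClass.getMethod(methodName, Context.class);
        engineMethod.invoke(null, engineContext);
    }
}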
In a possible implementation manner, the electronic device may not only prefabricate screen-off scenes, but also access screen-off display scene resource packages designed by third-party designers; however, such a scene resource package must at least include the above-mentioned scene description file (description.xml), scene preview image (preview.jpg) file and the corresponding engine APK file; on this basis, the elements in the scene resource package can be extended, so that the engine APK provided in the scene resource package can normally process the data corresponding to the scene.
In other words, the scene resource packages of the screen-off scenes in the electronic device may be data configured in the electronic device at the factory; alternatively, a scene resource package of a screen-off scene may also be data acquired from an open database; however, when a scene resource package of a screen-off scene is acquired from an open database, the data needs to be unified according to the standard format of the scene resource package; that is, the scene resource package must at least include a scene description file, a scene preview image file and the engine APK file corresponding to the scene.
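As the resource package is described as an uncompressed archive, a format check along these lines could be sketched as follows; treating the .hna file as a ZIP archive, and the exact entry names, are assumptions made for illustration:

import java.io.File;
import java.io.IOException;
import java.util.zip.ZipFile;

final class AodPackageValidator {
    /** Checks that an .hna scene resource package contains the mandatory entries. */
    static boolean isValid(File hnaFile) throws IOException {
        try (ZipFile zip = new ZipFile(hnaFile)) {
            boolean hasDescription = zip.getEntry("description.xml") != null;
            boolean hasPreview = zip.getEntry("preview.jpg") != null;
            // The engine APK's name varies per scene (e.g. running.apk),
            // so accept any entry ending in ".apk".
            boolean hasEngineApk = zip.stream()
                    .anyMatch(entry -> entry.getName().endsWith(".apk"));
            return hasDescription && hasPreview && hasEngineApk;
        }
    }
}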
It should be understood that, in the embodiment of the present application, a data format in a scene resource package is specified; however, the manner of acquiring the data in the scene resource package is not limited at all.
In the embodiment of the present application, when the second engine APK starts to operate, the first engine APK needs to be stopped.
For example, while the electronic device's screen is on, the user sets the screen-off scene and selects running as the screen-off scene, so the display interface of the running scene is displayed after the electronic device's screen turns off; the user then wakes the screen again and sets the screen-off scene to swimming; at the next screen-off display, the electronic device stops the engine APK of the running scene and starts the engine APK of the swimming scene; in addition, the running program updates the engine package name of the engine APK bound in the scene resource package from running to swimming.
It should also be understood that the engine APK may not run while the electronic device's screen is on, and the engine APK is started after each screen-off; when the engine APK runs, the program to be run this time is compared with the program that was run last time; if the program has changed, the engine APK of the previous program is stopped and the new program is started.
For example, the currently running engine APK needs to be stopped before the screen-off display calls an engine APK each time; the stopping method may be as follows:
// Obtain the system's ActivityManager and kill the background process of the
// currently bound engine APK (requires the KILL_BACKGROUND_PROCESSES permission).
ActivityManager manager = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
manager.killBackgroundProcesses(currentEngineName);
for example, fig. 5 is a schematic diagram of a method for invoking an engine application package according to an embodiment of the present application. The method 400 shown in fig. 5 may include steps S410 to S460, which are described in detail below.
Step S410, stopping the first engine APK (an example of the second APK).
Illustratively, the target scene selected by the user at a historical moment is running, and at the current moment the user selects weather as the target scene according to the user's needs; therefore, the first engine application package corresponding to running needs to be stopped.
Step S420, acquiring the engine package name of the second engine APK (an example of the first APK).
Step S430, acquiring the engine class name of the second engine APK.
Step S440, acquiring the engine method of the second engine APK.
It should be further understood that, when the target scene is switched from a first scene to a second scene, the first engine APK corresponding to the first scene needs to be stopped and the second engine APK corresponding to the second scene needs to be started; acquiring the engine package name, engine class name and engine method of the second engine APK ensures that the second engine APK can be run.
Step S450, starting the engine processor.
For example, after the engine package name, the engine class name and the engine method of the second engine APK are acquired, the second engine APK may be started by the engine processor.
Step S460, updating the bound engine APK package name.
For example, after the first engine APK is stopped and the second engine APK is started, the running program may update the engine package name of the engine APK bound in the scene resource package from the name of the first engine APK to the name of the second engine APK.
For example, if the target scene at the historical moment is running and the target scene at the current moment is swimming, the engine file bound in the scene resource package can be updated from the running application package (running.apk) to the swimming application package (swimming.apk).
In one example, the user selects a first scene as the target scene while the electronic device's screen is on; when the screen turns off, the electronic device calls the scene resource package corresponding to the first scene, processes the user's data in the first scene and performs the screen-off display; when the user wakes the screen of the electronic device, whether the engine APK of the first scene stops running is determined according to the value of <persistence> in description.xml; thus, when the screen turns off again in the first scene, the electronic device can continue to process the data of the first scene and perform the screen-off display again.
For example, while the electronic device's screen is on, the user selects running as the target scene; when the screen turns off for the first time, the user has run 3 kilometres, and the screen-off display at that time includes the current distance of 3 kilometres; the user may wake the screen while running and keep running; having run 4 kilometres, when the electronic device's screen turns off for the second time, the display interface includes the current distance of 4 kilometres. In this example, the engine APK needs to remain running in the screen-on state in order to know the mileage run during that state, and therefore <persistence> in description.xml needs to be set to true.
It should be understood that the engine APK normally runs when the electronic device's screen is off; however, for some scenes that require persistence, the engine APK may be configured to keep running while the screen is on and to cache the corresponding data; for example, when the user is running and the screen of the electronic device turns off, the screen-off display interface corresponding to the running scene is displayed; if the screen is lit only briefly while the user keeps running, the engine APK can continue to process the user's running data when the screen turns off again, rather than restarting the calculation.
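A minimal sketch of the screen-on decision described above, assuming the <persistence> value has already been parsed from description.xml (class and parameter names are illustrative):

import android.app.ActivityManager;
import android.content.Context;

final class EngineLifecycle {
    /**
     * Called when the screen turns on. Engines whose scene sets
     * <persistence>true</persistence> (e.g. running) keep running so that data
     * such as mileage is not lost; others (e.g. weather) are stopped here and
     * restarted at the next screen-off.
     */
    static void onScreenOn(Context context, String enginePackageName, boolean persistence) {
        if (!persistence) {
            ActivityManager am =
                    (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
            // Requires the KILL_BACKGROUND_PROCESSES permission.
            am.killBackgroundProcesses(enginePackageName);
        }
    }
}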
According to the screen-off display method provided by the embodiment of the present application, the user can select the target scene of the screen-off display as required, and the screen-off display is performed according to the selected target scene when the electronic device's screen turns off, so that the user can obtain the required information without lighting up and unlocking the electronic device, which can improve the user experience.
The screen-off display method provided by the embodiment of the present application is described in detail below with reference to fig. 6, taking running as the target scene as an example.
Fig. 6 is a schematic diagram of a screen-off display method in a running scene according to an embodiment of the present application. The method 500 shown in fig. 6 may include steps S501 to S511, which are described in detail below.
Step S501, the user selects a running scene as a target scene (one example of a target theme).
For example, while the electronic device's screen is on, the user may select running as the screen-off scene on the settings interface of the electronic device.
For example, as shown in fig. 7, fig. 7 shows the settings interface of the electronic device, where a plurality of setting options can be displayed, such as wireless local area network, screen-off display, bluetooth and battery.
It should be understood that the display content of the setting interface may also include other more setting options, which is not limited in this application.
As shown in fig. 7, the user may click the screen-off display setting option, and in response to the user's click operation, the electronic device starts the screen-off display setting option; after starting it, the electronic device may display the interface shown in fig. 8, in which the scene classifications of the screen-off scenes can be displayed; for example, the scene classifications may include a sport classification, a travel classification and other classifications. The user may click the sport option in the interface of fig. 8; in response to the user's click operation, the interface shown in fig. 9 may be displayed; the sport classification may include running, mountain climbing and other scenes; the user may click the running option in the interface of fig. 9 to set running as the target scene.
Step S502, matching the engine APK (one example of a target APK) of the running scene.
For example, according to the target scene of the screen-off display selected by the user, the screen-off display controller may automatically match the required engine.
For example, when the user selects running as the target scene, the screen-off display controller may call a scene resource packet corresponding to the running scene; the scene resource package can comprise a scene description file, a scene preview image file and an engine application package file corresponding to a target scene; the scene description file is used for describing the name and classification of the running scene and calling an engine application program package file corresponding to the running scene; the scene preview image file is used for displaying a screen-off preview interface corresponding to the running scene; the engine application package is used for guiding the user to set target parameters in a running scene and processing data of the user in the running scene. The engine APK corresponding to the running scene can be matched by calling the scene resource file of the running scene.
Step S503, the user sets the target distance.
For example, as shown in fig. 11, the user may set a target distance in the interface of the running scene's screen-off display; for example, the user may input 10 km, setting the target distance of the current run to 10 km.
Step S504, applying the preview.
For example, as shown in fig. 11, after setting the target distance the user may click the 'apply' option; in response to the user's click operation, the electronic device may generate a preview display interface, as shown in fig. 12.
Step S505, heart rate acquisition.
For example, the electronic device may acquire the user's heart rate data in the running scene through a connection with the wearable device.
It should be understood that the above step S505 is an optional step; heart rate acquisition may be skipped when the electronic device is not connected to a wearable device.
Step S506, pacing guidance.
In one example, the user may be prompted to speed up or slow down by comparing the user's current heart rate with a medically recommended running heart rate.
In one example, pacing instructions may not be given if heart rate data for the user cannot be obtained.
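The exact comparison against a medical running heart rate is not specified here; as one hedged illustration, a common rule of thumb (a maximum heart rate of 220 minus age, with an aerobic zone of roughly 64% to 76% of that maximum) could drive the prompt as sketched below; the thresholds are assumptions, not values from this document:

/** Illustrative pacing hint based on a common aerobic heart-rate zone. */
final class PacingGuide {
    /**
     * Returns a prompt for the screen-off display, or null when no heart rate
     * is available (in that case no pacing instruction is given).
     */
    static String pacingHint(Integer currentHeartRate, int ageYears) {
        if (currentHeartRate == null) return null;
        double maxHeartRate = 220.0 - ageYears; // common rule of thumb
        if (currentHeartRate < 0.64 * maxHeartRate) return "Speed up";
        if (currentHeartRate > 0.76 * maxHeartRate) return "Slow down";
        return "Keep pace";
    }
}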
Step S507, position acquisition.
For example, position acquisition may be performed by a positioning element in the electronic device.
Step S508, mileage calculation.
In one example, the running engine may acquire the current position through the system interface and perform an integration calculation to obtain the distance:
ds = sqrt[(X_i - X_{i-1})^2 + (Y_i - Y_{i-1})^2 + (Z_i - Z_{i-1})^2];
s = s + ds;
where sqrt denotes the square-root function; (X_i, Y_i, Z_i) denotes the coordinates at time i; ds denotes the distance travelled from time i-1 to time i, the time difference between the two moments being less than 1 second; and s denotes the cumulative distance of the current movement.
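As a sketch of the incremental integration above, Android's Location.distanceTo can play the role of ds; using the geodesic distance instead of the Cartesian formula is an implementation choice made here purely for illustration:

import android.location.Location;

/** Accumulates the run mileage by integrating successive location fixes. */
final class MileageCounter {
    private Location last;
    private float meters; // s, the cumulative distance

    /** Feeds one location fix (expected at sub-second intervals); returns s in metres. */
    float onLocation(Location current) {
        if (last != null) {
            // distanceTo returns the distance in metres, playing the role of ds.
            meters += last.distanceTo(current); // s = s + ds
        }
        last = current;
        return meters;
    }
}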
Step S509, target achievement calculation.
Illustratively, the target achievement may include a target remaining amount or a percentage of completion.
Step S510, energy consumption calculation.
For example, the energy consumption may be determined by calculation using a calorie calculation formula.
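The calorie calculation formula is not specified in this document; one common approximation, used here purely as an illustration, is kcal = MET x weight (kg) x duration (h), since 1 MET is roughly 1 kcal per kilogram per hour:

/** Illustrative energy estimate; the MET-based formula is an assumption. */
final class CalorieEstimator {
    /** kcal = MET x weight (kg) x duration (h). */
    static double kilocalories(double met, double weightKg, double durationHours) {
        return met * weightKg * durationHours;
    }
}

For example, running for one hour at about 8 METs with a body weight of 60 kg gives an estimate of 8 x 60 x 1 = 480 kcal.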
Step S511, graphics rendering.
After the data processing is completed, the engine application package may perform graphics rendering according to the processed data and perform the screen-off display.
In a possible implementation manner, the target scene selected by the user may be a weather scene, and the engine APK may access a weather APK to query the weather state in real time and perform the screen-off display; for example, a snowing animation can be played in the screen-off display when it is snowing, and a raining animation can be played when it is raining; temperature information, pollution indexes, travel suggestions and the like can also be shown in the screen-off display.
In a possible implementation manner, the target scene selected by the user may be a navigation scene, and the navigation engine APK may perform screen-off display of the real-time navigation data; for example, 'turn right after 100 meters' as shown in fig. 13.
Based on the screen-off display method provided by the embodiment of the present application, the user can select the target scene to be displayed during screen-off as required, and the screen-off display is performed according to the selected target scene when the electronic device's screen is off, so that the user can obtain the required information without unlocking the screen of the electronic device, which can improve the user experience.
The method for displaying the off-screen provided by the embodiment of the present application is described in detail above with reference to fig. 1 to 13; an embodiment of the apparatus of the present application will be described in detail below with reference to fig. 14 and 15. It should be understood that the apparatus in the embodiment of the present application may perform the various methods in the embodiment of the present application, that is, the following specific working processes of various products, and reference may be made to the corresponding processes in the embodiment of the foregoing methods.
Fig. 14 is a schematic structural diagram of a screen-off display device provided by the present application. The device 600 has a display screen, and the device 600 comprises an acquisition unit 610 and a processing unit 620.
The obtaining unit 610 is configured to obtain a first instruction of a user, where the first instruction is used to indicate a target AOD resource package for the screen-off display, the target AOD resource package includes a target application package (APK), and the target APK is used to process the user's data; the processing unit 620 is configured to call the target APK according to the target AOD resource package, and, when the electronic device's screen is off, acquire the data in the target APK and display it on the display screen in the screen-off state.
Optionally, as an embodiment, the target AOD resource package includes a scene description file, where the scene description file includes a calling method of the target APK, and the processing unit 620 is specifically configured to parse the scene description file to obtain the calling method of the target APK, and to call the target APK according to the calling method.
Optionally, as an embodiment, the target APK is further used to acquire the user's data through the user's wearable device.
Optionally, as an embodiment, the target APK is further configured to instruct the user to configure a target parameter of the screen-out display of the electronic device.
Optionally, as an embodiment, the target APK is a first APK, and the processing unit 620 is further configured to:
stopping invoking a second APK, the second APK being different from the first APK.
Optionally, as an embodiment, the target AOD resource package further includes a preview image, and the preview image is used to display a screen-off preview interface of the electronic device.
Optionally, as an embodiment, the processing unit 620 is further configured to:
and displaying a theme list, wherein the theme list is used for the user to determine a target theme displayed by the electronic equipment in a screen-off mode, and the target theme corresponds to the target AOD resource package.
Optionally, as an embodiment, the topic list includes a topic classification, and the topic classification is used to indicate a category corresponding to a topic in the topic list.
It should be noted that the apparatus 600 is embodied in the form of a functional unit. The term "unit" herein may be implemented in software and/or hardware, and is not particularly limited thereto.
For example, a "unit" may be a software program, a hardware circuit, or a combination of both that implement the above-described functions. The hardware circuitry may include an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (e.g., a shared processor, a dedicated processor, or a group of processors) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality.
Accordingly, the units of the respective examples described in the embodiments of the present application can be realized in electronic hardware, or a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Fig. 15 shows a schematic structural diagram of an electronic device provided in the present application. The dashed lines in fig. 15 indicate that the unit or the module is optional. The electronic device 700 may be used to implement the methods described in the method embodiments above.
The electronic device 700 includes one or more processors 701, and the one or more processors 701 may support the electronic device 700 in implementing the methods in the method embodiments. The processor 701 may be a general-purpose processor or a special-purpose processor. For example, the processor 701 may be a central processing unit (CPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or another programmable logic device, such as a discrete gate, a transistor logic device, or a discrete hardware component.
The processor 701 may be used to control the electronic device 700, execute software programs, and process data of the software programs. The electronic device 700 may further include a communication unit 705 to enable input (reception) and output (transmission) of signals.
For example, the electronic device 700 may be a terminal device and the communication unit 705 may be a transceiver of the terminal device, or the communication unit 705 may be a transceiver circuit of the terminal device.
The electronic device 700 may comprise one or more memories 702, on which programs 704 are stored, and the programs 704 may be executed by the processor 701, and generate instructions 703, so that the processor 701 may execute the method described in the above method embodiment according to the instructions 703.
Optionally, data may also be stored in the memory 702. Alternatively, the processor 701 may also read data stored in the memory 702, the data may be stored at the same memory address as the program 704, or the data may be stored at a different memory address from the program 704.
Alternatively, the processor 701 and the memory 702 may be provided separately or integrated together; for example, on a System On Chip (SOC) of the terminal device.
For example, the memory 702 may be configured to store the program 704 related to the screen-off display method provided in the embodiment of the present application, and the processor 701 may be configured to call the program 704 related to the screen-off display method stored in the memory 702 when the terminal device performs the screen-off display, so as to execute the screen-off display method of the embodiment of the present application; for example, acquiring a first instruction of a user, where the first instruction is used to indicate a target AOD resource package for the screen-off display, the target AOD resource package includes a target application package (APK), and the target APK is used to process the user's data; calling the target APK according to the target AOD resource package; and when the electronic device's screen is off, acquiring the data in the target APK and displaying it on the display screen in the screen-off state.
The application also provides a computer program product which, when executed by the processor 701, implements the method according to any of the method embodiments of the application.
The computer program product may be stored in the memory 702, for example, as the program 704, and the program 704 is finally converted into an executable object file capable of being executed by the processor 701 through preprocessing, compiling, assembling, linking and the like.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a computer, implements the method of any of the method embodiments of the present application. The computer program may be a high-level language program or an executable object program.
Optionally, the computer-readable storage medium is, for example, the memory 702. The memory 702 may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of example but not limitation, many forms of RAM are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes and the generated technical effects of the above-described apparatuses and devices may refer to the corresponding processes and technical effects in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the disclosed system, apparatus and method can be implemented in other ways. For example, some features of the method embodiments described above may be omitted, or not performed. The above-described embodiments of the apparatus are merely exemplary, the division of the unit is only one logical function division, and there may be other division ways in actual implementation, and a plurality of units or components may be combined or integrated into another system. In addition, the coupling between the units or the coupling between the components may be direct coupling or indirect coupling, and the coupling includes electrical, mechanical or other connections.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the processes do not mean the execution sequence, and the execution sequence of the processes should be determined by the functions and the inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Additionally, the terms "system" and "network" are often used interchangeably herein. The term "and/or" herein is merely an association relationship describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
In short, the above description is only a preferred embodiment of the present disclosure, and is not intended to limit the scope of the present disclosure. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A screen-off display method, applied to an electronic device having a display screen, the method comprising:
acquiring a first instruction of a user, wherein the first instruction is used for indicating a target AOD resource package for the screen-off display, the target AOD resource package comprises a target application package (APK), and the target APK is used for processing data of the user;
calling the target APK according to the target AOD resource package;
and when the screen of the electronic device is off, acquiring the data in the target APK and displaying the data on the display screen in the screen-off state.
2. The method of claim 1, wherein the target AOD resource package includes a scene description file, the scene description file includes a calling method of the target APK, and the calling the target APK according to the target AOD resource package includes:
parsing the scene description file to obtain the calling method of the target APK;
and calling the target APK according to the calling method.
3. The method of claim 1 or 2, wherein the target APK is further used to obtain data of the user through a wearable device of the user.
4. The method of any of claims 1-3, wherein the target APK is further used to instruct the user to configure target parameters for the screen-off display of the electronic device.
5. The method of any of claims 1 to 4, wherein the target APK is a first APK, and further comprising, prior to invoking the target APK according to the target AOD resource package:
stopping invoking a second APK, the second APK being different from the first APK.
6. The method of any of claims 1 to 5, wherein the target AOD resource package further comprises a preview image, and the preview image is used for displaying a preview interface of the electronic device for screen-off.
7. The method of any of claims 1 to 6, further comprising:
and displaying a theme list, wherein the theme list is used for the user to determine a target theme displayed by the electronic equipment in a screen-off mode, and the target theme corresponds to the target AOD resource package.
8. The method of claim 7, wherein the topic list includes a topic category, and the topic category is used to indicate a category corresponding to a topic in the topic list.
9. A screen-off display apparatus, the apparatus comprising a processor and a memory, the memory being configured to store a computer program, and the processor being configured to call and run the computer program from the memory, so that the apparatus performs the method of any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to carry out the method of any one of claims 1 to 8.
CN202110827202.3A 2021-07-21 2021-07-21 Method and device for off-screen display Active CN114115772B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110827202.3A CN114115772B (en) 2021-07-21 2021-07-21 Method and device for off-screen display

Publications (2)

Publication Number Publication Date
CN114115772A true CN114115772A (en) 2022-03-01
CN114115772B CN114115772B (en) 2023-08-11

Family

ID=80359496

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110827202.3A Active CN114115772B (en) 2021-07-21 2021-07-21 Method and device for off-screen display

Country Status (1)

Country Link
CN (1) CN114115772B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200326845A1 (en) * 2016-04-05 2020-10-15 Samsung Electronics Co., Ltd. Electronic device for displaying picture and control method therefor
CN110087292A (en) * 2019-04-28 2019-08-02 努比亚技术有限公司 Intelligent wearable device, energy-saving control method and computer readable storage medium
CN110221898A (en) * 2019-06-19 2019-09-10 北京小米移动软件有限公司 Display methods, device, equipment and the storage medium of breath screen picture
WO2021000804A1 (en) * 2019-06-29 2021-01-07 华为技术有限公司 Display method and apparatus in locked state
CN110489199A (en) * 2019-08-23 2019-11-22 深圳传音控股股份有限公司 Breath screen display method, apparatus, terminal and storage medium
CN113138816A (en) * 2020-01-19 2021-07-20 华为技术有限公司 Message screen display theme display method and mobile device
CN111580908A (en) * 2020-04-30 2020-08-25 江苏紫米电子技术有限公司 Display method, device, equipment and storage medium
CN112181560A (en) * 2020-09-24 2021-01-05 Oppo(重庆)智能科技有限公司 Navigation interface display method and device, electronic equipment and readable storage medium
CN112764624A (en) * 2021-01-26 2021-05-07 维沃移动通信有限公司 Information screen display method and device
CN112783392A (en) * 2021-01-29 2021-05-11 展讯通信(上海)有限公司 Information screen display method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yang Jie: "Visual Interaction Design", 31 October 2018, Jiangsu Phoenix Fine Arts Publishing House, pages 26-27 *
Jiangsu Provincial Department of Education: "Selected Cases of the Jiangsu University Advantageous Discipline Construction Project", 31 May 2019, Zhenjiang: Jiangsu University Press, pages 160-161 *

Also Published As

Publication number Publication date
CN114115772B (en) 2023-08-11

Similar Documents

Publication Publication Date Title
WO2021136050A1 (en) Image photographing method and related apparatus
WO2022127787A1 (en) Image display method and electronic device
WO2021258814A1 (en) Video synthesis method and apparatus, electronic device, and storage medium
WO2021218429A1 (en) Method for managing application window, and terminal device and computer-readable storage medium
CN113935898A (en) Image processing method, system, electronic device and computer readable storage medium
CN113254409A (en) File sharing method, system and related equipment
WO2022001258A1 (en) Multi-screen display method and apparatus, terminal device, and storage medium
WO2021238740A1 (en) Screen capture method and electronic device
CN115333941A (en) Method for acquiring application running condition and related equipment
WO2023207667A1 (en) Display method, vehicle, and electronic device
CN114444000A (en) Page layout file generation method and device, electronic equipment and readable storage medium
CN116048831B (en) Target signal processing method and electronic equipment
WO2023000746A1 (en) Augmented reality video processing method and electronic device
CN113380240B (en) Voice interaction method and electronic equipment
CN117348894A (en) Software upgrading method, terminal equipment and system
CN114115772B (en) Method and device for off-screen display
CN113822643A (en) Method and device for setting travel reminding
CN114003241A (en) Interface adaptation display method and system of application program, electronic device and medium
CN114173381A (en) Data transmission method and electronic equipment
CN113495733A (en) Theme pack installation method and device, electronic equipment and computer readable storage medium
CN113867851A (en) Electronic equipment operation guide information recording method, electronic equipment operation guide information acquisition method and terminal equipment
CN114020186B (en) Health data display method and device
CN116709609B (en) Message delivery method, electronic device and storage medium
CN116048629B (en) System service switching method, control device, electronic equipment and storage medium
WO2023116669A1 (en) Video generation system and method, and related apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant