CN116521283A - Interface display method, device, equipment and storage medium


Info

Publication number
CN116521283A
CN116521283A (application CN202210070014.5A)
Authority
CN
China
Prior art keywords
interface
target
picture set
target video
target picture
Prior art date
Legal status
Pending
Application number
CN202210070014.5A
Other languages
Chinese (zh)
Inventor
赵翔
曹晓亮
王多新
李建余
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210070014.5A
Publication of CN116521283A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234 Power saving characterised by the action undertaken
    • G06F1/325 Power saving in peripheral device
    • G06F1/3265 Power saving in display device
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the application disclose an interface display method, device, equipment and storage medium, belonging to the field of electronic devices. The method is used for an electronic device that supports running a first system and a second system, where the operating power consumption of the first system is lower than that of the second system, and the method comprises the following steps: the second system controls a screen to display a second system interface, and the second system interface takes a target video as an interface background; in response to a system switching instruction, the first system acquires screen control authority; the first system controls the screen to display a first system interface, the first system interface takes a continuously played target picture set as an interface background, and the target picture set is generated based on the target video. With the solution provided by the embodiments of the application, system interfaces are displayed dynamically by having systems with different power consumption control the screen, so that the power consumption of the electronic device is reduced.

Description

Interface display method, device, equipment and storage medium
Technical Field
The embodiment of the application relates to the field of electronic equipment, in particular to an interface display method, an interface display device, interface display equipment and a storage medium.
Background
With the continuous development of technology, more and more electronic devices with different functions have emerged, bringing great convenience to users' daily lives.
Users can customize the functions of an electronic device to increase its playability. Taking a smart watch as an example, a user can set the watch face background according to personal preference, but this increases the power consumption of the smart watch and therefore shortens its battery life.
Disclosure of Invention
The embodiments of the application provide an interface display method, device, equipment and storage medium. The technical solution is as follows:
In one aspect, an embodiment of the present application provides an interface display method, where the method is used for an electronic device, the electronic device supports running a first system and a second system, and the operating power consumption of the first system is lower than the operating power consumption of the second system;
the method comprises the following steps:
the second system controls a screen to display a second system interface, and the second system interface takes a target video as an interface background;
in response to a system switching instruction, the first system acquires screen control authority;
the first system controls the screen to display a first system interface, the first system interface takes a continuously played target picture set as an interface background, and the target picture set is generated based on the target video.
In another aspect, an embodiment of the present application provides an interface display device, where the device is used for an electronic device, the electronic device supports running a first system and a second system, and the operating power consumption of the first system is lower than that of the second system;
the device comprises:
a second system module, configured to control a screen to display a second system interface, where the second system interface takes a target video as an interface background;
a first system module, configured to acquire screen control authority in response to a system switching instruction;
the first system module is further configured to control the screen to display a first system interface, where the first system interface takes a continuously played target picture set as an interface background, and the target picture set is generated based on the target video.
In another aspect, an embodiment of the present application provides an electronic device including a processor and a memory; the memory stores at least one instruction for execution by the processor to implement the interface display method of the above aspect.
In another aspect, embodiments of the present application provide a computer-readable storage medium storing at least one instruction for execution by a processor to implement an interface display method as described in the above aspects.
In another aspect, embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the electronic device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the electronic device performs the interface display method provided in the above aspect.
In the embodiments of the application, for an electronic device supporting the two systems, the second system, which has higher power consumption, controls the screen to display a second system interface that takes the target video as its interface background. When the second system is switched to the first system, the first system, which has lower power consumption, controls the screen to display a first system interface that takes a continuously played target picture set as its interface background, and the target picture set is generated from the target video. This keeps the backgrounds of the first system interface and the second system interface consistent, while the system interfaces are displayed dynamically by systems with different power consumption controlling the screen, thereby reducing the power consumption of the electronic device.
Drawings
FIG. 1 is a schematic diagram of a second processor corresponding dual-core communication software framework, as shown in an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram of a first processor corresponding dual-core communication software framework, as shown in an exemplary embodiment of the present application;
FIG. 3 illustrates a flowchart of an interface display method provided by an exemplary embodiment of the present application;
FIG. 4 illustrates a flowchart of an interface display method provided by another exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of a process for interaction between a mobile phone and a second system according to an exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of a second system and first system interaction process provided in an exemplary embodiment of the present application;
FIG. 7 illustrates a flowchart of an interface display method provided by another exemplary embodiment of the present application;
FIG. 8 illustrates a process diagram of interface context switching provided by an exemplary embodiment of the present application;
FIG. 9 is a schematic diagram illustrating the process of interface background deletion provided by an exemplary embodiment of the present application;
FIG. 10 is a schematic diagram of a mobile phone, a second system and a first system interaction process according to an exemplary embodiment of the present application;
FIG. 11 illustrates a block diagram of an interface display device provided in an exemplary embodiment of the present application;
fig. 12 is a block diagram illustrating a structure of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
References herein to "a plurality" mean two or more. "And/or" describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates an "or" relationship between the associated objects.
In the related art, a single processor is provided in an electronic device, and all system events generated while the device is running are processed by the operating system running on that processor. The processor therefore needs relatively strong data processing capability and must remain in a working state while the device is running. However, in daily use the electronic device mostly only needs to implement functions with low processing-performance requirements. For example, for a smart watch or smart bracelet, the playability of the device can be increased by setting a video as the interface background, so that the interface dynamically plays the video while the watch or bracelet displays the time. As a result, the processor needs to remain in the working state for a long time, which increases the power consumption of the device and shortens the endurance time of the electronic device (especially on wearable devices with smaller battery capacity).
In order to reduce power consumption of the electronic device while ensuring performance of the electronic device, in one possible implementation, the electronic device is provided with at least a first processor and a second processor with different processing performances and power consumption, which are respectively used for running a first system and a second system (i.e. a dual-core dual-system), and a set of system switching mechanism is designed for the dual-core dual-system.
In the running process of the electronic equipment, when an event with high-performance processing requirement exists (such as when an application program is started), the event is processed by the second system by waking up the high-power-consumption processor, so that the triggered event can be timely responded and processed, the performance requirement of the electronic equipment is met, and meanwhile, the second system controls a screen to display a second system interface taking a video as an interface background. When an event with low-performance processing requirement exists, in order to reduce the power consumption of the electronic equipment, a first system running on a low-power processor processes the event, and meanwhile, the first system acquires screen control authority, a first system interface taking a picture set as an interface background is displayed, and the picture set is obtained by processing videos through a second system. Therefore, the processing resource used when the picture set is displayed is lower than the processing resource used when the video is displayed, so that the power consumption of the electronic equipment can be reduced while the consistency of the dynamic interface background is ensured.
In the embodiment of the application, the first processor and the second processor operate asynchronously, so the first system and the second system need to implement inter-system communication (also referred to as dual-core communication). In one possible application scenario, the first system is a real-time operating system (Real Time Operating System, RTOS) running on a micro control unit (Micro Controller Unit, MCU), and the second system is an Android operating system running on a central processing unit (Central Processing Unit, CPU).
As shown in FIG. 1, a dual core communication software framework of the android operating system is shown in an exemplary embodiment of the present application. The dual-core communication software Framework follows the design principle of low coupling, high reliability and high multiplexing, and comprises the module development of Kernel, HIDL (hardware abstraction layer interface description language), native Service, framework Service (Framework Service), framework API (Framework interface) and APP (application) parts.
The APP module comprises function modules such as a desktop launcher, a Settings module, and the system UI (user interface); the Framework API module comprises management modules such as MCUManager (MCU management), SensorManager (sensor management), and LocationManager; the Framework Service module comprises service modules such as MCUManagerService (MCU management service), SystemSensorManager, and LocationManagerService; the Native Service module comprises service modules such as DCCService and Sensor Service; and the HIDL module comprises modules such as the Sensor hardware abstraction layer and the GPS (global positioning system) hardware abstraction layer. The Kernel module includes DCC Transfer Drivers (DCC transport drivers) such as dcc_data, mcu_sensor, and mcu_gps.
The transmission layer is used as an interface layer for connecting an upper layer and a lower layer in the dual-core communication software framework, shields transmission details of communication of the lower layer (data link layer) of the system for the application layer, and provides a service channel for the application scene; the application layer is used as a main body of service provision, responds to man-machine interaction, transmits data generated in the man-machine interaction process through the transmission layer, and responds to an external data request.
The RTOS is designed following the peer-to-peer principle. Taking a smart watch as an example of the electronic device, FIG. 2 shows the dual-core communication software framework of the RTOS according to an exemplary embodiment of the present application.
The dual-core communication software Framework of the RTOS is divided into an application Layer (Application Layer), a Service Layer (Service Layer), a Framework Layer (Framework Layer), a hardware abstraction Layer (Hardware abstraction Layer) and a Platform Layer (Platform Layer).
The application layer comprises application modules such as a watch face (dial), a day Tracker (Daily tracking), a Message center (Message center), a Voice around Apps (voice application), health Apps (Health application), settings and the like; the Service layer comprises a Sport health task, a System manager task (system management task), an AMS (activity management Service), an Audio Service, a Log Service, an OFTP Service, a BT Service, a Delegate Service, an RPC Service, a sensor Service, a storage Service and other Service modules; the Framework layer comprises Framework modules such as Message Pub (Message center), UI Framework (user interface Framework), G2D Engine (G2D Engine), audio Middleware (Audio Middleware), reference (Preference), file system (File system), algorithm (Algorithm), asycevent (in-process asynchronous event) and the like; the hardware abstraction layer comprises hardware abstraction modules such as Screen/TP (Screen/touch Screen), sensors and the like; the platform layer includes board level support packages (Board Support Package, BSP) including Screen/TP, codec, sensors, flash (flash memory), PSRAM (pseudo static random access memory), etc., and LOW level drivers including Uart (universal asynchronous receiver transmitter), ADC (analog to digital converter), GPIO (general purpose input output), SPI (serial peripheral interface), I2C (integrated circuit bus), IOS (input output system), PCM (pulse code modulation), I2S (integrated audio bus), HWTimer (hardware timer).
It should be noted that, the dual-core communication software framework is only used for schematic illustration, and those skilled in the art may further add, delete or modify the framework according to actual needs, and the embodiment of the present application does not limit the specific structure of the dual-core communication software framework.
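To make the role of the transport layer described above more concrete, the following is a minimal C sketch of how a message might be carried over the dual-core communication channel. The message structure, channel and opcode values, and the dcc_send/dcc_link_write functions are illustrative assumptions for this sketch, not the patent's or any framework's actual API.

```c
#include <stdint.h>
#include <string.h>
#include <stdio.h>

/* Hypothetical dual-core communication message; all names are assumptions. */
typedef struct {
    uint16_t channel;       /* logical service channel, e.g. picture-set transfer */
    uint16_t opcode;        /* e.g. picture set push / switch / delete */
    uint32_t length;        /* payload length in bytes */
    uint8_t  payload[256];
} dcc_message_t;

/* Stand-in for the data link layer write that the transport layer shields. */
static int dcc_link_write(const void *buf, uint32_t len)
{
    (void)buf;
    printf("dcc: wrote %u bytes to the peer core\n", (unsigned)len);
    return 0;
}

/* Send one message from one system to the other over the service channel. */
static int dcc_send(uint16_t channel, uint16_t opcode,
                    const void *payload, uint32_t length)
{
    dcc_message_t msg = {0};
    if (length > sizeof(msg.payload))
        return -1;                          /* payload too large for this sketch */
    msg.channel = channel;
    msg.opcode  = opcode;
    msg.length  = length;
    memcpy(msg.payload, payload, length);
    return dcc_link_write(&msg, (uint32_t)(sizeof(msg) - sizeof(msg.payload) + length));
}

int main(void)
{
    const char note[] = "picture set 1";
    return dcc_send(1 /* channel */, 2 /* opcode */, note, sizeof(note));
}
```

The application layer would call something like dcc_send in response to user interaction, while the data link details stay hidden behind the transport layer, matching the layering described above.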
Referring to FIG. 3, a flowchart of an interface display method according to an exemplary embodiment of the present application is shown. The method is applied to an electronic device that supports running a first system and a second system, and may include the following steps:
In step 310, the second system controls the screen to display a second system interface, where the second system interface uses the target video as an interface background.
Wherein the operating power consumption of the first system is lower than the operating power consumption of the second system.
In one possible implementation, an electronic device is provided with a first processor and a second processor, wherein the processing performance of the first processor is lower than the processing performance of the second processor (both the processing power and the processing speed of the first processor are lower than the second processor), and the power consumption of the first processor is lower than the power consumption of the second processor. Accordingly, the second system (executed by the second processor) is capable of processing events processed by the first system (executed by the first processor), and the first system is not necessarily capable of processing events processed by the second system.
In another possible implementation, the electronic device may also be provided with a single processor, where the first system and the second system run on different cores of the processor, and the processing performance of the core running the second system is higher than the processing performance of the core running the first system.
For example, taking an electronic device as an intelligent watch as an example, the first processor is an MCU, the second processor is a CPU, the first system is an RTOS, and the second system is an android system. Correspondingly, the events which can be processed by the first system comprise scenes or weak interaction scenes with lower requirements on processing performance, such as dial display, dial interface switching, notification message display and the like; the events which can be processed by the second system comprise scenes with high requirements on processing performance, such as incoming call answering, starting application, dial editing, function setting and the like, or strong interaction scenes.
In one possible implementation, the operating modes of the electronic device include a performance mode, a hybrid mode, and a low power consumption mode, where in the performance mode, both the second processor and the first processor remain awake (respectively, both the first system and the second system are awake); in the low power mode, only the first processor remains awake and the second processor remains off (i.e., the first system is awake and the second system is off); in the hybrid mode, the second processor is in a standby state and is switchable between a sleep and an awake state when events are handled by the first system (i.e., the second system may be in either the awake state or the sleep state when the first system is in the awake state).
Optionally, in the wake-up state, system-related data is cached in memory (RAM) so that it can be run at any time; in the sleep state, most hardware modules of the processor are shut down, and system-related data is stored in non-volatile storage (ROM) and written back into memory when the system switches to the wake-up state.
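The operating modes and processor states described above can be summarized in a small state table. The following C sketch is illustrative only; the enum names and the policy function are assumptions based on the text, not an API defined by the patent.

```c
#include <stdbool.h>

/* Illustrative operating modes and system states; names are assumptions. */
typedef enum { MODE_PERFORMANCE, MODE_HYBRID, MODE_LOW_POWER } work_mode_t;
typedef enum { SYS_AWAKE, SYS_SLEEP, SYS_OFF } sys_state_t;

typedef struct {
    sys_state_t first_system;   /* low-power RTOS on the MCU */
    sys_state_t second_system;  /* Android system on the CPU */
} system_states_t;

/* Map a working mode to the states described in the embodiment:
 * performance -> both awake; low power -> second system off;
 * hybrid -> second system may sleep while the first system handles events. */
static system_states_t states_for_mode(work_mode_t mode, bool second_system_needed)
{
    system_states_t s = { SYS_AWAKE, SYS_AWAKE };
    switch (mode) {
    case MODE_PERFORMANCE:
        break;                                  /* both systems remain awake */
    case MODE_LOW_POWER:
        s.second_system = SYS_OFF;              /* only the first system runs */
        break;
    case MODE_HYBRID:
        s.second_system = second_system_needed ? SYS_AWAKE : SYS_SLEEP;
        break;
    }
    return s;
}
```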
Unlike smart phones and other electronic devices with strong interaction properties, a wearable device serves as an auxiliary electronic device, and in most usage scenarios there is only weak interaction between the wearable device and the user. For example, in most scenarios the user merely raises the wrist to glance at the smart watch. Therefore, when the wearable device processes an event through the first system, the second processor is controlled to be in the sleep state (the second system is in the sleep state), which reduces the overall power consumption of the wearable device. For convenience of description, the following embodiments take a wearable device as an example.
Optionally, when the second system is in the wake-up state, the second system acquires the screen control authority, and controls the screen to display a second system interface, wherein the system interface of the second system takes the target video as an interface background. In one possible application scenario, the first system is in a sleep state or a background running state when the second system is in an awake state.
Optionally, the second system interface is a dial interface.
In step 320, the first system acquires the screen control right in response to the system switching instruction.
At this time, since the screen control authority of the electronic device is located in the second system, when the electronic device receives the system switching instruction, the first system first needs to acquire the screen control authority from the second system.
In one possible implementation, the system switch instruction may be triggered manually by the user or may be triggered automatically, e.g., automatically without operation for a long period of time.
Optionally, the manual triggering by the user may be a touch operation on the electronic device, for example, clicking a screen, or a gesture operation, for example, lifting a wrist, or a physical key operation, for example, clicking a crown, which is not limited in the embodiment of the present application.
In step 330, the first system controls the screen to display a first system interface, where the first system interface uses a continuously played target picture set as an interface background, and the target picture set is generated based on the target video.
In one possible implementation manner, after the first system acquires the screen control authority from the second system, it controls the screen to display the first system interface. Because the processing performance of the first system is lower than that of the second system, the interface background of the first system interface is the continuously played target picture set. To keep the display content of the system interface background consistent after switching from the second system to the first system, the interface background of the first system interface, namely the target picture set, is obtained by processing the interface background of the second system, namely the target video.
Optionally, the first system interface is a dial interface.
In one possible application scenario, the first system is in an awake state, and the second system is in a sleep state or a background running state.
In addition, in one possible implementation manner, in response to another system switching instruction, the second system reacquires the screen control authority and controls the screen to display the second system interface, which switches back to taking the target video as the interface background.
In summary, in the embodiments of the application, for an electronic device supporting dual systems, the second system, which has higher power consumption, controls the screen to display a second system interface that takes the target video as its interface background. When the second system is switched to the first system, the first system, which has lower power consumption, controls the screen to display a first system interface that takes a continuously played target picture set as its interface background, and the target picture set is generated from the target video. This keeps the backgrounds of the first system interface and the second system interface consistent, while the system interfaces are displayed dynamically by systems with different power consumption controlling the screen, thereby reducing the power consumption of the electronic device.
In the embodiment of the application, the second system performs frame extraction processing on the target video to obtain a target picture set and sends the target picture set to the first system; when the first system acquires the screen control authority, it controls the screen to display the first system interface that takes the target picture set as the interface background. Referring to FIG. 4, a flowchart of an interface display method according to another exemplary embodiment of the present application is shown, where the method includes:
In step 401, the second system acquires a target video.
In one possible implementation, the target video is sent by a third party device to the second system.
Alternatively, the third party device may be a mobile phone, a tablet computer, a wearable device, etc., which is not limited in the embodiments of the present application.
For example, as shown in FIG. 5, the process of obtaining the target video by the second system is described by taking a mobile phone as an example. The user can record a section of video material through the video recording function of the mobile phone, or select a section of video material from the mobile phone album. The video material is then cut with a video cutting tool on the mobile phone to obtain a video clip. Optionally, the duration of the video clip may be 5 s, 10 s, 30 s, and so on, which is not limited in the embodiments of the present application. The mobile phone then crops the video clip according to the resolution of the electronic device and scales it to the screen size of the device, obtaining a target video adapted to the screen of the electronic device. The target video is then transmitted to the electronic device, such as a smart watch, and the second system of the electronic device receives the target video.
In another possible implementation, the electronic device stores the target video locally and the second system obtains the target video locally.
In step 402, the second system performs frame extraction processing on the target video based on the picture rendering capability of the first system to obtain a target picture set, where the frame extraction interval used in the frame extraction process is negatively correlated with the picture rendering capability of the first system.
Because the processing performance of the second system is higher than that of the first system, the second system performs the frame extraction processing on the target video to obtain the target picture set.
In one possible implementation manner, the second system determines a frame extraction interval based on the maximum picture rendering rate of the first system, and performs frame extraction processing on the target video based on the frame extraction interval to obtain the target picture set. The maximum picture rendering rate of the first system is used for representing the maximum capability of picture rendering of the first system.
In addition, in one possible implementation, the stronger the picture rendering capability of the first system, the shorter the frame extraction interval; the weaker the picture rendering capability, the longer the frame extraction interval.
For example, if the picture rendering capability of the first system is to render 10 pictures per second, the frame interval is 1/10 second, and if the picture rendering capability of the first system is to render 20 pictures per second, the frame interval is 1/20 second.
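As a rough sketch of this negative correlation (interval = 1 / rendering rate), the following C code derives a frame-extraction interval from the first system's maximum picture rendering rate and counts how many pictures a clip of a given duration would yield. The function and variable names are illustrative assumptions, not the patent's implementation.

```c
#include <stdio.h>

/* Derive the frame-extraction interval (in seconds) from the first system's
 * maximum picture rendering rate: the stronger the rendering capability,
 * the shorter the interval. */
static double frame_extraction_interval(double max_render_rate_fps)
{
    return 1.0 / max_render_rate_fps;   /* e.g. 10 fps -> 0.1 s, 20 fps -> 0.05 s */
}

/* Number of pictures extracted from a target video of the given duration. */
static int extracted_picture_count(double video_duration_s, double max_render_rate_fps)
{
    return (int)(video_duration_s / frame_extraction_interval(max_render_rate_fps));
}

int main(void)
{
    /* Values taken from the example in the text: 10 or 20 pictures per second. */
    printf("interval at 10 fps: %.3f s\n", frame_extraction_interval(10.0));
    printf("interval at 20 fps: %.3f s\n", frame_extraction_interval(20.0));
    printf("pictures from a 5 s clip at 20 fps: %d\n",
           extracted_picture_count(5.0, 20.0));
    return 0;
}
```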
In step 403, the second system sends the target picture set to the first system.
In one possible implementation manner, the electronic device stores the target video acquired by the second system in a storage space corresponding to the first system.
In step 404, the second system controls the screen to display a second system interface, where the second system interface uses the target video as an interface background.
In this step, please refer to step 310, and the embodiment of the present application will not be repeated.
In step 405, in response to the system switching instruction, the first system acquires the screen control right.
In this step, please refer to step 320, and the embodiment of the present application will not be repeated.
In step 406, the first system controls the screen to display a first system interface, where the first system interface uses a continuously played target picture set as an interface background, and the target picture set is generated based on the target video.
In this step, please refer to step 330, and the embodiment of the present application will not be repeated.
Exemplary, as shown in fig. 6, a schematic diagram of a first system and a second system interaction process provided in an exemplary embodiment of the present application is shown. After the second system acquires the target video, frame extraction processing is carried out on the target video to obtain a target picture set, and the target picture set is sent to the first system. At this time, the screen control authority is located in the second system, so that the second system controls the screen to display a second system interface taking the target video as the interface background. And responding to the system switching instruction, the first system acquires the screen control authority and controls the screen to display a first system interface taking the continuously played target picture set as an interface background.
In the embodiment of the application, the second system performs frame extraction processing on the target video based on the picture rendering capability of the first system to obtain the target picture set, and after the first system acquires the screen control authority, it controls the screen to display the first system interface that takes the continuously played target picture set as the interface background, which reduces the power consumption of the electronic device.
In another possible implementation manner, the frame extraction is not limited to being performed by the second system; the third party device may also perform frame extraction processing on the target video to obtain the target picture set and send the target picture set to the electronic device. Take a mobile phone as an example of the third party device. The mobile phone processes the video material selected by the user to obtain the target video and sends the target video to the electronic device, and the second system obtains the target video and controls the screen to display a second system interface that takes the target video as the interface background. Meanwhile, the mobile phone performs frame extraction processing on the target video according to the picture rendering capability of the first system to obtain the target picture set, and then sends the target picture set to the electronic device. The first system obtains the target picture set and, after acquiring the screen control authority, controls the screen to display a first system interface that takes the continuously played target picture set as the interface background.
In the embodiment of the application, the first system controls the screen to display the first system interface based on the picture rendering rate and the target picture set. In one possible implementation, the picture rendering rate is not constant, but varies according to the operation scene, so as to further reduce the power consumption of the electronic device. Referring to fig. 7, a flowchart of an interface display method according to another exemplary embodiment of the present application is shown, where the method includes:
in step 701, in the case that the current operation scene is a high power consumption operation scene, the first system determines that the picture rendering rate is a first rate, where the first rate is consistent with the maximum picture rendering rate.
In one possible implementation manner, the first system obtains a scene identifier of a current operation scene, when the scene identifier indicates that the current operation scene is the first operation scene, the current scene is indicated to be the high-power consumption operation scene, and the first system determines that the picture rendering rate is the first rate according to the operation scene. In order to ensure the display effect of the first system interface in the high-power-consumption operation scene, the first rate is the maximum picture rendering rate of the first system.
Alternatively, the high power consumption operation scene may be a scene such as a display time or a display notification, which is not limited in the embodiment of the present application.
In step 702, in the case that the current operation scene is a low power consumption operation scene, the first system determines that the picture rendering rate is a second rate, where the second rate is less than the maximum picture rendering rate.
In another possible implementation manner, the first system obtains a scene identifier of the current operation scene, and when the scene identifier indicates that the current operation scene is the second operation scene, the current operation scene is indicated to be the low-power operation scene, that is, the power consumption of the second operation scene is lower than the power consumption of the first operation scene. The first system adjusts the picture rendering rate to a second rate according to the running scene. Since the current scene is a low power consumption operational scene, the second rate is less than the maximum picture rendering rate in order to reduce power consumption of the electronic device.
Alternatively, the low power operation scene type may be a low power scene, a sleep scene, or the like, which is not limited in the embodiment of the present application.
In addition, in one possible implementation, the second rate is not fixed in the low power operation scenario, but is different according to the type of the low power operation scenario, so that the power consumption of the electronic device is reduced as much as possible.
Illustratively, in low battery scenes, the second rate is 20 pictures per second, in sleep scenes, the second rate is 5 pictures per second, etc.
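A minimal sketch of how the picture rendering rate might be chosen from the scene identifier, using the example rates given in the text (the maximum rate for high-power-consumption scenes, 20 and 5 pictures per second for low-battery and sleep scenes). The scene enum and function name are assumptions for illustration.

```c
/* Illustrative scene identifiers; not taken from the patent. */
typedef enum {
    SCENE_DISPLAY_TIME,      /* high power consumption operation scene */
    SCENE_NOTIFICATION,      /* high power consumption operation scene */
    SCENE_LOW_BATTERY,       /* low power consumption operation scene  */
    SCENE_SLEEP              /* low power consumption operation scene  */
} scene_id_t;

/* Pick the picture rendering rate (pictures per second) for the current scene.
 * High-power scenes use the first rate (the maximum rendering rate of the
 * first system); low-power scenes use a smaller, scene-dependent second rate. */
static int picture_rendering_rate(scene_id_t scene, int max_render_rate_fps)
{
    switch (scene) {
    case SCENE_LOW_BATTERY: return 20;   /* example second rate from the text */
    case SCENE_SLEEP:       return 5;    /* example second rate from the text */
    default:                return max_render_rate_fps;   /* first rate */
    }
}
```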
In step 703, the first system extracts the target picture from the target picture set based on the picture rendering rate.
In one possible implementation, because the running scenes differ, the picture rendering rates determined by the first system differ, and therefore the target pictures extracted from the target picture set differ as well.
Optionally, in the high power consumption operation scene, the first system extracts the first target picture from the target picture set based on the first rate; optionally, in the low power consumption operation scenario, the first system extracts the second target picture from the target picture set based on the second rate. Since the second rate is smaller than the first rate, the content and the number of the first target picture and the second target picture may be different.
For example, the first rate may be 25 pictures per second, the second rate may be 10 pictures per second, and the first system may extract 50 target pictures from the target picture set based on the first rate and 20 target pictures from the target picture set based on the second rate if the preset display time of the first system interface is 2 seconds.
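Following the numeric example above (25 or 10 pictures per second over a 2-second display period, giving 50 or 20 pictures), here is a sketch of how the first system might pick an evenly spaced subset of the target picture set at the chosen rendering rate. The names and the even-spacing strategy are assumptions, not the patent's stated algorithm.

```c
#include <stddef.h>

/* Select up to max_out indices from a target picture set of set_size pictures,
 * spacing them evenly so that roughly rate_fps * duration_s pictures are shown,
 * e.g. 50 pictures at 25 fps over 2 s, or 20 pictures at 10 fps over 2 s. */
static size_t select_target_pictures(size_t set_size,
                                     int rate_fps, double duration_s,
                                     size_t *out_indices, size_t max_out)
{
    size_t wanted = (size_t)(rate_fps * duration_s);
    if (wanted > set_size) wanted = set_size;
    if (wanted > max_out)  wanted = max_out;
    for (size_t i = 0; i < wanted; i++) {
        out_indices[i] = i * set_size / wanted;   /* even spacing across the set */
    }
    return wanted;
}
```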
In step 704, the first system renders the target picture based on the picture rendering rate and controls the screen to display the first system interface with the target picture.
Optionally, in the high-power-consumption operation scene, the first system renders the first target picture based on the first rate, and controls the screen to display a first system interface taking the first target picture as an interface background; optionally, in the low-power-consumption operation scene, the first system renders the second target picture based on the second rate, and controls the screen to display the first system interface taking the second target picture as the interface background.
Illustratively, in a high power consumption operating scenario, the first system renders 50 pictures at a rendering rate of 25 pictures per second. In a low power consumption operating scene, the first system renders 20 pictures at a rendering rate of 10 pictures per second.
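A sketch of the timing side of step 704: rendering the extracted pictures at the scene-dependent rate. The draw call and the use of usleep are placeholders for whatever drawing primitive and timer the first system's UI framework actually provides; this is an assumed illustration, not the patent's implementation.

```c
#include <stdio.h>
#include <stddef.h>
#include <unistd.h>   /* usleep stands in here for a first-system timer */

/* Stand-in for the first system's actual draw call. */
static void screen_draw_picture(size_t picture_index)
{
    printf("drawing target picture %zu\n", picture_index);
}

/* Render 'count' target pictures as the interface background at rate_fps,
 * e.g. 50 pictures at 25 fps or 20 pictures at 10 fps in the example above. */
static void render_interface_background(const size_t *indices, size_t count, int rate_fps)
{
    unsigned int frame_period_us = 1000000u / (unsigned int)rate_fps;
    for (size_t i = 0; i < count; i++) {
        screen_draw_picture(indices[i]);
        usleep(frame_period_us);   /* wait one frame period before the next picture */
    }
}

int main(void)
{
    size_t indices[5] = {0, 10, 20, 30, 40};
    render_interface_background(indices, 5, 10);   /* 10 pictures per second */
    return 0;
}
```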
In the embodiment of the application, the first system determines different picture rendering rates based on different running scenes, so that the target pictures are further extracted from the target picture set to control the screen to display the first system interface, and the power consumption of the electronic equipment is further reduced.
In addition, in one possible implementation, to improve the playability of the electronic device, the user may change the target video to display a different interface background. The method is described below.
1. In response to an interface background switching operation, the second system switches the target video.
In one possible implementation, in response to the interface context switching operation, the second system switches the first target video to the second target video, and the second system controls the screen to display a second system interface with the second target video as an interface context. The interface background switching operation is performed in a background setting interface under the second system.
Alternatively, the interface background switching operation may be a click operation, a long press operation, a sliding operation, or the like, which is not limited in the embodiment of the present application.
The electronic device receives the user's interface background switching operation, and the second system switches to a different target video.
Illustratively, as shown in FIG. 8, the second system controls the screen to display a second system interface 81, and the second system interface 81 uses video 1 as the interface background. The user clicks the video selection control 821 on the background setting interface 82 under the second system, switching the target video to video 3. The second system then controls the screen so that the interface background of the second system interface 81 is switched from video 1 to video 3.
In addition, in the embodiment of the application, after the second system switches the first target video to the second target video, similarly, frame extraction processing is performed on the second target video according to the method in the foregoing embodiment, so as to obtain a second target picture set corresponding to the second target video.
2. The second system sends a picture set switching instruction to the first system.
In one possible implementation manner, after the second system switches the target video, in order to ensure the consistency of the system interface backgrounds of the first system and the second system, the second system needs to send a picture set switching instruction to the first system to inform the first system of switching the interface background of the first system interface.
The picture set switching instruction comprises a switching identification and the second target picture set.
3. The first system switches the target picture set based on the picture set switching instruction.
In one possible implementation, after receiving the picture set switching instruction, the first system switches the first target picture set (corresponding to the first target video) to the second target picture set. After the first system acquires the screen control authority, it controls the screen to display a first system interface that takes the second target picture set as the interface background.
For example, as shown in FIG. 8, after the second system switches video 1 to video 3, it sends a picture set switching instruction to the first system, and after receiving the instruction, the first system switches the interface background of the first system interface 83 from picture set 1 (the picture set corresponding to video 1) to picture set 3 (the picture set corresponding to video 3).
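A sketch of how the first system might handle the picture set switching instruction described above (a switching identifier plus the second target picture set). The message layout, names, and handler are assumptions for illustration only.

```c
#include <stdio.h>

/* Hypothetical picture set and switch instruction; not the patent's actual format. */
typedef struct {
    int set_id;
    int picture_count;
    /* picture data omitted in this sketch */
} picture_set_t;

typedef struct {
    int           switch_id;   /* switching identification */
    picture_set_t new_set;     /* the second target picture set */
} picture_set_switch_cmd_t;

static picture_set_t g_current_set = { 1, 50 };   /* picture set 1 (from video 1) */

/* On receiving the switch instruction, replace the current target picture set,
 * so the next time the first system gains screen control it uses the new background. */
static void handle_picture_set_switch(const picture_set_switch_cmd_t *cmd)
{
    g_current_set = cmd->new_set;
    printf("switched to picture set %d (%d pictures)\n",
           g_current_set.set_id, g_current_set.picture_count);
}

int main(void)
{
    picture_set_switch_cmd_t cmd = { 3, { 3, 50 } };   /* picture set 3 (from video 3) */
    handle_picture_set_switch(&cmd);
    return 0;
}
```

A deletion instruction would be handled the same way, with the current set cleared instead of replaced.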
In addition, in another possible implementation, the user may also delete disliked target videos. The method is described below.
1. In response to an interface background deletion operation, the second system deletes the target video.
In one possible implementation, the electronic device receives an interface background deletion operation from the user, and the second system deletes the target video; that is, the second system controls the screen to display the second system interface, and the second system interface no longer uses the target video as the interface background.
Illustratively, as shown in FIG. 9, the second system controls the screen to display a second system interface 91, and the second system interface 91 uses video 1 as the interface background. The user clicks the delete control 921 corresponding to video 1 on the background setting interface 92 under the second system, and the second system deletes video 1. The second system then controls the screen to display the second system interface, whose interface background no longer shows video 1.
2. The second system sends a picture set deleting instruction to the first system.
In one possible implementation manner, after deleting the target video, in order to keep the system interface backgrounds of the first system and the second system consistent, the second system needs to send a picture set deletion instruction to the first system to instruct the first system to delete the interface background of the first system interface.
The picture set deleting instruction comprises a picture set identifier and is used for indicating to delete a target picture set corresponding to the target video.
3. The first system deletes the target picture set based on the picture set deletion instruction.
In one possible implementation manner, the first system deletes the target picture set after receiving the picture set deletion instruction. When the first system acquires the screen control authority, the screen is controlled to display a first system interface, and the first system interface does not take the target picture set as an interface background.
For example, as shown in fig. 9, after the second system deletes the target video, a picture set deletion instruction is sent to the first system, and after the first system receives the instruction, the interface background of the first system interface 93 is deleted, and the first system interface 93 does not display the picture set 1 (the picture set corresponding to the video 1).
In the embodiment of the application, after the second system switches the target video, a picture set switching instruction is sent to the first system, and the first system switches the target picture set based on the switching instruction so as to ensure the consistency of interface backgrounds of the second system and the first system. After the second system deletes the target video, a picture set deleting instruction is sent to the first system, the first system deletes the corresponding target picture set, and the storage space of the first system and the second system is released, so that unnecessary occupation of the storage space caused by deleting only the target video in the second system and not deleting the target picture set of the first system is avoided.
Taking a mobile phone as an example of the third party device, the following describes the complete interaction process among the third party device, the second system, and the first system in the embodiment of the application. Referring to FIG. 10, a schematic diagram of the interaction process among the mobile phone, the second system, and the first system according to an exemplary embodiment of the present application is shown.
The user records a section of video material through the video recording function of the mobile phone, or selects a section of video material from the mobile phone album. The video material is then cut with a video cutting tool on the mobile phone to obtain a video clip. The mobile phone then crops the video clip according to the resolution of the electronic device and scales it to the screen size of the device, obtaining a target video adapted to the screen of the electronic device. The mobile phone transmits the target video to the electronic device, such as a smart watch. The second system of the electronic device acquires the target video and performs frame extraction processing on it according to the picture rendering capability of the first system to obtain a target picture set. The second system then sends the target picture set to the first system. Meanwhile, the second system controls the screen to display a second system interface that takes the target video as the interface background. When the first system acquires the screen control authority, it controls the screen to display a first system interface that takes the target picture set as the interface background.
Therefore, in the embodiment of the application, the second system performs frame extraction processing on the target video to obtain the target picture set and sends the target picture set to the first system, which avoids inconsistency between the interface backgrounds of the first system and the second system; in addition, the system interfaces are displayed dynamically by having different systems control the screen, so that the power consumption of the electronic device is reduced.
Referring to fig. 11, a block diagram of an interface display device according to an embodiment of the present application is shown. The apparatus may be implemented as all or part of an electronic device by software, hardware, or a combination of both. The device comprises:
the second system module 1102 is configured to control the screen to display a second system interface, where the second system interface uses the target video as an interface background;
a first system module 1101, configured to obtain a screen control right in response to a system switching instruction;
the first system module 1101 is further configured to control the screen to display a first system interface, where the first system interface uses a continuously played target picture set as an interface background, and the target picture set is generated based on the target video.
Optionally, the second system module 1102 is configured to acquire the target video;
The second system module 1102 is configured to perform frame extraction processing on the target video to obtain the target picture set;
the second system module 1102 is configured to send the target picture set to the first system.
Optionally, the second system module 1102 is configured to perform frame extraction processing on the target video based on the picture rendering capability of the first system to obtain the target picture set, where a frame extraction interval in a frame extraction processing process and the picture rendering capability of the first system are in a negative correlation.
Optionally, the second system module 1102 is configured to determine the frame interval based on a maximum picture rendering rate of the first system;
the second system module 1102 is configured to perform frame extraction processing on the target video based on the frame extraction interval, so as to obtain the target picture set.
Optionally, the first system module 1101 is configured to determine a picture rendering rate based on a current running scene;
the first system module 1101 is configured to control a screen to display the first system interface based on the picture rendering rate and the target picture set.
Optionally, the first system module 1101 is configured to extract a target picture from the target picture set based on the picture rendering rate;
The first system module 1101 is configured to render the target picture based on the picture rendering rate, and control a screen to display the first system interface with the target picture.
Optionally, the first system module 1101 is configured to determine, when the current operation scene is a high power consumption operation scene, that the picture rendering rate is a first rate, where the first rate is consistent with the maximum picture rendering rate;
the first system module 1101 is configured to determine, when the current operation scene is a low power consumption operation scene, that the picture rendering rate is a second rate, where the second rate is less than the maximum picture rendering rate.
Optionally, the second system module 1102 is configured to switch the target video in response to an interface background switching operation;
the second system module 1102 is configured to send a picture set switching instruction to the first system;
the first system module 1101 is configured to switch the target picture set based on the picture set switching instruction.
Optionally, the second system module 1102 is configured to delete the target video in response to an interface background deletion operation;
the second system module 1102 is configured to send a picture set deletion instruction to the first system;
The first system module 1101 is configured to delete the target picture set based on the picture set deletion instruction.
Optionally, the electronic device is a wearable device, and the first system interface and the second system interface are dial interfaces.
In summary, in the embodiments of the application, for an electronic device supporting dual systems, the second system, which has higher power consumption, controls the screen to display a second system interface that takes the target video as its interface background. When the second system is switched to the first system, the first system, which has lower power consumption, controls the screen to display a first system interface that takes a continuously played target picture set as its interface background, and the target picture set is generated from the target video. This keeps the backgrounds of the first system interface and the second system interface consistent, while the system interfaces are displayed dynamically by systems with different power consumption controlling the screen, thereby reducing the power consumption of the electronic device.
Referring to fig. 12, a block diagram of an electronic device according to an exemplary embodiment of the present application is shown. An electronic device in the present application may include one or more of the following components: a processor 1210, and a memory 1220.
Optionally, the processor 1210 includes at least a first processor 1211 and a second processor 1212, where the first processor 1211 is configured to operate a first system, the second processor 1212 is configured to operate a second system, and the power consumption of the first processor 1211 is lower than the power consumption of the second processor 1212, and the performance of the first processor 1211 is lower than the performance of the second processor 1212. The processor 1210 uses various interfaces and lines to connect various portions of the overall electronic device, perform various functions of the electronic device, and process data by executing or executing instructions, programs, code sets, or instruction sets stored in the memory 1220, and invoking data stored in the memory 1220. Alternatively, the processor 1210 may be implemented in hardware in at least one of digital signal processing (Digital Signal Processing, DSP), field programmable gate array (Field-Programmable Gate Array, FPGA), programmable logic array (Programmable Logic Array, PLA). The processor 1210 may integrate one or a combination of several of a central processing unit (Central Processing Unit, CPU), an image processing unit (Graphics Processing Unit, GPU), a Neural network processing unit (Neural-network Processing Unit, NPU), a modem, etc. The CPU mainly processes an operating system, a user interface, an application program and the like; the GPU is used for rendering and drawing the content required to be displayed by the touch display screen; the NPU is used to implement artificial intelligence (Artificial Intelligence, AI) functionality; the modem is used to handle wireless communications. It will be appreciated that the modem may not be integrated into the processor 1210 and may be implemented by a single chip.
The Memory 1220 may include a random access Memory (Random Access Memory, RAM) or a Read-Only Memory (ROM). Optionally, the memory 1220 includes a non-transitory computer readable medium (non-transitory computer-readable storage medium). Memory 1220 may be used to store instructions, programs, code, sets of codes, or sets of instructions. The memory 1220 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described below, etc.; the storage data area may store data (e.g., audio data, phonebook) created according to the use of the electronic device, etc.
The electronic device in the embodiment of the present application further includes a communication component 1230 and a display component 1240. The communication component 1230 may be a Bluetooth component, a WiFi component, a near field communication (NFC) component, or the like, and is configured to communicate with an external device (a server or another terminal device) through a wired or wireless network. The display component 1240 is configured to present a graphical user interface and/or receive user interactions.
In addition, those skilled in the art will appreciate that the structure of the electronic device shown in the above figure does not constitute a limitation on the electronic device, and the electronic device may include more or fewer components than illustrated, combine certain components, or adopt a different arrangement of components. For example, the electronic device further includes components such as a radio frequency circuit, an input unit, a sensor, an audio circuit, a speaker, a microphone, and a power supply, which are not described in detail here.
The embodiments of the present application also provide a computer-readable storage medium storing at least one instruction, and the at least one instruction is executed by a processor to implement the interface display method according to the above embodiments.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the electronic device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the electronic device performs the interface display method provided in the above embodiment.
Those skilled in the art will appreciate that, in one or more of the examples described above, the functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored in or transmitted over a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
The foregoing descriptions are merely exemplary embodiments of the present application and are not intended to limit the present application. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (14)

1. An interface display method, wherein the method is used for an electronic device, the electronic device supports running a first system and a second system, and the operating power consumption of the first system is lower than the operating power consumption of the second system;
the method comprises the following steps:
the second system controls a screen to display a second system interface, the second system interface taking a target video as an interface background;
in response to a system switching instruction, the first system acquires screen control authority;
the first system controls the screen to display a first system interface, the first system interface taking a continuously played target picture set as an interface background, wherein the target picture set is generated based on the target video.
2. The method according to claim 1, wherein the method further comprises:
the second system acquires the target video;
the second system performs frame extraction processing on the target video to obtain the target picture set;
the second system sends the target picture set to the first system.
3. The method of claim 2, wherein the second system performs frame extraction processing on the target video to obtain the target picture set, including:
the second system performs frame extraction processing on the target video based on the picture rendering capability of the first system to obtain the target picture set, wherein the frame extraction interval used in the frame extraction processing is negatively correlated with the picture rendering capability of the first system.
4. The method of claim 3, wherein the second system performs frame extraction processing on the target video based on the picture rendering capability of the first system to obtain the target picture set, comprising:
the second system determines the frame extraction interval based on a maximum picture rendering rate of the first system;
and the second system performs frame extraction processing on the target video based on the frame extraction interval to obtain the target picture set.
5. The method of claim 4, wherein the first system controlling the screen to display the first system interface comprises:
the first system determines a picture rendering rate based on a current running scene;
the first system controls the screen to display the first system interface based on the picture rendering rate and the target picture set.
6. The method of claim 5, wherein the first system controlling the screen to display the first system interface based on the picture rendering rate and the target picture set comprises:
the first system extracts a target picture from the target picture set based on the picture rendering rate;
the first system renders the target picture based on the picture rendering rate, and controls the screen to display the first system interface using the target picture.
7. The method of claim 5, wherein the first system determining the picture rendering rate based on the current running scene comprises:
in a case that the current running scene is a high-power-consumption running scene, the first system determines the picture rendering rate to be a first rate, the first rate being equal to the maximum picture rendering rate;
in a case that the current running scene is a low-power-consumption running scene, the first system determines the picture rendering rate to be a second rate, the second rate being lower than the maximum picture rendering rate.
8. The method according to any one of claims 1 to 7, further comprising:
in response to an interface background switching operation, the second system switches the target video;
the second system sends a picture set switching instruction to the first system;
the first system switches the target picture set based on the picture set switching instruction.
9. The method according to any one of claims 1 to 7, further comprising:
in response to an interface background deleting operation, the second system deletes the target video;
the second system sends a picture set deleting instruction to the first system;
the first system deletes the target picture set based on the picture set deletion instruction.
10. The method of any one of claims 1 to 7, wherein the electronic device is a wearable device and the first system interface and the second system interface are dial interfaces.
11. An interface display device, wherein the device is used for an electronic device, the electronic device supports running a first system and a second system, and the operating power consumption of the first system is lower than the operating power consumption of the second system;
the device comprises:
a second system module, configured to control a screen to display a second system interface, the second system interface taking a target video as an interface background;
a first system module, configured to acquire screen control authority in response to a system switching instruction;
the first system module is further configured to control the screen to display a first system interface, the first system interface taking a continuously played target picture set as an interface background, wherein the target picture set is generated based on the target video.
12. An electronic device comprising a processor and a memory; the memory stores at least one instruction for execution by the processor to implement the interface display method of any one of claims 1 to 10.
13. A computer readable storage medium storing at least one instruction for execution by a processor to implement the interface display method of any one of claims 1 to 10.
14. A computer program product, comprising computer instructions stored in a computer-readable storage medium, wherein a processor of an electronic device reads the computer instructions from the computer-readable storage medium and executes the computer instructions, causing the electronic device to implement the interface display method of any one of claims 1 to 10.
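Purely as an illustration of the rendering-rate logic recited in claims 5 to 7 above, the following hypothetical sketch (assumed names and numeric rates, one of many arrangements consistent with the claim language, not the claimed implementation) shows how a first system might pick a picture rendering rate from the current running scene and then step through the target picture set at that rate.

```python
# Hypothetical illustration of the rendering-rate selection in claims 5 to 7;
# names and numeric values are assumptions, not the claimed implementation.

from typing import List

MAX_RENDER_RATE = 10.0  # assumed maximum picture rendering rate of the first system


def pick_render_rate(running_scene: str) -> float:
    # High-power-consumption scene: use a first rate equal to the maximum
    # rendering rate; low-power-consumption scene: use a lower second rate.
    if running_scene == "high_power":
        return MAX_RENDER_RATE
    return MAX_RENDER_RATE / 2  # assumed second rate


def pictures_to_render(picture_set: List[int], render_rate: float) -> List[int]:
    # Extract target pictures from the target picture set based on the
    # rendering rate: at a lower rate, fewer pictures are stepped through.
    step = max(1, round(MAX_RENDER_RATE / render_rate))
    return picture_set[::step]


picture_set = list(range(30))               # stand-in for the extracted frames
rate = pick_render_rate("low_power")
print(rate, pictures_to_render(picture_set, rate)[:5])
```

A second rate below the maximum lets the low-power-consumption scene redraw the background less often, which is where the additional power saving comes from.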
CN202210070014.5A 2022-01-21 2022-01-21 Interface display method, device, equipment and storage medium Pending CN116521283A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210070014.5A CN116521283A (en) 2022-01-21 2022-01-21 Interface display method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116521283A true CN116521283A (en) 2023-08-01

Family

ID=87403393

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210070014.5A Pending CN116521283A (en) 2022-01-21 2022-01-21 Interface display method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116521283A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination