CN116088944B - Interface display method and device - Google Patents

Interface display method and device

Info

Publication number
CN116088944B
CN116088944B (application CN202210982408.8A)
Authority
CN
China
Prior art keywords
mode
interface
terminal equipment
terminal device
vivid
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210982408.8A
Other languages
Chinese (zh)
Other versions
CN116088944A (en)
Inventor
李新博
朱文健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210982408.8A priority Critical patent/CN116088944B/en
Publication of CN116088944A publication Critical patent/CN116088944A/en
Application granted granted Critical
Publication of CN116088944B publication Critical patent/CN116088944B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/4401 Bootstrapping
    • G06F9/4406 Loading of operating system
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/445 Program loading or initiating
    • G06F9/44521 Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading
    • G06F9/44526 Plug-ins; Add-ons
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks

Abstract

The application provides an interface display method and device, and relates to the field of terminal technologies. If a terminal device in a non-vivid mode does not detect a trigger operation input by a user within a first preset duration, it starts a target application, and the terminal device switches from the non-vivid mode to a vivid mode so as to cyclically play images in a preset image set on a first interface corresponding to the target application. The vividness of the images cyclically played by the terminal device is then higher than that of the interface displayed in the non-vivid mode. Therefore, without affecting users' use of the terminal device, the terminal device can automatically return to the vivid mode and cyclically play the images without manual intervention, maintaining its attractiveness to consumers while reducing labor cost.

Description

Interface display method and device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an interface display method and device.
Background
Currently, when a terminal device is on sale, it may be set up as a prototype in a store for consumers to experience.
Typically, the terminal device is displayed as a prototype in a sales store. In this case, the terminal device may be in a vivid mode, with its target application started to cyclically play images in a preset image set. In this way, the terminal device can better attract the consumers who experience it.
However, some consumers may exit the target application and turn off the vivid mode of the terminal device in order to experience other functions (e.g., music, surfing the internet, watching videos). As a result, the terminal device is no longer in the vivid mode after the consumer finishes and leaves. The images cyclically played after the target application is subsequently restarted are therefore less attractive to consumers.
Disclosure of Invention
The application provides an interface display method and device, which can automatically re-enable the vivid mode after a consumer finishes experiencing a terminal device and leaves. Thus, the terminal device remains highly attractive to consumers when cyclically playing images.
In a first aspect, the present application provides an interface display method, including: a terminal device in a non-vivid mode monitors whether a trigger operation input by a user is received within a first preset duration; if the terminal device does not detect a trigger operation input by the user within the first preset duration, it starts a target application; and the terminal device switches from the non-vivid mode to a vivid mode so as to cyclically play images in a preset image set on a first interface corresponding to the target application, where the vividness of the interface displayed by the terminal device in the vivid mode is higher than that of the interface displayed in the non-vivid mode.
According to the interface display method, if the terminal device in the non-vivid mode does not detect a trigger operation input by the user within the first preset duration, it starts the target application and switches from the non-vivid mode to the vivid mode so as to cyclically play the images in the preset image set on the first interface corresponding to the target application. The vividness of the images cyclically played by the terminal device is then higher than that of the interface displayed in the non-vivid mode. Therefore, without affecting users' use of the terminal device, the terminal device can automatically return to the vivid mode and cyclically play the images without manual intervention, maintaining its attractiveness to consumers while reducing labor cost.
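The idle-timeout logic of the first aspect can be sketched as follows. This is a minimal Python illustration only; the timeout value, class name, and method names are assumptions for the sketch and not part of the claimed method, which runs on the terminal device itself:

```python
import time

IDLE_TIMEOUT_S = 60  # hypothetical "first preset duration"

class Terminal:
    def __init__(self):
        self.mode = "non-vivid"
        self.target_app_running = False
        self.last_input = time.monotonic()

    def on_user_input(self):
        # Any trigger operation input by the user resets the idle timer.
        self.last_input = time.monotonic()

    def tick(self):
        # Called periodically: if no trigger operation was observed within
        # the first preset duration, start the target application and
        # switch from the non-vivid mode back to the vivid mode.
        idle = time.monotonic() - self.last_input
        if self.mode == "non-vivid" and idle >= IDLE_TIMEOUT_S:
            self.target_app_running = True  # start target application
            self.mode = "vivid"             # restore vivid mode
```

A monotonic clock is used for the idle measurement so that wall-clock adjustments cannot shorten or lengthen the preset duration.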
In one possible implementation, before the terminal device, in the non-vivid mode, monitors whether a trigger operation input by the user is received within the first preset duration, the method further includes: the terminal device turns on the vivid mode; the terminal device cyclically plays images in the preset image set on the first interface corresponding to the target application; and the terminal device switches from the vivid mode to the non-vivid mode in response to a user operation that turns off the vivid mode.
It will be appreciated that by the above-described operation, the terminal device can be switched from the vivid mode to the non-vivid mode.
In one possible implementation, the terminal device turning on the vivid mode includes: when the terminal device is powered on or rebooted, detecting whether the target application is installed; and if the target application is installed, turning on the vivid mode after a second preset duration.
It can be understood that, when the target application is installed on the terminal device, the terminal device can start the target application to cyclically play the images in the preset image set. In this way, having the target application installed can be one of the preconditions for turning on the vivid mode. In addition, the system of the terminal device needs some time to initialize after power-on or reboot, so turning on the vivid mode only after the second preset duration gives a higher success rate.
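The boot-time check described above can be sketched as follows. This is a hedged Python illustration; the package name, function names, and the default delay are assumptions, since the description leaves the second preset duration unspecified:

```python
import time

def is_target_app_installed(installed_packages):
    # Hypothetical check; on a real device this would query the
    # package manager for the target application's package name.
    return "target_app" in installed_packages

def on_boot(installed_packages, enable_vivid_mode, settle_s=2.0):
    # On power-on or reboot: detect whether the target application is
    # installed and, if so, wait a second preset duration for the
    # system to finish initializing before turning on the vivid mode.
    if not is_target_app_installed(installed_packages):
        return False
    time.sleep(settle_s)  # second preset duration
    enable_vivid_mode()
    return True
```

Deferring the mode switch until after the settle delay models the observation that enabling the vivid mode too early, while the system is still initializing, is more likely to fail.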
In one possible implementation, the terminal device turning on the vivid mode includes: turning on the vivid mode when the terminal device detects that the target application has been started.
It can be appreciated that, once the target application is started, the images in the preset image set can be cyclically played on the first interface corresponding to the target application. Thus, turning on the vivid mode when the terminal device detects that the target application has been started better meets the user's needs.
In one possible implementation, the terminal device switching from the vivid mode to the non-vivid mode in response to a user operation that turns off the vivid mode includes: the terminal device closes the target application and exits the first interface in response to an exit operation input by the user on the first interface; the terminal device displays a second interface in response to the user's opening operation on the second interface, where the second interface includes a first control; and the terminal device switches from the vivid mode to the non-vivid mode in response to the user's trigger operation on the first control.
It will be appreciated that by the above-described operation, the terminal device can be switched from the vivid mode to the non-vivid mode.
In one possible implementation, a precondition for turning on the vivid mode is that the demonstration mode is on.
It will be appreciated that the vivid mode only needs to be turned on when the terminal device is placed in a store. Making an enabled demonstration mode a precondition for the vivid mode therefore prevents the vivid mode from being turned on by an accidental touch when a user uses the terminal device normally.
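The precondition check can be sketched as follows. This is a minimal Python illustration; the dictionary-based state and its key names are assumptions for the sketch, not part of the claimed method:

```python
def try_enable_vivid_mode(state):
    # The demonstration-mode flag guards the vivid mode: if demo mode
    # is off (the device is in normal consumer use), the request to
    # turn on the vivid mode is ignored.
    if state.get("demo_mode", False):
        state["vivid_mode"] = True
        return True
    return False
```

On a device in normal use (`demo_mode` absent or false) the call is a no-op, which is exactly the accidental-touch protection the paragraph describes.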
In one possible implementation, the demonstration mode is turned on in the following manner: the terminal device displays a third interface, where the third interface includes a second control used to indicate turning the demonstration mode on or off; and the terminal device turns on the demonstration mode in response to the user's opening operation on the second control.
In this way, the presentation mode may be turned on.
In one possible implementation, a display parameter of the terminal device in the vivid mode is higher than the corresponding display parameter in the non-vivid mode, where the display parameter includes at least one of brightness, contrast, saturation, resolution, color temperature, and hue.
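The "higher display parameter" condition can be illustrated as follows. The concrete values below are invented for the sketch; the description only requires that at least one of the listed parameters be higher in the vivid mode:

```python
# Illustrative values only (assumed, not from the patent).
NON_VIVID_PARAMS = {"brightness": 60, "contrast": 50, "saturation": 50}
VIVID_PARAMS = {"brightness": 90, "contrast": 70, "saturation": 80}

def is_more_vivid(vivid, baseline):
    # True if at least one display parameter (brightness, contrast,
    # saturation, resolution, color temperature, or hue) is higher in
    # the vivid mode than in the non-vivid mode.
    return any(vivid[k] > baseline[k] for k in baseline)
```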
In a second aspect, the present application further provides an interface display device, including: a processing unit, configured to monitor, in a non-vivid mode, whether a trigger operation input by a user is received within a first preset duration; if no trigger operation input by the user is detected within the first preset duration, start the target application; and switch from the non-vivid mode to the vivid mode; and a display unit, configured to cyclically play the images in the preset image set on the first interface corresponding to the target application, where the vividness of the interface displayed by the terminal device in the vivid mode is higher than that of the interface displayed in the non-vivid mode.
In a third aspect, the present application further provides a terminal device, including a processor and a memory, where the memory is configured to store code instructions; the processor is configured to execute code instructions to cause the terminal device to perform the interface display method as described in the first aspect or any implementation of the first aspect.
In a fourth aspect, the present application also provides a computer-readable storage medium storing instructions that, when executed, cause a computer to perform an interface display method as described in the first aspect or any implementation of the first aspect.
In a fifth aspect, the present application also provides a computer program product comprising a computer program which, when run, causes a computer to perform the interface display method as described in the first aspect or any implementation of the first aspect.
It should be understood that the second to fifth aspects of the present application correspond to the technical solution of the first aspect, and the beneficial effects obtained by each aspect and its corresponding possible implementations are similar, so they are not repeated here.
Drawings
FIG. 1 is an interface schematic diagram of a terminal device turning off the vivid mode;
fig. 2 is an interface schematic diagram of a terminal device in the non-vivid mode starting the target application to cyclically play images in a preset image set;
fig. 3 is a schematic diagram of a hardware system architecture of a terminal device according to an embodiment of the present application;
fig. 4 is a schematic diagram of a software system architecture of a terminal device according to an embodiment of the present application;
FIG. 5 is one of the flowcharts of an interface display method according to an embodiment of the present application;
fig. 6 is an interface schematic diagram of a terminal device turning on the demonstration mode according to an embodiment of the present application;
fig. 7 is one of the interface schematic diagrams of a terminal device in the vivid mode according to an embodiment of the present application;
FIG. 8 is a second interface schematic diagram of a terminal device in the vivid mode according to an embodiment of the present application;
fig. 9 is an interface schematic diagram of a terminal device turning off the vivid mode according to an embodiment of the present application;
fig. 10 is a third interface schematic diagram of a terminal device in the vivid mode according to an embodiment of the present application;
FIG. 11 is a second flowchart of an interface display method according to an embodiment of the present application;
FIG. 12 is a functional block diagram of an interface display device according to an embodiment of the present application;
fig. 13 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
In order to clearly describe the technical solutions of the embodiments of the present application, words such as "first" and "second" are used in the embodiments of the present application to distinguish between identical or similar items that have substantially the same function and effect. For example, a first value and a second value are distinguished merely to tell different values apart, without limiting their order. Those skilled in the art will appreciate that words such as "first" and "second" do not limit the number of items or the order of execution, and that items labeled "first" and "second" are not necessarily different.
In this application, the terms "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of" the following items or similar expressions means any combination of these items, including any combination of single items or plural items. For example, "at least one of a, b, or c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be single or plural.
Typically, the terminal device is displayed as a prototype in a sales store. In this case, the terminal device may be in a vivid mode, with its target application started to cyclically play images in a preset image set. In this way, the terminal device can better attract the consumers who experience it. However, some consumers may exit the target application and turn off the vivid mode of the terminal device in order to experience other functions (e.g., music, surfing the internet, watching videos).
Illustratively, as shown in (a) of fig. 1, when the terminal device 100 is placed in a sales store as a prototype, the terminal device 100 is in the vivid mode and displays the first interface 101 of the target application. The terminal device 100 cyclically plays the images in the preset image set on the first interface 101. As shown in (b) of fig. 1, when a consumer wants to experience the video function of the terminal device 100 in the non-vivid mode, the terminal device 100 may display its desktop 103 in response to the consumer's trigger operation on the close control 102 in the first interface 101. The desktop 103 includes an icon 104 of the "Settings" application. As shown in (c) of fig. 1, the terminal device 100 may display an image mode setting interface 105 in response to the consumer's trigger operation on the icon 104 of the "Settings" application, the image mode setting interface 105 including the close control 102. The image mode setting interface 105 also includes a vivid mode option, a standard mode option, and a soft mode option. The terminal device 100 may switch from the vivid mode to the standard mode (i.e., a non-vivid mode) in response to the consumer's trigger operation on the standard mode option.
As shown in (a)-(b) of fig. 2, the terminal device 100 may close the image mode setting interface 105 and display its desktop 103 in response to the consumer's trigger operation on the close control 102 in the image mode setting interface 105; the desktop 103 includes an icon 108 of a video application. As shown in (b)-(c) of fig. 2, the terminal device 100 may display a video playing interface in response to the consumer's trigger operation on the icon 108 of the video application. After viewing the video playing interface, the consumer leaves the terminal device 100. As shown in (d) of fig. 2, if the terminal device 100 does not detect a trigger operation input by the user within the first preset duration, it starts the target application and cyclically plays the images in the preset image set on the first interface 101 corresponding to the target application. Since the terminal device 100 is in the non-vivid mode at this time, the vividness of the images played on the first interface 101 is low, and the images are less attractive to consumers.
It will be appreciated that the above-described terminal device may also be referred to as a user equipment (UE), a mobile station (MS), a mobile terminal (MT), etc. The terminal device may be a mobile phone, a smart TV, a wearable device, a tablet (Pad), a computer with a wireless transceiving function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, etc. The embodiments of the present application do not limit the specific technology or the specific device form adopted by the terminal device.
In order to better understand the embodiments of the present application, the structure of the terminal device of the embodiments of the present application is described below. Fig. 3 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a sensor module 180, keys 190, an indicator 192, a camera 193, a display screen 194, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the present application, the terminal device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. Wherein the different processing units may be separate devices or may be integrated in one or more processors. A memory may also be provided in the processor 110 for storing instructions and data.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge a terminal device, or may be used to transfer data between the terminal device and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other terminal devices, such as AR devices, etc.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. The power management module 141 is used for connecting the charge management module 140 and the processor 110.
The wireless communication function of the terminal device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Antennas in the terminal device may be used to cover single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G or the like applied on a terminal device. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation.
The wireless communication module 160 may provide solutions for wireless communication applied on the terminal device, including wireless local area networks (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), etc.
The terminal device implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal device may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The camera 193 is used to capture still images or video. In some embodiments, the terminal device may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The internal memory 121 is provided with a first field for indicating whether the demonstration mode is on. Illustratively, when the first field is a binary "0", it indicates that the demonstration mode is off; when the first field is a binary "1", it indicates that the demonstration mode is on. In addition, the internal memory 121 is provided with a second field for indicating whether the vivid mode is on. Illustratively, when the second field is "on", it indicates that the vivid mode is on; when the second field is "off", it indicates that the vivid mode is off.
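The two flag fields described above can be mirrored in a small sketch. This is an illustrative Python model only; the class name and the exact string values are assumptions (the translation's value for the vivid-mode "off" state is ambiguous), and a real device would persist these fields in internal memory:

```python
class ModeFlags:
    """Models the first and second fields stored in internal memory 121."""

    def __init__(self, first_field=0, second_field="off"):
        self.first_field = first_field    # binary 0 = demo mode off, 1 = on
        self.second_field = second_field  # "on" / "off" for the vivid mode

    @property
    def demo_mode_on(self):
        return self.first_field == 1

    @property
    def vivid_mode_on(self):
        return self.second_field == "on"
```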
The terminal device may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio output, and also to convert an analog audio input into a digital audio signal. The speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals. The terminal device can play music or take hands-free calls through the speaker 170A. The receiver 170B, also known as an "earpiece", is used to convert audio electrical signals into sound signals. When the terminal device answers a call or plays a voice message, the voice can be heard by placing the receiver 170B close to the ear. The microphone 170C, also referred to as a "mike", is used to convert sound signals into electrical signals.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The gyro sensor 180B may be used to determine a motion gesture of the terminal device. The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a hall sensor. The acceleration sensor 180E may detect the magnitude of acceleration of the terminal device in various directions (typically three axes). A distance sensor 180F for measuring a distance. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The ambient light sensor 180L is used to sense ambient light level. The fingerprint sensor 180H is used to collect a fingerprint. The temperature sensor 180J is for detecting temperature. The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The bone conduction sensor 180M may acquire a vibration signal.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The terminal device may receive key inputs, generating key signal inputs related to user settings of the terminal device and function control. The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The software system of the terminal equipment can adopt a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, a cloud architecture or the like.
In the embodiments of the present application, an Android system with a layered architecture is taken as an example to illustrate the software structure of the terminal device. Fig. 4 is a software architecture block diagram of a terminal device applicable to the embodiments of the present application. The layered architecture divides the software system of the terminal device into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system may be divided into five layers: the application layer (applications), the application framework layer (application framework), the Android runtime (Android Runtime) and system libraries, the hardware abstraction layer (HAL), and the kernel layer (kernel).
The application layer may include a series of application packages that run applications by calling an application program interface (application programming interface, API) provided by the application framework layer. As shown in fig. 4, the application package of the terminal device may include applications such as setup, smart screen, video, recorder, music, event management service, call, navigation, WLAN, bluetooth, etc.
The application framework layer provides APIs and programming frameworks for application programs of the application layer. The application framework layer includes a number of predefined functions. As shown in fig. 4, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, capture the screen, and the like. The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc. The view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and the like. The view system may be used to build applications. A display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture. The telephony manager is used to provide communication functions for the terminal device, such as the management of call status (including connected, hung up, etc.). The resource manager provides various resources for the application program, such as localized strings, icons, pictures, layout files, video files, and the like. The notification manager allows an application to display notification information in the status bar and can be used to convey notification-type messages, which can automatically disappear after a short stay without user interaction.
The Android runtime includes a core library and virtual machines, and is responsible for scheduling and managing the Android system. The core library consists of two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in virtual machines. A virtual machine executes the java files of the application layer and the application framework layer as binary files, and is used to perform functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection. The system library may include a plurality of functional modules, for example: a surface manager (surface manager), media libraries (Media Libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), and 2D graphics engines (e.g., SGL).
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications. The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphics processing library is used for realizing three-dimensional graphics drawing, image rendering, synthesis, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The hardware abstraction layer may include a plurality of library modules, such as camera library modules, motor library modules, and the like. The Android system can load the corresponding library module for the device hardware, so that the application framework layer can access the device hardware. The device hardware may include, for example, a motor, a camera, etc. in the terminal device.
The kernel layer is a layer between hardware and software, and is used to drive the hardware so that the hardware works. The kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, a motor driver, etc., which is not limited in this embodiment of the present application.
Taking the terminal device being a smart screen and the target application being a smart screen application installed on the smart screen as an example, the technical scheme of the present application and how it solves the above technical problems are described in detail below with reference to fig. 5 to 11. The following embodiments may be implemented independently or combined with each other, and the same or similar concepts or processes may not be described again in some embodiments.
The embodiment of the application provides an interface display method which is applied to an intelligent screen. As shown in fig. 5, the interface display method provided in the present application includes:
S501: the intelligent screen turns on the presentation mode.
Illustratively, the smart screen may turn on the presentation mode when it needs to be placed in a store for display, so that consumers can conveniently experience it. The internal memory of the smart screen is provided with a first field for indicating whether the presentation mode is on. Illustratively, when the first field is the binary number "0", it is used to indicate that the presentation mode is off; when the first field is the binary number "1", it is used to indicate that the presentation mode is on.
Specifically, S501 may be implemented as follows: as shown in fig. 6 (a), the smart screen displays a desktop 103, and the desktop 103 includes a "set" icon 104. As shown in fig. 6 (a)-(b), in response to a store person's trigger operation of the "set" icon 104, the smart screen displays a third interface 107, the third interface 107 including a second control 106 for indicating whether to turn the presentation mode on or off. As shown in fig. 6 (b)-(c), the smart screen turns on the presentation mode in response to the user's opening operation of the second control 106. In this way, the first field set in the internal memory for indicating whether the presentation mode is on is switched from the binary number "0" to the binary number "1".
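The first field described above can be sketched as a small persisted flag store. This is a hypothetical illustration only; the class and method names below are invented, and a plain map stands in for the internal memory:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the "first field": a persisted flag that records
// whether the presentation (demo) mode is on. "0" = off, "1" = on, per the
// text above. A HashMap stands in for the smart screen's internal memory.
public class DemoModeStore {
    private static final String FIRST_FIELD = "presentation_mode";
    private final Map<String, String> storage = new HashMap<>();

    public DemoModeStore() {
        storage.put(FIRST_FIELD, "0"); // presentation mode off by default
    }

    // Called when the user toggles the second control on the third interface.
    public void setPresentationMode(boolean on) {
        storage.put(FIRST_FIELD, on ? "1" : "0");
    }

    public boolean isPresentationModeOn() {
        return "1".equals(storage.get(FIRST_FIELD));
    }

    public static void main(String[] args) {
        DemoModeStore store = new DemoModeStore();
        System.out.println(store.isPresentationModeOn()); // false
        store.setPresentationMode(true);                  // opening operation of the second control
        System.out.println(store.isPresentationModeOn()); // true
    }
}
```

Keeping the flag as a stored field (rather than in-memory state only) matches the text's point that later steps read it back to decide whether the vivid mode may be turned on.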
S502: and the intelligent screen starts a bright mode under the precondition that the intelligent screen detects that the demonstration mode is started.
Illustratively, when the first field in the internal memory for indicating whether the presentation mode is on is the binary number "1", the smart screen determines that the precondition that the presentation mode is on is satisfied. At this time, the smart screen turns on the vivid mode. Illustratively, the internal memory of the smart screen is provided with a second field for indicating whether the vivid mode is on. Illustratively, when the second field is "on", it is used to indicate that the vivid mode is on; when the second field is "down", it is used to indicate that the vivid mode is off.
When the vivid mode is on, a second field set in the internal memory of the smart screen for indicating whether the vivid mode is on is updated from "down" to "on".
It will be appreciated that the vivid mode only needs to be turned on when the smart screen is placed in a store. Therefore, taking the presentation mode being on as the condition for turning on the vivid mode can prevent the vivid mode from being turned on by an accidental touch when a consumer uses the smart screen normally.
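The gating logic of S502 can be sketched as follows. This is an illustrative model, not the patent's implementation; the class name is invented, and the "on"/"down" string values for the second field are taken from the text above:

```java
// Hypothetical sketch: the vivid mode is enabled only when the
// presentation-mode precondition holds, so an accidental tap in normal
// consumer use cannot turn it on.
public class VividModeGate {
    private boolean presentationModeOn = false;
    private String secondField = "down"; // "on" = vivid mode on, "down" = off

    public void openPresentationMode() { presentationModeOn = true; } // S501

    // S502: returns true only if the vivid mode was actually turned on.
    public boolean tryEnableVividMode() {
        if (!presentationModeOn) {
            return false; // precondition not met; the request is ignored
        }
        secondField = "on";
        return true;
    }

    public boolean isVividModeOn() { return "on".equals(secondField); }

    public static void main(String[] args) {
        VividModeGate gate = new VividModeGate();
        System.out.println(gate.tryEnableVividMode()); // false: demo mode not on yet
        gate.openPresentationMode();
        System.out.println(gate.tryEnableVividMode()); // true
    }
}
```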
Alternatively, in S502 above, the presentation mode being on may be omitted as the condition for turning on the vivid mode, in which case S501 above may also be omitted.
Illustratively, the specific implementation of S502 described above includes, but is not limited to, the following two ways:
First way: as shown in fig. 7 (a), the smart screen is in the off state. As shown in fig. 7 (b), when the smart screen switches from the off state to the on state, it detects whether the smart screen application is installed; when the smart screen application is installed, the smart screen turns on the vivid mode after a second preset period (e.g., 30 s). Alternatively, if the smart screen has not responded to any trigger operation after the second preset period (e.g., 30 s), it starts the smart screen application; when the smart screen detects that the smart screen application has started successfully, it turns on the vivid mode.
It can be understood that only after the smart screen application is installed can the smart screen start the application and cyclically play the images in the preset image set. Thus, the smart screen application being installed can serve as one of the preconditions for turning on the vivid mode. In addition, since the system of the smart screen needs some time to initialize after a power-on or reboot, turning on the vivid mode after the second preset period has a higher success rate.
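The boot-time delay of the first way can be sketched with an injected clock. The names and the 30-second constant-as-field design are assumptions for testability, not the patent's code:

```java
// Hypothetical sketch of the first way: the vivid mode turns on only once
// the second preset period (30 s here) has elapsed since power-on AND the
// smart-screen application is installed, giving the system time to finish
// initializing. Time is passed in explicitly so the logic is testable.
public class BootDelayedVivid {
    private static final long SECOND_PRESET_MS = 30_000; // the "second preset period"
    private final long bootTimeMs;
    private boolean vividOn = false;

    public BootDelayedVivid(long bootTimeMs) { this.bootTimeMs = bootTimeMs; }

    // Polled after power-on; returns whether the vivid mode is on.
    public boolean tick(long nowMs, boolean appInstalled) {
        if (appInstalled && nowMs - bootTimeMs >= SECOND_PRESET_MS) {
            vividOn = true;
        }
        return vividOn;
    }

    public static void main(String[] args) {
        BootDelayedVivid screen = new BootDelayedVivid(0);
        System.out.println(screen.tick(10_000, true)); // false: still initializing
        System.out.println(screen.tick(30_000, true)); // true: delay elapsed
    }
}
```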
Second way: as shown in fig. 8 (a), the smart screen is in the off state. As shown in fig. 8 (b), in response to a store person's trigger operation, the smart screen switches from the off state to the on state and displays the desktop 103 of the smart screen. The desktop 103 of the smart screen includes a "set" icon 104. In response to a store person's trigger operation of the "set" icon 104, the smart screen displays a setting interface 105 of the image mode. The setting interface 105 of the image mode includes a vivid mode option, a standard mode option, and a soft mode option. As shown in fig. 8 (c), the smart screen 100 may switch from the non-vivid mode to the vivid mode in response to a consumer's trigger operation of the vivid mode option.
S503: and the intelligent screen circularly plays the images in the preset image set on the first interface corresponding to the intelligent screen application.
Illustratively, as shown in fig. 7 (c) and fig. 8 (d), after the smart screen application is started, the smart screen cyclically plays the images in the preset image set while in the vivid mode. The vividness of the interface displayed in the vivid mode is higher than that of the interface displayed in the non-vivid mode. For example, a display parameter of the smart screen in the vivid mode may be higher than the display parameter in the non-vivid mode, where the display parameter includes at least one of brightness, contrast, saturation, resolution, color temperature, and hue. Thus, the images cyclically played while the smart screen is in the vivid mode help to attract consumers to experience the smart screen.
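The display-parameter relationship described above can be sketched as a simple comparison between two parameter profiles. The numeric values and the three-parameter subset below are invented purely for illustration; the patent lists six candidate parameters and does not fix any values:

```java
// Hypothetical sketch: each listed display parameter is higher in the
// vivid mode than in the non-vivid mode. Values are illustrative only.
public class DisplayParams {
    final int brightness, contrast, saturation;

    DisplayParams(int brightness, int contrast, int saturation) {
        this.brightness = brightness;
        this.contrast = contrast;
        this.saturation = saturation;
    }

    // True when every parameter exceeds the other profile's value.
    boolean allHigherThan(DisplayParams other) {
        return brightness > other.brightness
                && contrast > other.contrast
                && saturation > other.saturation;
    }

    public static void main(String[] args) {
        DisplayParams vivid = new DisplayParams(90, 85, 80);    // assumed values
        DisplayParams standard = new DisplayParams(60, 55, 50); // assumed values
        System.out.println(vivid.allHigherThan(standard)); // true
    }
}
```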
S504: the smart screen 100 switches from the vivid mode to the non-vivid mode in response to a user's turning-off operation of the vivid mode.
Specifically, the smart screen 100 may close the smart screen application to exit the first interface in response to an exit operation input by the user on the first interface; the smart screen 100 displays a second interface in response to a user's operation of opening the second interface, the second interface including a first control; and the smart screen 100 switches from the vivid mode to the non-vivid mode in response to a user's trigger operation of the first control. It will be appreciated that, through the operations described above, the smart screen 100 can be switched from the vivid mode to the non-vivid mode.
It will be appreciated that when the vivid mode is off, the second field set by the internal memory 121 of the smart screen 100 for indicating whether the vivid mode is on is updated from "on" to "down".
Illustratively, as shown in fig. 9 (a), when a consumer wants to experience the video function of the smart screen 100 in the non-vivid mode, the smart screen 100 may display its desktop 103 in response to the consumer's trigger operation of the close control 102 (i.e., the first control) in the first interface 101. The desktop 103 includes the icon 104 of the "set" application. As shown in fig. 9 (b), the smart screen 100 may display a setting interface 105 of the image mode in response to the consumer's trigger operation of the "set" icon 104; the setting interface 105 of the image mode includes the close control 102, as well as a vivid mode option, a standard mode option, and a soft mode option. As shown in fig. 9 (c), the smart screen 100 may switch from the vivid mode to the standard mode (i.e., a non-vivid mode) in response to the consumer's trigger operation of the standard mode option.
As shown in fig. 10 (a)-(b), the smart screen 100 may close the setting interface 105 of the image mode and display the desktop 103 of the smart screen 100 in response to a user's trigger operation of the close control 102 in the setting interface 105; the desktop 103 of the smart screen 100 includes the icon 108 of the video application. As shown in fig. 10 (b)-(c), the smart screen 100 may display a video playback interface in response to the consumer's trigger operation of the icon 108 of the video application.
S505: the intelligent screen 100 is in the non-vivid mode, monitors whether a trigger operation input by a user is received within a first preset time period, and if so, performs S506.
S506: the smart screen 100 starts the smart screen application, and switches from the non-vivid mode to the vivid mode, so as to circularly play the images in the preset image set on the first interface corresponding to the smart screen application.
When the consumer has finished experiencing the video function, closed the video playback interface, and walked away from the smart screen 100, the smart screen 100 no longer receives trigger operations from the consumer. In this way, when the smart screen 100 detects that no trigger operation has been received within the first preset period (e.g., 1 min or 2 min), it starts the smart screen application and switches from the non-vivid mode back to the vivid mode. It will be appreciated that when the vivid mode is on, the second field set in the internal memory 121 of the smart screen 100 for indicating whether the vivid mode is on is updated from "down" to "on".
Further, as shown in fig. 10 (d), the images in the preset image set are cyclically played on the first interface corresponding to the smart screen application. The vividness of the interface displayed by the smart screen 100 in the vivid mode is higher than that of the interface displayed in the non-vivid mode. For example, a display parameter of the smart screen 100 in the vivid mode may be higher than the display parameter in the non-vivid mode, where the display parameter includes at least one of brightness, contrast, saturation, resolution, color temperature, and hue. In this way, the images cyclically played while the smart screen 100 is in the vivid mode help to attract consumers to experience the smart screen 100, so that the smart screen 100 can maintain its attraction to consumers by cyclically playing the images displayed on the first interface.
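The idle-timeout behavior of S505-S506 can be sketched as a small state machine with an explicit clock. This is an assumed model for illustration; the class and method names are invented, and time is injected rather than read from the system:

```java
// Hypothetical sketch of S505-S506: while in the non-vivid mode, if no
// trigger operation arrives within the first preset period, the device
// starts the smart-screen application and switches back to the vivid mode.
public class IdleVividWatcher {
    private final long firstPresetMs; // the "first preset period", e.g. 60 s
    private long lastTriggerMs;
    private boolean vividOn = false;
    private boolean appRunning = false;

    public IdleVividWatcher(long firstPresetMs, long nowMs) {
        this.firstPresetMs = firstPresetMs;
        this.lastTriggerMs = nowMs;
    }

    // A consumer interaction resets the idle timer; the device stays non-vivid.
    public void onTrigger(long nowMs) {
        lastTriggerMs = nowMs;
    }

    // Polled periodically while in the non-vivid mode (S505).
    public void tick(long nowMs) {
        if (!vividOn && nowMs - lastTriggerMs >= firstPresetMs) {
            appRunning = true; // S506: start the smart-screen application
            vividOn = true;    // and switch from non-vivid back to vivid
        }
    }

    public boolean isVividOn() { return vividOn; }
    public boolean isAppRunning() { return appRunning; }

    public static void main(String[] args) {
        IdleVividWatcher w = new IdleVividWatcher(60_000, 0);
        w.onTrigger(20_000);               // consumer taps the screen
        w.tick(70_000);                    // only 50 s idle: stay non-vivid
        System.out.println(w.isVividOn()); // false
        w.tick(80_000);                    // 60 s idle: switch to vivid
        System.out.println(w.isVividOn()); // true
    }
}
```

Resetting the timer on every trigger operation is what keeps the vivid mode from interrupting a consumer who is actively using the device, which is the point the summary paragraph below makes.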
In summary, in the interface display method provided in the embodiment of the present application, if the smart screen 100 in the non-vivid mode does not detect a trigger operation input by the user within the first preset period, it starts the smart screen application; the smart screen 100 switches from the non-vivid mode to the vivid mode to cyclically play the images in the preset image set on the first interface corresponding to the smart screen application. At this time, the vividness of the images cyclically played by the smart screen 100 is higher than the vividness of the interface displayed in the non-vivid mode. Therefore, without affecting the user's use of the smart screen 100, the smart screen 100 can automatically return to the vivid mode and cyclically play the images without manual intervention, which maintains the attraction to consumers and reduces labor cost.
Next, another implementation manner of the interface display method provided in the embodiment of the present application is described with reference to fig. 11, where, as shown in fig. 11, the interface display method provided in the embodiment of the present application includes:
S1101: the interface corresponding to the "set" application of the smart screen 100 turns on the presentation mode in response to a user's trigger operation.
S1102: at power-on, the event management service (hirmsservess) of the smart screen 100 detects whether the smart screen application is installed and whether the presentation mode is on, and if so, performs S1103.
S1103: the event management service of the smart screen 100 waits for a second preset time period and broadcasts a notification "set" the application to turn on the vivid mode.
S1104: the "set" application of the smart screen 100 opens the vivid mode, storing a second field for indicating that the vivid mode is open.
S1105: the smart screen application of the smart screen 100 is turned on, and the images in the preset image set are circularly played on the first interface corresponding to the smart screen application.
S1106: the smart screen application of the smart screen 100 closes the first interface in response to the input exit operation.
S1107: the interface corresponding to the "set" application of the intelligent screen 100 turns off the vivid mode in response to a trigger operation by the user.
S1108: and when the triggering operation is not received after the first preset time, starting the intelligent screen application of the intelligent screen.
S1109: the event management service of the smart screen 100 determines whether the presentation mode is on when detecting that the smart screen application is on, and if so, performs S1110.
S1110: the event management service of the intelligent screen 100 notifies "set" the application to start the vivid mode.
S1111: the setting "application of the smart screen 100, in response to the notification, stores a second field for indicating that the vivid mode is on.
In addition, in the interface display method provided in the embodiment of the present application, the mentioned trigger operation may include a click operation, a long-press operation, a gesture trigger operation, and the like, which is not limited herein.
In addition, in the interface display method provided in the embodiment of the present application, the mentioned smart screen may be replaced by a PC, a mobile phone, a tablet, or the like, which is not limited herein.
Referring to fig. 12, an interface display apparatus 1200 is further provided in the embodiments of the present application, including: a processing unit 1201, configured to monitor whether a trigger operation input by a user is received within a first preset duration in a non-bright mode; if the triggering operation input by the user is not monitored within the first preset time period, starting the target application; switching from the non-vivid mode to the vivid mode. And a display unit 1202, configured to circularly play the images in the preset image set on the first interface corresponding to the target application, where the vividness of the interface displayed by the terminal device in the vivid mode is higher than the vividness of the interface displayed by the terminal device in the non-vivid mode.
In one possible implementation, the processing unit 1201 is also configured to turn on the vivid mode. The display unit 1202 is configured to circularly play, on a first interface corresponding to the target application, an image in a preset image set; the processing unit 1201 is further configured to switch from the vivid mode to the non-vivid mode in response to a user's turning-off operation of the vivid mode.
In one possible implementation, the processing unit 1201 is specifically configured to detect, upon a power-on or restart, whether the target application is installed; and when the target application is installed, turn on the vivid mode after a second preset period.
Alternatively, in another possible implementation, the processing unit 1201 is specifically configured to turn on the vivid mode when detecting that the target application is turned on.
In one possible implementation manner, the display unit 1202 is specifically configured to close the target application to exit the first interface in response to an exit operation input by the user at the first interface; the display unit 1202 is further configured to display a second interface in response to an opening operation of the second interface by a user, where the second interface includes a first control; the processing unit 1201 is configured to switch from the vivid mode to the non-vivid mode in response to a trigger operation of the first control by a user.
In a possible embodiment, the processing unit 1201 is specifically configured to turn on the vivid mode when recognizing that the demonstration mode is turned on.
In a possible implementation manner, the display unit 1202 is further configured to display a third interface, where the third interface includes a second control, and the second control is configured to instruct to turn on or off the presentation mode; the processing unit 1201 is configured to turn on the presentation mode in response to an opening operation of the second control by the user.
In one possible embodiment, the display parameter of the terminal device in the vivid mode is higher than the display parameter in the non-vivid mode, wherein the display parameter includes at least one of brightness, contrast, saturation, resolution, color temperature, and hue.
Fig. 13 is a schematic hardware structure of a terminal device according to an embodiment of the present application, as shown in fig. 13, where the terminal device includes a processor 1301, a communication line 1304, and at least one communication interface (illustrated in fig. 13 by taking a communication interface 1303 as an example).
Processor 1301 may be a general purpose central processing unit (central processing unit, CPU), microprocessor, application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the present application.
Communication line 1304 may include circuitry for communicating information between the components described above.
The communication interface 1303 uses any transceiver-like device for communicating with other devices or communication networks, such as ethernet, wireless local area network (wireless local area networks, WLAN), etc.
Possibly, the terminal device may also comprise a memory 1302.
The memory 1302 may be a read-only memory (read-only memory, ROM) or another type of static storage device that can store static information and instructions, a random access memory (random access memory, RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (electrically erasable programmable read-only memory, EEPROM), a compact disc read-only memory (compact disc read-only memory, CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory may be self-contained and coupled to the processor via communication line 1304, or may be integrated with the processor.
The memory 1302 is used for storing computer-executable instructions for executing the embodiments of the present application, and is controlled by the processor 1301 to execute the instructions. The processor 1301 is configured to execute computer-executable instructions stored in the memory 1302, thereby implementing the interface display method provided in the embodiment of the present application.
Possibly, the computer-executed instructions in the embodiments of the present application may also be referred to as application program code, which is not specifically limited in the embodiments of the present application.
In a particular implementation, processor 1301 may include one or more CPUs, such as CPU0 and CPU1 of FIG. 13, as an embodiment.
In a specific implementation, as an embodiment, the terminal device may include multiple processors, such as processor 1301 and processor 1305 in fig. 13. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
Fig. 14 is a schematic structural diagram of a chip according to an embodiment of the present application. Chip 140 includes one or more (including two) processors 1410 and a communication interface 1430.
In some implementations, memory 1440 stores the following elements: executable modules or data structures, or a subset thereof, or an extended set thereof.
In an embodiment of the present application, memory 1440 may include read only memory and random access memory and provide instructions and data to processor 1410. A portion of memory 1440 may also include non-volatile random access memory (non-volatile random access memory, NVRAM).
In the illustrated embodiment, the processor 1410, the communication interface 1430, and the memory 1440 are coupled together by a bus system 1420. The bus system 1420 may include a power bus, a control bus, a status signal bus, and the like, in addition to a data bus. For ease of description, the various buses are labeled as bus system 1420 in fig. 14.
The methods described in the embodiments of the present application may be applied to the processor 1410 or implemented by the processor 1410. The processor 1410 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated hardware logic circuitry or software instructions in the processor 1410. The processor 1410 may be a general purpose processor (e.g., a microprocessor or a conventional processor), a digital signal processor (digital signal processing, DSP), an application specific integrated circuit (application specific integrated circuit, ASIC), a field-programmable gate array (field-programmable gate array, FPGA) or other programmable logic device, discrete gates, transistor logic, or discrete hardware components, and the processor 1410 may implement or perform the methods, steps, and logic blocks disclosed in the embodiments herein.
The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in hardware, in a decoding processor, or in a combination of hardware and software modules in a decoding processor. The software modules may be located in a storage medium well established in the art, such as a random access memory, a read-only memory, a programmable read-only memory, or an electrically erasable programmable read-only memory (electrically erasable programmable read only memory, EEPROM). The storage medium is located in memory 1440; processor 1410 reads the information in memory 1440 and performs the steps of the method described above in conjunction with its hardware.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions in accordance with embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (e.g., coaxial cable, optical fiber, digital subscriber line (digital subscriber line, DSL)) or wireless means (e.g., infrared, radio, microwave, etc.), or may reside on a semiconductor medium (e.g., a solid state disk (solid state disk, SSD)) or the like.
Embodiments of the present application also provide a computer-readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
As one possible design, the computer-readable medium may include compact disk read-only memory (CD-ROM), RAM, ROM, EEPROM, or other optical disk memory; the computer readable medium may include disk storage or other disk storage devices. Moreover, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, includes Compact Disc (CD), laser disc, optical disc, digital versatile disc (digital versatile disc, DVD), floppy disk and blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media. The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes or substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (6)

1. An interface display method, characterized in that the method comprises:
the terminal device turns on a demonstration mode;
the terminal device determines that a target application is installed and that the demonstration mode is turned on;
the terminal device turns on a vivid mode;
the terminal device starts the target application and cyclically plays images from a preset image set on a first interface corresponding to the target application;
the terminal device closes the first interface in response to an exit operation input by a user;
the terminal device, in response to a triggering operation by the user, turns off the vivid mode and enters a non-vivid mode;
the terminal device, while in the non-vivid mode, monitors whether a triggering operation input by the user is received within a first preset time period;
if the terminal device does not detect a triggering operation input by the user within the first preset time period, the terminal device starts the target application;
the terminal device determines that the demonstration mode is turned on;
the terminal device automatically switches from the non-vivid mode to the vivid mode so as to turn on the vivid mode, and cyclically plays the images from the preset image set on the first interface corresponding to the target application, wherein the brightness of the interface displayed by the terminal device in the vivid mode is higher than that of the interface displayed in the non-vivid mode.
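The state transitions recited in claim 1 (demonstration mode on, user trigger exits vivid mode, idle timeout restores it) can be sketched as a small state machine. This is only an illustrative sketch: the class, method names, and the 30-second timeout value are assumptions for illustration, not taken from the patent.

```python
import time

class DemoTerminal:
    """Illustrative sketch of the demonstration-mode flow described in claim 1."""

    IDLE_TIMEOUT = 30.0  # "first preset time period" in seconds (assumed value)

    def __init__(self):
        self.demo_mode = False       # demonstration mode
        self.vivid_mode = False      # vivid (high-brightness) display mode
        self.playing_demo = False    # cyclically playing the preset image set
        self.last_user_input = time.monotonic()

    def enable_demo_mode(self):
        """Turn on demonstration mode; vivid mode and the image loop follow."""
        self.demo_mode = True
        self.vivid_mode = True
        self.playing_demo = True

    def on_user_trigger(self):
        """A user triggering operation exits the image loop and vivid mode."""
        self.last_user_input = time.monotonic()
        self.playing_demo = False
        self.vivid_mode = False      # enter non-vivid mode

    def tick(self):
        """Poll for the idle timeout while in the non-vivid mode."""
        idle = time.monotonic() - self.last_user_input
        if self.demo_mode and not self.vivid_mode and idle >= self.IDLE_TIMEOUT:
            # No user trigger within the preset period: restart the target
            # application and automatically switch back to vivid mode.
            self.vivid_mode = True
            self.playing_demo = True
```

In a real device `tick` would be driven by a timer or event loop; here it is a plain method so the timeout logic is easy to follow.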
2. The method according to claim 1, wherein the demonstration mode is turned on in the following manner:
the terminal device displays a third interface, wherein the third interface comprises a second control used to indicate turning the demonstration mode on or off;
and the terminal device turns on the demonstration mode in response to the user's opening operation on the second control.
3. The method according to claim 1 or 2, wherein the display parameters of the terminal device in the vivid mode are higher than those in the non-vivid mode, the display parameters comprising at least one of brightness, contrast, saturation, resolution, color temperature, and hue.
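The parameter relationship in claim 3 (at least one display parameter higher in vivid mode) can be checked mechanically. The profile dictionaries and their values below are purely illustrative assumptions, not values from the patent.

```python
# Hypothetical display profiles; the parameter values are illustrative only.
NON_VIVID = {"brightness": 0.5, "contrast": 0.5, "saturation": 0.5}
VIVID     = {"brightness": 0.9, "contrast": 0.7, "saturation": 0.8}

def is_vivid_profile(candidate, baseline):
    """Claim 3 requires at least one display parameter to be higher
    in the vivid mode than in the non-vivid mode."""
    return any(candidate[key] > baseline[key] for key in baseline)
```

`any` (rather than `all`) mirrors the claim language "at least one of": a single raised parameter suffices.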
4. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, causes the terminal device to perform the method according to any of claims 1 to 3.
5. A computer-readable storage medium storing a computer program, which, when executed by a processor, causes a computer to perform the method of any one of claims 1 to 3.
6. A computer program product comprising a computer program which, when run, causes a computer to perform the method of any one of claims 1 to 3.
CN202210982408.8A (priority date 2022-08-16, filed 2022-08-16): Interface display method and device. Status: Active. Granted as CN116088944B (en).

Priority Applications (1)

Application Number   Priority Date  Filing Date  Title
CN202210982408.8A    2022-08-16     2022-08-16   Interface display method and device

Publications (2)

Publication Number   Publication Date
CN116088944A (en)    2023-05-09
CN116088944B (en)    2024-04-02

Family

ID=86212518

Country Status (1)

Country: CN (CN116088944B)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108196772A (en) * 2018-01-19 2018-06-22 捷开通讯(深圳)有限公司 Demonstration exchange method, mobile terminal and the mobile terminal exhibition booth of a kind of mobile terminal
CN110662110A (en) * 2019-10-17 2020-01-07 深圳Tcl新技术有限公司 Mode switching method of smart television, smart television and storage medium
WO2021237425A1 (en) * 2020-05-25 2021-12-02 深圳传音控股股份有限公司 Screen brightness adjustment method, terminal and storage medium
CN114035921A (en) * 2021-11-15 2022-02-11 北京安云世纪科技有限公司 Application operation mode switching method, device, equipment and storage medium
CN114512094A (en) * 2020-11-16 2022-05-17 华为技术有限公司 Screen color adjusting method, device, terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant