CN117707403A - Display method and related device - Google Patents

Info

Publication number: CN117707403A
Application number: CN202310891688.6A
Authority: CN (China)
Prior art keywords: interface, animation, layer, user, display
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 兰向宇
Current Assignee: Honor Device Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Priority to CN202310891688.6A
Publication of CN117707403A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application provide a display method and a related device. The method includes: displaying a first interface of a first application in full screen, where the first interface includes a first control; receiving an operation of a user pressing the first control and sliding, and displaying a first animation and a layer carrying the first animation; when the user lifting operation is received at a first moment, hiding the layer carrying the animation and displaying the first interface in a first area of the screen, where the size of the first area is smaller than the size of the full screen; and when the user lifting operation is received at a second moment, hiding the layer and displaying the first interface in full screen, where the second moment is earlier than the first moment. In this way, when the user has not completed the operation that triggers the terminal device to display the first interface in the first area, the terminal device can hide the layer first rather than hiding it after the first interface is displayed in full screen; the user therefore never sees the layer disappear, and screen flicker is avoided.

Description

Display method and related device
Technical Field
The embodiments of the present application relate to the field of terminal technologies, and in particular, to a display method and a related device.
Background
With the development of terminal technology and the maturing of touch technology, the display modes of interfaces continue to be enriched. For example, a terminal device may display an interface in full screen or in split screen, display an interface in a floating window, minimize an interface, reduce the size of an interface to facilitate one-handed operation, and so on.
Currently, when a terminal device displays an interface of an application in full screen, a control can be shown on the interface, and the user presses and slides the control to trigger the terminal device to adjust the display mode of the interface. While the user slides the control, the terminal device displays an animation of the interface sliding. However, if the user lifts the finger halfway through the slide, the terminal device displays the interface in full screen while the layer carrying the animation is still visible; the terminal device then performs the operation of hiding the layer, so the user sees the layer disappear and the screen flickers.
Disclosure of Invention
The embodiments of the present application provide a display method and a related device, which are applied to the field of terminal technologies and can prevent screen flicker on a terminal device.
In a first aspect, an embodiment of the present application provides a display method. The method includes: a terminal device displays a first interface of a first application in full screen, where the first interface includes a first control. In this embodiment, when the terminal device displays the first interface of the first application in full screen, a first control is set on the first interface, and the user can operate the first control to trigger the terminal device to adjust the display mode of the first interface. The display mode of the first interface may include, but is not limited to: full-screen display, split-screen display, display in a floating window, minimized display, reducing the size of the first interface to facilitate one-handed operation, and so on.
In some embodiments, to facilitate user operation, the first control may be disposed at the top of the first interface.
When the user performs the operation of pressing the first control and sliding, the terminal device receives this operation and, in response, displays a first animation and a layer carrying the first animation. The first animation may characterize the first interface moving following the user's slide.
In this embodiment of the present application, when the user performs a lifting operation at a first moment, for example, the user's finger or a stylus leaves the screen, the terminal device receives the user lifting operation at the first moment; the terminal device may then hide the layer carrying the first animation and display the first interface in a first area of the screen, where the size of the first area is smaller than the size of the full screen. In addition, the user may perform the lifting operation without having completed the operation that triggers the terminal device to display the first interface in the first area, for example at a second moment that is earlier than the first moment. When the terminal device receives the user lifting operation at the second moment, the terminal device may hide the layer and display the first interface in full screen.
In this embodiment of the present application, when the user performs the lifting operation without having completed the operation that triggers display in the first area, the terminal device hides the layer before displaying the first interface in full screen. This prevents the terminal device from hiding the layer only after the first interface is displayed, so the user does not see the layer disappear and the screen does not flicker.
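For illustration only, the following is a minimal Java sketch of the ordering described above, with hypothetical helper names rather than the actual implementation of the embodiments: the layer carrying the animation is hidden before any interface is shown, whichever moment the lift occurs at.

```java
// A minimal sketch (hypothetical names, not the patent's actual code) of
// the lift-handling order. The key point is that the layer carrying the
// animation is hidden BEFORE any interface is shown, whether or not the
// first operation was completed.
final class LiftHandlerSketch {
    void onUserLift(boolean firstOperationCompleted) {
        hideAnimationLayer(); // hide first, so the user never sees it vanish
        if (firstOperationCompleted) {
            showInterfaceInFirstArea(); // lift at the first moment
        } else {
            showInterfaceFullScreen();  // lift at the second moment
        }
    }

    void hideAnimationLayer() { /* e.g. mark the layer invisible */ }
    void showInterfaceInFirstArea() { /* split screen or floating window */ }
    void showInterfaceFullScreen() { /* restore full-screen display */ }
}
```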
In one possible implementation, the user presses the first control and slides it, and lifting at the first moment can trigger split-screen display on the terminal device. For example, when the user lifting operation is received at the first moment, the terminal device may hide the layer, display the first interface in the first area, and display a second interface in a second area of the screen. In some embodiments, the second interface may be a desktop or an interface of a second application. The second application may be different from the first application.
In one possible implementation, the user presses the first control and slides it, and lifting at the first moment can trigger the terminal device to display the first interface in the form of a floating window. For example, when the user lifting operation is received at the first moment, the terminal device may hide the layer, display a desktop, and display the first interface in the first area in the form of a floating window above the desktop.
In one possible implementation, the first animation may include multiple frames of images, each of which may include content in the first interface.
In one possible implementation, to reduce the drawing workload of the terminal device, each frame of image may include a display frame and an identification of the first application above the display frame.
In this implementation, the terminal device does not need to draw the content of the first interface on each frame of image; when the first interface contains a large amount of content, drawing it on every frame would increase the workload of the terminal device.
In one possible implementation, the terminal device may perform the steps of the display method when it is in the landscape state. When the terminal device is in the landscape state, operating the first control to trigger left-right split-screen display better matches the user's viewing habits.
The following describes the display method in the embodiment of the present application from the perspective of interaction between internal modules in the terminal device:
In some embodiments, the terminal device includes a window management service (WMS), a layer composition module (SF), and a display screen.
The display screen may display the first interface of the first application in full screen. On receiving the operation of the user pressing the first control and sliding, the display screen may display the first animation and a layer carrying the first animation. Specifically, when the operation of the user pressing the first control and sliding is received, the WMS sends a first notification to the SF. In response to the first notification, the SF may create a layer carrying the first animation and draw the first animation, and the display screen may display the first animation.
When the user lifting operation is received at the first moment, the WMS may send a third notification to the SF. In response to the third notification, the SF may hide the layer, and the display screen displays the first interface in the first area.
To avoid screen flicker on the terminal device, when the user lifting operation is received at the second moment, the WMS may also send a second notification to the SF. In response to the second notification, the SF may hide the layer, and the display screen displays the first interface in full screen.
Specifically, the terminal device may further include a touch data processing module. When the user lifting operation is received at the second moment, the touch data processing module may send a second indication to the WMS, where the second indication indicates that the user has performed the lifting operation. In response to the second indication, the WMS may send the second notification to the SF.
In one possible implementation, the SF hiding the layer specifically includes: the SF destroys the layer, or the SF marks the layer as invisible.
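For illustration, the two hiding strategies can be sketched with Android's public SurfaceControl.Transaction API; the SF-internal calls used by the embodiments are not public, so this only approximates the idea of "destroy" versus "mark invisible".

```java
import android.view.SurfaceControl;

// A sketch of the two hiding strategies using the public
// SurfaceControl.Transaction API (an approximation, not the SF-internal
// implementation of the embodiments).
final class LayerHidingSketch {
    static void hideLayer(SurfaceControl animationLeash, boolean destroy) {
        try (SurfaceControl.Transaction t = new SurfaceControl.Transaction()) {
            if (destroy) {
                t.reparent(animationLeash, null); // detach: the layer is removed
            } else {
                t.setVisibility(animationLeash, false); // kept but not composed
            }
            t.apply();
        }
        if (destroy) {
            animationLeash.release(); // drop our reference to the destroyed layer
        }
    }
}
```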
In a second aspect, an embodiment of the present application provides a terminal device, which may also be referred to as a terminal, a user equipment (UE), a mobile station (MS), a mobile terminal (MT), or the like. The terminal device may be a mobile phone, a smart television, a wearable device, a tablet (Pad), a computer with a wireless transceiving function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, or the like.
The terminal device includes a processor and a memory. The memory stores computer-executable instructions. The processor executes the computer-executable instructions stored in the memory to cause the terminal device to perform the method according to the first aspect.
In a third aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program. The computer program, when executed by a processor, implements a method as in the first aspect.
In a fourth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when run, causes a computer to perform the method as in the first aspect.
In a fifth aspect, embodiments of the present application provide a chip comprising a processor for invoking a computer program in a memory to perform a method as described in the first aspect.
It should be understood that the second aspect to the fifth aspect of the present application correspond to the technical solutions of the first aspect, and the beneficial effects obtained by each aspect and the corresponding possible embodiments are similar and are not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of a terminal device provided in an embodiment of the present application;
Fig. 2 is a block diagram of a terminal device provided in an embodiment of the present application;
Fig. 3A is a schematic view of a scenario to which the display method provided in an embodiment of the present application is applicable;
Fig. 3B is a schematic view of another scenario to which the display method provided in an embodiment of the present application is applicable;
Fig. 4 is a flowchart of a display method;
Fig. 5 is a schematic diagram of an interface of a terminal device;
Fig. 6 is a flowchart of an embodiment of a display method provided in an embodiment of the present application;
Fig. 7 is a schematic diagram of a first moment and a second moment provided in an embodiment of the present application;
Fig. 8 is a schematic diagram of an interface of a terminal device provided in an embodiment of the present application;
Fig. 9 is a flowchart of another embodiment of a display method provided in an embodiment of the present application;
Fig. 10 is a flowchart of another embodiment of a display method provided in an embodiment of the present application.
Detailed Description
For purposes of clarity in describing the embodiments of the present application, in the embodiments of the present application, words such as "exemplary" or "such as" are used to indicate by way of example, illustration, or description. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, and "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" or similar expressions refer to any combination of these items, including any combination of single items or plural items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may be single or plural.
The term "when ..." in the embodiments of the present application may refer to the instant at which a situation occurs, or to a period of time after the situation occurs; this is not particularly limited in the embodiments of the present application. In addition, the displayed first interface is only an example, and the first interface may include more or less content.
The terminal device in the embodiments of the present application may also be any form of electronic device. For example, the electronic device may be a handheld device including a display screen, an in-vehicle device, or the like. For example, the electronic device may be: a mobile phone, a tablet computer, a palmtop computer, a notebook computer, a mobile internet device (MID), a wearable device, a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a cellular phone, a cordless phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA), a handheld device with a wireless communication function, a computing device or another processing device connected to a wireless modem, a vehicle-mounted device, a terminal device in a 5G network, a terminal device in a future evolved public land mobile network (PLMN), or the like; the embodiments of the present application do not limit this.
By way of example and not limitation, in the embodiments of the present application, the electronic device may also be a wearable device. A wearable device, also called a wearable smart device, is a general term for devices that apply wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a hardware device; it realizes powerful functions through software support, data interaction, and cloud interaction. Broadly, wearable smart devices include devices that are full-featured and large-sized and can realize complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on a certain type of application function and need to be used together with other devices such as smartphones, for example, various smart bracelets and smart jewelry for vital-sign monitoring.
In addition, in the embodiments of the present application, the electronic device may also be a terminal device in an internet of things (IoT) system. IoT is an important component of future information technology development; its main technical feature is connecting things to a network through communication technology, thereby realizing an intelligent network of human-machine interconnection and interconnection of things.
The terminal device in the embodiment of the present application may also be referred to as: a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, or a user equipment, etc.
For ease of understanding, the hardware structure of the terminal device is described below. Fig. 1 is a schematic structural diagram of a terminal device provided in an embodiment of the present application. As shown in fig. 1, the terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in this embodiment of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the present application, the terminal device may include more or fewer components than illustrated, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. The terminal device may execute the method provided in the embodiments of the present application through the processor 110.
The terminal device implements display functions through a GPU, a display screen 194, an application processor, and the like. The terminal device may display the page by way of a GPU, a display screen 194, and an application processor, for example. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal device may include 1 or N display screens 194, where N is a positive integer greater than 1.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area. For example, the internal memory 121 may store codes corresponding to the methods provided in the embodiments of the present application.
In addition, on top of the above components, the terminal device runs an operating system, such as an iOS operating system, an Android operating system, or a Windows operating system; this is not limited in the embodiments of the present application. Applications may be installed and run on the operating system.
The software system of the terminal device may adopt a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, a cloud architecture, or the like, which will not be described herein. Fig. 2 is a block diagram of a terminal device according to an embodiment of the present application. The layered architecture divides the software into several layers, each with distinct roles and branches. The layers communicate with each other through a software interface. In some embodiments, the terminal device may be divided into four layers, from top to bottom, an application layer, a framework layer, a hardware abstraction layer (hardware abstraction layer, HAL), a kernel layer (kernel), respectively.
The application layer may include a series of application packages. Applications in the application layer may run by calling the application programming interface (API) provided by the framework layer. By way of example, the application packages may include social applications, video applications, a system user interface (system UI), and the like.
Taking the example in fig. 2 where the application layer may comprise a first application, the first application may be, for example, a third party application or a system application. In some embodiments, the first application may not be a desktop application.
The framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The framework layer includes some predefined functions.
As shown in fig. 2, the framework layer may include: a window management service (WMS), a layer composition module (SF), a touch data processing module, and the like.
The WMS is used to manage windows. For example, the WMS may obtain the display screen size, determine whether there is a status bar, lock the screen, capture the screen, and so on. The display manager is used to manage the lifecycle of displays; it decides how to control the logical displays according to the currently connected physical display devices and, upon a state change, sends notifications to the system and applications, and so on.
In this embodiment of the present application, the WMS may further trigger the SF to hide the layer carrying an interface, so as to avoid screen flicker on the terminal device; details are described in the following embodiments.
In some embodiments, the SF may perform layer composition on an interface to be displayed on the screen, so that the display screen may display the interface.
In addition, the functions of the WMS, SF, and touch data processing module may be described with reference to the following embodiments.
The touch data processing module is configured to receive touch events (MotionEvent) from the touch driver and, based on a touch event, trigger the WMS and the SF to display the corresponding interface; details are described in the following embodiments. In some embodiments, touch events may include, but are not limited to: an ACTION_DOWN event, an ACTION_MOVE event, an ACTION_UP event, and so on. The ACTION_DOWN event characterizes the user touching the screen, for example, the user's finger touching the screen. The ACTION_MOVE event characterizes a slide, for example, the user's finger sliding on the screen. The ACTION_UP event characterizes the user performing a lifting operation, for example, the user's finger leaving the screen. In some embodiments, the user lifting operation may also be referred to as the user lifting up, the user releasing, and the like.
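For illustration, the following sketch shows how such a touch sequence maps onto the standard Android MotionEvent constants; the view class and handler comments are hypothetical, and the real dispatch path through the touch driver and the touch data processing module is device-specific.

```java
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

// A sketch mapping the touch sequence above onto standard Android
// MotionEvent constants (hypothetical view; the driver-level dispatch
// described in the embodiments is not public).
final class FirstControlView extends View {
    FirstControlView(Context context) { super(context); }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN: // the user touches the screen
                // start tracking the press on the first control
                return true;
            case MotionEvent.ACTION_MOVE: // the user's finger slides
                // advance the first animation to follow the slide
                return true;
            case MotionEvent.ACTION_UP:   // the user performs the lifting operation
                // decide between first-moment and second-moment handling
                return true;
            default:
                return super.onTouchEvent(event);
        }
    }
}
```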
The HAL layer is intended to abstract the hardware; it can provide upper-layer applications with a unified interface for querying hardware devices, or provide upper-layer applications with data storage services. As shown in fig. 2, for example, the HAL layer may include: a display driving module and a sensor hardware abstraction (sensor HAL). The display driving module is used to control the display driver to display content such as pictures, the first interface, and notification information on the display screen. The sensor hardware abstraction can control the sensors and monitor event change notifications of the sensor drivers. In some embodiments, the modules in the HAL layer may follow the HAL interface definition language (HIDL) or AIDL.
The kernel layer is a layer between hardware and software. As shown in fig. 2, the kernel layer may include one or more of the following: sensor drive, display drive, touch drive, etc. In an embodiment of the present application, the sensor driver includes a sensor subsystem framework and a sensor subsystem driver, located at a coprocessor (SCP).
In this embodiment of the present application, the touch driver may receive touch data from the touch data acquisition module and send touch events to the touch data processing module based on the touch data. In some embodiments, the touch data may include, but is not limited to: touch position, touch duration, touch distance, and the like. In some embodiments, the touch data may include capacitance sample values at each position on the display screen, and the touch driver may determine the touch position, touch duration, touch distance, and the like based on these capacitance sample values. How the touch driver determines the user's touch action based on the touch data is not described in the embodiments of the present application.
It can be understood that the terminal device may include an application processor (AP) and a coprocessor (CP). The application processor includes the framework layer, the hardware abstraction layer, the kernel layer, and the like; the coprocessor includes a sensor hub.
In some embodiments, the terminal device may also include a hardware layer. For example, the hardware layer may include: and the sensor, the display screen, the touch data acquisition module and other hardware. In some embodiments, the touch data acquisition module may be a touch panel, and the touch driver may drive the touch panel to work, where the touch panel is used for acquiring touch data. In some embodiments, the touch panel may be integrated with the display screen or may be provided separately from the display screen, which is not limited in this embodiment of the application.
It should be understood that in some embodiments, layers that implement the same function may be referred to by other names, or layers that implement the functions of multiple layers may be taken as one layer, or layers that implement the functions of multiple layers may be divided into multiple layers. The embodiments of the present application are not limited in this regard.
Fig. 3A is a schematic view of a scenario to which the display method provided in the embodiments of the present application is applicable. In fig. 3A, the terminal device is a tablet computer as an example. The embodiments of the present application do not limit the screen state of the terminal device; for example, the terminal device may be in a landscape state or a portrait state. In fig. 3A, the tablet computer is in the landscape state as an example.
Referring to a in fig. 3A, the terminal device may display, in full screen, a first interface of the first application that includes the first control 31. For example, in a in fig. 3A, the first application is a memo application, and the first interface of the memo application may be, for example, the memo application home page. It should be understood that the first interface may be any interface of the first application. The embodiments of the present application do not limit the position or form of the first control 31; for example, the first control 31 may be at a preset position of the first interface. In some embodiments, referring to a in fig. 3A, to facilitate user operation, the first control 31 may be at the top of the first interface and displayed in the form of a bar.
The user operates the first control 31 to trigger the terminal device to adjust the display mode of the first interface. In some embodiments, the display mode of the first interface may include, but is not limited to: full-screen display, split-screen display, display in a floating window, minimized display, reducing the size of the first interface for one-handed operation, and so on; the embodiments of the present application do not limit this. The following uses two examples: the user operates the first control 31 to trigger the terminal device to display the first interface in a split-screen manner, and the user operates the first control 31 to trigger the terminal device to display the first interface in the form of a floating window.
First, an example is described in which the user operates the first control 31 to trigger split-screen display of the first interface. In some embodiments, the user may perform an "operation of pressing the first control 31 and sliding", which may also be referred to as the user dragging the first control 31. Performing the operation of pressing the first control 31 and sliding may trigger the terminal device to display the first interface in a split-screen manner.
For example, while the user presses the first control 31 and slides, the terminal device may display a first animation that characterizes the first interface moving following the user's slide. The first animation may include multiple frames of images. In some embodiments, the size of each frame of image may be smaller than the full-screen size. In some embodiments, the size of each frame of image may be a preset size. The position of each frame of image may differ.
In some embodiments, each frame of image may include the content displayed on the first interface; that is, each frame of image is a scaled-down version of the first interface. It can also be said that each frame of image may be the first interface at a preset size.
In some embodiments, when each frame of image includes the content displayed on the first interface, the terminal device needs to draw that content in each frame of image. To reduce the drawing workload of the terminal device, each frame of image may instead include a display frame, with the identification of the first application displayed on the display frame. The identification of the first application may include, but is not limited to: the name of the first application, the icon of the first application, and other information representing the first application.
Illustratively, referring to b-d in fig. 3A, taking any three frames of images in the first animation as an example, the terminal device may display the first animation following the operation of the user pressing the first control 31 and sliding. For example, each frame of image of the first animation may include a display frame 32 and an icon 33 of the first application on the display frame. It should be understood that the embodiments of the present application do not limit the shape, size, or color of the display frame 32, nor the position of the first application's icon 33 within the display frame 32.
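For illustration, drawing one such simplified frame might look like the following sketch, where the helper, corner radius, and colors are assumptions rather than the embodiments' actual drawing code.

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.RectF;

// A sketch of drawing one simplified animation frame: a display frame
// with the first application's icon on it, instead of redrawing the full
// interface content (styling is an illustrative assumption).
final class AnimationFrameSketch {
    static void drawFrame(Canvas canvas, RectF frameBounds, Bitmap appIcon) {
        Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
        paint.setColor(Color.WHITE);
        // The display frame 32 stands in for the scaled-down first interface.
        canvas.drawRoundRect(frameBounds, 24f, 24f, paint);
        // Center the icon 33 of the first application on the frame.
        float left = frameBounds.centerX() - appIcon.getWidth() / 2f;
        float top = frameBounds.centerY() - appIcon.getHeight() / 2f;
        canvas.drawBitmap(appIcon, left, top, null);
    }
}
```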
When the user presses the first control 31, slides it to a first preset area of the terminal device, and then lifts, the terminal device can be triggered to display the first interface in a split-screen manner. For example, referring to e in fig. 3A, the first preset area may be an area on the right side of the screen; the embodiments of the present application do not limit the position of the first preset area.
For example, taking a terminal that can split the screen into two areas as an example, after the user performs the lifting operation, referring to f in fig. 3A, the terminal device may display the first interface in a first area and a second interface in a second area. The embodiments of the present application do not limit how the two areas are divided, or their positions and sizes. In fig. 3A, the first area is on the right side of the second area, and the two areas are the same size.
In some embodiments, the second interface may be a desktop or an interface of another application. The other application may be, for example, an application on the terminal device that has been switched to the background. It should be understood that the details of the second interface are not shown in f of fig. 3A; the text "second interface" represents the second interface displayed in the second area.
Next, referring to fig. 3B, an example is described in which the user operates the first control 31 to trigger the terminal device to display the first interface in the form of a floating window. Illustratively, the user pressing the first control 31, sliding it to a second preset area of the terminal device, and then lifting may trigger the terminal device to display the first interface in the form of a floating window. For example, referring to e in fig. 3B, the second preset area may be an area on the left side of the screen; the embodiments of the present application do not limit the position of the second preset area. It should be understood that a-d in fig. 3B may refer to the descriptions of a-d in fig. 3A; the difference is the direction in which the user slides the first control 31.
After the user performs the lifting operation, referring to f in fig. 3B, the terminal device may display a desktop and display the first interface in a first area in the form of a floating window above the desktop. Alternatively, the terminal device may display a desktop, and a floating window of the first interface is displayed in a first area above the desktop. In some embodiments, the first area may be a preset area, or may be the area where the user lifted. It should be understood that f in fig. 3B shows a desktop including multiple application icons, some of which are obscured by the floating window of the first interface.
In some embodiments, the embodiments of the present application do not limit the operations by which the user operates the first control 31 to trigger split-screen display of the first interface or display of the first interface in the form of a floating window. For example, the user may press the first control 31 and slide a first preset distance to trigger split-screen display of the first interface, or press the first control 31 and slide a second preset distance to trigger display of the first interface in the form of a floating window, where the first preset distance and the second preset distance differ.
In some embodiments, the user may, for example, double-click the first control 31 to trigger split-screen display of the first interface, or long-press the first control 31 to trigger display of the first interface in the form of a floating window.
In some embodiments, the operation performed by the user to "trigger the terminal device to display the first interface in a split-screen manner" or to "trigger the terminal device to display the first interface in the form of a floating window" may be referred to as a first operation.
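For illustration, classifying the first operation by where the drag ends might look like the following sketch; the region boundaries are assumptions matching the "first preset area on the right, second preset area on the left" example above.

```java
// A sketch of classifying the first operation by where the drag ends
// (region boundaries are illustrative assumptions, not the embodiments'
// actual thresholds).
final class FirstOperationSketch {
    enum Result { SPLIT_SCREEN, FLOATING_WINDOW, NOT_COMPLETED }

    static Result classifyLift(float liftX, int screenWidth) {
        float firstPresetAreaStart = screenWidth * 0.75f; // right side
        float secondPresetAreaEnd = screenWidth * 0.25f;  // left side
        if (liftX >= firstPresetAreaStart) {
            return Result.SPLIT_SCREEN;     // e in fig. 3A
        }
        if (liftX <= secondPresetAreaEnd) {
            return Result.FLOATING_WINDOW;  // e in fig. 3B
        }
        return Result.NOT_COMPLETED;        // lift at the second moment
    }
}
```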
The display process of the first interface in fig. 3A and fig. 3B is described below with reference to the structure of the terminal device shown in fig. 2:
Fig. 4 is a flow chart of a display method. Referring to fig. 4, the display method may include:
s401, displaying a first interface of a first application on a display screen in a full screen mode, wherein the first interface comprises a first control.
For the display screen displaying the first interface of the first application in full screen, refer to a in fig. 3A.
S402, on receiving the operation of the user pressing the first control and sliding, the WMS sends a first notification to the SF, where the first notification instructs the SF to draw the first animation.
In some embodiments, the user may perform a pressing operation on the first control, for example, touching the position of the first control. The touch driver may determine, based on the touch data collected by the touch data acquisition module, that the user has pressed the first control, and may send an ACTION_DOWN event to the touch data processing module, where the touch position is the position of the first control. In addition, when the user presses the first control and slides, the touch driver may determine this based on the touch data collected by the touch data acquisition module and send an ACTION_MOVE event to the touch data processing module.
In some embodiments, in response to the ACTION_MOVE event from the touch driver, the touch data processing module may determine that the user has performed the operation of pressing the first control and sliding, and may send a first indication to the WMS. In some embodiments, the first indication may be, for example, the ACTION_MOVE event. It should be understood that the step of the touch data processing module sending the first indication to the WMS is not shown in fig. 4.
The first indication is used to indicate drawing of the first animation. The first animation may characterize the process of the first interface moving following the user's slide; for the first animation, refer to the description of fig. 3A.
S403, the SF responds to the first notification to create a layer carrying the first animation.
In some embodiments, the layer may be an animation-leash layer. In some embodiments, the layer is used to carry the first animation; it can also be said that the first animation is mounted on the layer.
In some embodiments, the size of the layer carrying the first animation may be the same as the full-screen size; in other embodiments, it may be the same as the size of the images in the first animation. The embodiments of the present application do not limit this; the following embodiments take as an example that the size of the layer carrying the first animation is the same as the full-screen size.
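For illustration, a full-screen layer of this kind can be sketched with Android's public SurfaceControl.Builder API; SF's internal layer creation is not public, so the name and sizing below are assumptions that only convey the idea of an animation-leash-style layer.

```java
import android.view.SurfaceControl;

// A sketch of creating a full-screen layer to carry the animation, using
// the public SurfaceControl.Builder API (an approximation; not the
// embodiments' internal layer creation).
final class AnimationLeashSketch {
    static SurfaceControl createLeash(SurfaceControl parent,
                                      int screenWidth, int screenHeight) {
        SurfaceControl leash = new SurfaceControl.Builder()
                .setName("animation-leash-sketch")
                .setParent(parent)
                .setBufferSize(screenWidth, screenHeight) // full-screen size
                .build();
        try (SurfaceControl.Transaction t = new SurfaceControl.Transaction()) {
            t.setVisibility(leash, true).apply(); // make the new layer visible
        }
        return leash;
    }
}
```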
S404, the WMS sends a second indication to the SF, where the second indication indicates information of the images in the first animation.
S405, the SF draws the first animation according to the second indication.
In some embodiments, when the user performs the operation of pressing the first control and starting to slide, the WMS may send the first notification to the SF to instruct the SF to draw the first animation. While the user slides the first control, the WMS may continuously send second indications to the SF; for example, the WMS may send multiple second indications, and their number may equal the number of frames of images in the first animation. One second indication may correspond to one frame of image and include the information of that image. Each time the WMS sends a second indication, the SF may draw one frame of image in the first animation according to it, until the first animation is fully drawn.
For example, the second indication may be a transaction instruction, and the transaction instruction may include the information of the image. The information of the image may include, but is not limited to: the size, position, and visibility of the window to which the image belongs.
In some embodiments, while the user slides the first control, the WMS may instead send a single second indication to the SF that includes the information of every frame of image in the first animation. The SF can then draw the multiple frames of images in sequence according to this second indication to obtain the first animation.
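For illustration, the per-frame information a second indication may carry can be sketched as follows; the field names and handling are assumptions, since the transaction instruction's actual format is internal and not described further by the source.

```java
import android.graphics.Rect;
import java.util.List;

// A sketch of the per-frame information a second indication may carry
// (field names are illustrative assumptions).
final class SecondIndicationSketch {
    static final class FrameInfo {
        final Rect windowBounds;  // size and position of the window the image belongs to
        final boolean visible;    // visibility of the window
        FrameInfo(Rect windowBounds, boolean visible) {
            this.windowBounds = windowBounds;
            this.visible = visible;
        }
    }

    // SF-side handling: draw one frame per FrameInfo until the animation is done.
    static void onSecondIndications(List<FrameInfo> frames) {
        for (FrameInfo frame : frames) {
            // drawOneFrame(frame.windowBounds, frame.visible); // hypothetical
        }
    }
}
```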
S406, the display screen displays the first animation and a layer carrying the first animation.
For the process of the display screen displaying the first animation, refer to the description of fig. 3A. In some embodiments, after the SF draws the first animation, the SF may send the first animation and the layer carrying it for display; for example, the SF may send them to the display driver, so that the display driver drives the display screen to display the first animation and the layer carrying it.
In some embodiments, the display screen displays the first animation and the layer 34 carrying the first animation; the layer 34 is represented by diagonal shading in fig. 3A and 3B.
In some embodiments, the first application may draw the first animation and send it to the SF, and the SF may then send it for display so that the display screen displays the first animation.
S407, at the first moment, in response to the user lifting operation, the WMS sends a third notification to the SF.
In the embodiments of the present application, the first moment and the second moment respectively represent two moments at which the user lifts after performing the operation of pressing the first control and sliding. The first moment characterizes the moment at which the user lifts after completing the first operation; the second moment characterizes a moment at which the user lifts without having completed the first operation; the second moment is earlier than the first moment.
Referring to e in fig. 3A, the user performs the first operation and then lifts; the moment of lifting may be referred to as the first moment. If the user presses the first control and slides but performs the lifting operation at the second moment, the user has not completed the first operation, so the operation performed by the user cannot trigger the terminal device to display the first interface in a split-screen manner or in the form of a floating window.
In some embodiments, when the user completes the first operation and performs the lifting operation at the first moment, the touch data processing module may send a third indication to the WMS according to the touch data from the touch data acquisition module. The third indication indicates that the user has performed the first operation and that the first interface can be displayed in the first area. It should be understood that the step of the touch data processing module sending the third indication to the WMS is not shown in fig. 4.
In response to the third indication, the WMS may send the third notification to the SF. The third notification informs the SF to hide the layer carrying the first animation. In some embodiments, the third notification may be an apply instruction.
S408, the SF hides the layer carrying the first animation.
S409, the display screen displays a first interface in the first area.
In this embodiment of the present application, after the user performs the first operation, the SF first hides the layer carrying the first animation, and the display screen then displays the first interface in the first area. The layer is therefore already hidden when the first interface is displayed, so the user sees the first interface directly and no screen flicker occurs.
If the user performs the lifting operation without completing the first operation, for example, the user presses the first control and slides but lifts before reaching the first preset area or the second preset area, the performed operation cannot trigger the terminal device to display the first interface in a split-screen manner or in the form of a floating window, and the terminal device displays the first interface in full screen.
Referring to c in fig. 5, the user presses the first control and slides but performs the lifting operation without sliding to the first preset area. The touch data processing module determines from the touch data of the touch data acquisition module that the user has not performed the first operation, and does not send the third indication to the WMS; likewise, the WMS does not send the third notification (the apply instruction) to the SF. Because the SF does not receive the third notification, it does not hide the layer 34 carrying the first animation, so when the display screen displays the first interface in full screen, it also displays the layer 34, as shown by d in fig. 5.
In addition, after the display screen displays the first interface in full screen, the WMS determines that the first animation is canceled (or that its display has ended), and only then sends the third notification to the SF. In response, the SF hides the layer 34 carrying the first animation, so the user sees the layer 34 disappear, as shown by e in fig. 5. Because the layer disappears only after the first interface is displayed in full screen, the terminal device flickers and the user sees the screen flicker. It should be understood that a-b in fig. 5 may refer to the descriptions of a-b in fig. 3A.
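For illustration, the problematic ordering of fig. 5 can be contrasted with the earlier sketch as follows (hypothetical helpers as before): the full-screen display happens first, and the layer is hidden only afterwards.

```java
// A sketch of the problematic ordering in fig. 5 (hypothetical helpers as
// in the earlier sketch): the layer carrying the first animation is hidden
// only AFTER the first interface is displayed full screen, so the user
// sees the layer disappear and the screen flickers.
final class BuggyLiftHandlerSketch {
    void onUserLiftWithoutFirstOperation() {
        showInterfaceFullScreen(); // layer 34 is still visible on top
        hideAnimationLayer();      // the user now sees the layer vanish
    }

    void showInterfaceFullScreen() { /* full-screen display of the first interface */ }
    void hideAnimationLayer() { /* WMS sends the third notification; SF hides */ }
}
```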
In this embodiment of the present application, when the user performs the lifting operation without performing the first operation, the touch data processing module may, on detecting the lifting operation, instruct the WMS to send the SF a second notification to hide the layer. In response to the second notification, the SF can hide the layer carrying the first animation before the first interface is displayed in full screen; the SF therefore never hides the layer only after the full-screen display, and screen flicker on the terminal device can be avoided.
Fig. 6 is a flowchart of an embodiment of a display method according to an embodiment of the present application. Referring to fig. 6, the display method provided in the embodiment of the present application may include:
S601, displaying a first interface of a first application on a display screen in a full screen mode, wherein the first interface comprises a first control.
S602, receiving an operation that a user presses a first control and slides, and the WMS sends a first notification to the SF, wherein the first notification is used for indicating the SF to draw a first animation.
S603, the SF responds to the first notification to create a layer carrying the first animation.
S604, the WMS sends a second instruction to the SF, wherein the second instruction is used for indicating information of the image in the first animation.
S605, the SF draws the first animation according to the second instruction.
S606, the display screen displays the first animation and a layer carrying the first animation.
S601 to S606 can refer to the descriptions in S401 to S406.
S607, at the second moment, in response to the user lifting operation, the touch data processing module sends a fourth indication to the WMS, where the fourth indication indicates that the user has performed the lifting operation.
In some embodiments, the first moment may be the moment at which the user performs the lifting operation after completing the first operation, and the second moment may be a moment at which the user performs the lifting operation without having completed the first operation. In some embodiments, the moment at which the user presses the first control and starts sliding may be taken as a third moment, and the second moment may be any moment between the third moment and the first moment.
Referring to fig. 7, for example, at moment T1 the user presses the first control and starts to slide, and moment T2 is the earliest moment at which, after sliding the first control, the terminal device can be triggered to display the first interface in a split-screen manner or in the form of a floating window. The second moment may be any moment between T1 and T2, and the first moment may be T2 or a moment within a period of time after T2.
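For illustration, the timing relationship of fig. 7 can be sketched as follows; the millisecond representation and the strict comparison are assumptions, since the embodiments define the moments by gesture completion rather than by fixed times.

```java
// A sketch of the timing relationship in fig. 7 (times in milliseconds;
// names are illustrative): T1 is when the user presses the first control
// and starts sliding, and T2 is the earliest moment at which the slide can
// complete the first operation. A lift between T1 and T2 is a second-moment
// lift; a lift at or after T2 can be a first-moment lift.
final class LiftMomentSketch {
    static boolean isFirstMomentLift(long liftTimeMs, long t1Ms, long t2Ms) {
        if (liftTimeMs < t1Ms) {
            throw new IllegalArgumentException("lift cannot precede the press");
        }
        return liftTimeMs >= t2Ms; // otherwise the lift is at the second moment
    }
}
```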
When the user performs the lifting operation without completing the first operation, for example at the second moment, the touch data processing module may determine, in response to the user lifting operation and based on the touch data from the touch data acquisition module, that the user has not completed the first operation and has performed the lifting operation, and may send the WMS a fourth indication indicating that the user has performed the lifting operation.
In some embodiments, the fourth indication is further used to indicate that the first interface is displayed full screen.
S608, the WMS sends a second notification to the SF.
The second notification is used to instruct the SF to hide the layer carrying the first animation and to indicate that the first interface is to be displayed in full screen.
S609, the SF hides the layer carrying the first animation.
In some embodiments, the SF hiding the layer carrying the first animation may specifically be: the SF destroys the layer, or the SF marks the layer as invisible.
S610, the display screen displays the first interface in a full screen mode.
In this embodiment of the present application, because the user has not completed the first operation, the display screen is not triggered to display the first interface in the first area, and it may display the first interface in full screen.
Referring to a-b in fig. 8, the user presses and slides the first control, and the display screen may display the first animation and the layer 34 carrying it. Referring to c in fig. 8, when the user performs the lifting operation without sliding the first control to the first preset area, that is, without performing the first operation, then based on the descriptions of S608-S610, the SF may first hide the layer 34 carrying the first animation, so that the layer 34 is not displayed on the display screen, as shown by d in fig. 8. In addition, in response to the user performing the lifting operation, the display screen may display the first interface in full screen, also as shown by d in fig. 8. It should be understood that a-b in fig. 8 may refer to the descriptions of a-b in fig. 3A.
In this embodiment of the present application, even if the user performs the lifting operation without completing the first operation, to avoid screen flicker on the terminal device, the touch data processing module may notify the WMS when it detects the lifting operation, and the WMS may notify the SF to hide the layer carrying the first animation. In this way, the SF hides the layer before the first interface is displayed in full screen, the layer does not disappear after the full-screen display, and screen flicker on the terminal device can be avoided.
In some embodiments, the WMS may include multiple modules whose interaction implements the steps performed by the WMS in the foregoing embodiments. The following describes, with reference to fig. 9, the interaction among the touch data processing module, the modules in the WMS, and the SF.
In some embodiments, referring to fig. 9, the WMS may include: a gesture management consumer module (HnMultiWdGestureManagerConsumer), an activity task management module (ActivityTaskManager), a gesture management module (HnMultiWdGestureManager), a gesture animation management module (HnGestureAnimation), a gesture animation controller (HnGestureAnimationController), and an animation assistant module (HnMultiWdTopFullAnimationHelper).
Referring to fig. 9, a display method provided in an embodiment of the present application may include:
S901, displaying a first interface of a first application on a display screen in a full screen mode, wherein the first interface comprises a first control.

S902, upon receiving the operation that the user presses the first control and slides, the touch data processing module sends an ACTION_MOVE event to HnMultiWdGestureManagerConsumer.

S903, HnMultiWdGestureManagerConsumer instructs ActivityTaskManager to start drawing the first animation.

For example, HnMultiWdGestureManagerConsumer may call a startGestureAnimation function to instruct ActivityTaskManager to start drawing the first animation.

S904, ActivityTaskManager instructs HnMultiWdGestureManager to start drawing the first animation.

For example, ActivityTaskManager may call a startHnGestureAnimation function to instruct HnMultiWdGestureManager to start drawing the first animation.

S905, HnMultiWdGestureManager instructs HnGestureAnimation to start drawing the first animation.

For example, HnMultiWdGestureManager may call a startGestureAnimation function to instruct HnGestureAnimation to start drawing the first animation.

S906, HnGestureAnimation instructs HnGestureAnimationController to start drawing the first animation.

For example, HnGestureAnimation may call a startAnimation function to instruct HnGestureAnimationController to start drawing the first animation.
S907, hn Gesture Animation Controller send a first notification to the SF.
S908, the SF creates a layer carrying the first animation in response to the first notification.
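For a rough feel of S908, the following sketch creates a dedicated layer with the public SurfaceControl.Builder API. This is only an app-level analogy, since the patent does not describe how the SF creates the layer internally; the layer name and buffer size are placeholders.

```java
import android.view.SurfaceControl;

final class AnimationLayerFactory {
    // Creates a standalone layer that could carry the animation frames.
    static SurfaceControl createAnimationLayer(int widthPx, int heightPx) {
        return new SurfaceControl.Builder()
                .setName("first-animation-layer")  // placeholder name
                .setBufferSize(widthPx, heightPx)  // assumed animation size
                .build();
    }
}
```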
S909, hn Gesture Animation Controller transmits a second instruction to the SF, the second instruction being information indicating an image in the first animation.
S910, the SF draws the first animation according to the second instruction.
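The exact content of the second instruction is not spelled out in this text. As one plausible reading, each per-frame instruction could carry the geometry and transparency of the current animation image, as in the hypothetical types below; FrameInfo and SfClient are assumptions for illustration, not a disclosed format.

```java
// One frame's worth of drawing information for the animation layer.
record FrameInfo(long frameTimeNanos, float left, float top, float scale, float alpha) {}

// Hypothetical entry point through which the controller drives SF's drawing.
interface SfClient {
    void drawFrame(long layerId, FrameInfo frame); // S910: SF draws one frame
}
```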
In some embodiments, the display screen may display the first animation and a layer carrying the first animation, which is not shown in fig. 9.
S911, when the user lifting operation is received at the second moment, the touch data processing module sends the fourth indication to HnMultiWdGestureManagerConsumer.

S912, HnMultiWdGestureManagerConsumer determines that display of the first animation is to be canceled.

S913, HnMultiWdGestureManagerConsumer determines that the layer carrying the first animation needs to be hidden.

S914, HnMultiWdGestureManagerConsumer sends a gesture end indication to HnMultiWdTopFullAnimationHelper.

In some embodiments, the gesture end indication indicates that the user has performed the lifting operation, and may further indicate that display of the first animation is canceled and that the layer carrying the first animation needs to be hidden.

S915, HnMultiWdTopFullAnimationHelper determines to perform the operation of resetting the layer.

In some embodiments, HnMultiWdTopFullAnimationHelper may determine that a reset-layer (resetSurfaceLayer) operation needs to be performed. Resetting a layer may be understood as adjusting an attribute of the layer; in an embodiment of the present application, resetting the layer may be understood as hiding the layer carrying the first animation, for example, by setting the layer's visibility attribute to invisible.
S916, hnMultiWdTop Full Animation Helper sends a second notification to the SF.
S917, SF hides the layer carrying the first animation.
S918, the display screen displays the first interface in full screen.
It should be understood that the embodiment of the present application has the same implementation and technical effects as the embodiment shown in fig. 6 described above, and reference may be made to the description in the above embodiment.
The foregoing embodiments describe how the modules in the terminal device interact to implement the display method. In some embodiments, the display method provided in the embodiments of the present application is described below with reference to fig. 10, with the terminal device as the execution body.
Referring to fig. 10, the display method provided in the embodiment of the present application may include:
S1001, displaying a first interface of the first application in full screen, where the first interface includes a first control.
S1002, receiving the operation that the user presses the first control and slides, and displaying the first animation and the layer carrying the first animation.
S1001 to S1002 may refer to the descriptions in S401 to S407.
S1003, when the user lifting operation is received at the first moment, hiding the layer carrying the first animation, and displaying the first interface in a first area of the screen, wherein the size of the first area is smaller than that of the full screen.

S1004, when the user lifting operation is received at a second moment, hiding the layer carrying the first animation, and displaying the first interface in full screen, wherein the second moment is earlier than the first moment.
It should be understood that S1003 and S1004 are steps that are alternatively performed.
In S1003, when the user performs the lifting operation after completing the first operation, that is, when the terminal device receives the user lifting operation at the first moment, the WMS may send the third notification to the SF, so that the terminal device hides the layer carrying the first animation. In addition, because the user has completed the first operation, the terminal device may display the first interface in the first area of the screen; for details, refer to fig. 3A and fig. 3B.
In S1004, when the user performs the lifting operation without completing the first operation, that is, when the terminal device receives the user lifting operation at the second moment, the second moment is earlier than the first moment. In response to the user lifting operation, the touch data processing module sends the fourth indication to the WMS, and the WMS may send the second notification to the SF to indicate hiding of the layer, so that the terminal device hides the layer carrying the first animation. In addition, because the user has not completed the first operation, the terminal device may display the first interface in full screen; for details, refer to fig. 8.
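Collapsing both branches into one handler gives the minimal sketch below, reusing the public SurfaceControl API as in the earlier sketches. The two show... methods are placeholders for the first-area and full-screen display paths; the point being illustrated is only the ordering, namely that the layer is hidden before the interface change in either branch, so the user never sees the layer disappear on top of an already-visible first interface.

```java
import android.view.SurfaceControl;

class LiftHandler {
    /** S1003/S1004: hide the animation layer first, then update the display. */
    void onUserLift(boolean firstOperationCompleted, SurfaceControl animationLayer) {
        // Hide the layer carrying the first animation in both branches.
        try (SurfaceControl.Transaction t = new SurfaceControl.Transaction()) {
            t.hide(animationLayer);
            t.apply();
        }
        if (firstOperationCompleted) {
            showFirstInterfaceInFirstArea(); // S1003: lift at the first moment
        } else {
            showFirstInterfaceFullScreen();  // S1004: lift at the second moment
        }
    }

    private void showFirstInterfaceInFirstArea() { /* placeholder display path */ }
    private void showFirstInterfaceFullScreen()  { /* placeholder display path */ }
}
```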
The embodiment of the present application has the same implementation and technical effects as the embodiment shown in fig. 6 described above, and reference may be made to the description in the above embodiment.
The display method provided by the embodiments of the present application has been described above; the apparatus for performing the display method is described below. It will be appreciated by those skilled in the art that the method and the apparatus may be combined with and refer to each other, and the related apparatus provided in the embodiments of the present application may perform the steps in the above display method. In some embodiments, the related apparatus may include a terminal device, a chip, a computer-readable storage medium, a computer program product, and the like, as described below.
The display method provided by the embodiment of the application can be applied to the electronic equipment with the communication function. The electronic device includes a terminal device, and specific device forms and the like of the terminal device may refer to the above related descriptions, which are not repeated herein.
An embodiment of the present application provides a terminal device, including: a processor and a memory. The memory stores computer-executable instructions, and the processor executes the computer-executable instructions stored in the memory to cause the terminal device to perform the above display method.
An embodiment of the present application provides a chip. The chip includes a processor, where the processor is configured to invoke a computer program in a memory to execute the technical solutions in the above embodiments. The principle and technical effects thereof are similar to those of the above related embodiments, and are not described herein again.
Embodiments of the present application also provide a computer-readable storage medium. The computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the above method is implemented. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. If implemented in software, the functions may be stored on, or transmitted over as one or more instructions or code on, a computer-readable medium. Computer-readable media may include computer storage media and communication media, and may include any medium that can transfer a computer program from one place to another. The storage medium may be any target medium that is accessible by a computer.
In one possible implementation, the computer-readable medium may include RAM, ROM, a compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (Digital Subscriber Line, DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (Digital Versatile Disc, DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
An embodiment of the present application provides a computer program product including a computer program which, when run, causes a computer to perform the above method.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The foregoing detailed description of the invention has been presented for purposes of illustration and description, and it should be understood that the foregoing is by way of illustration and description only, and is not intended to limit the scope of the invention.

Claims (12)

1. A display method, characterized in that it is applied to a terminal device, the method comprising:
displaying a first interface of a first application in a full screen mode, wherein the first interface comprises a first control;
receiving the operation that a user presses the first control and slides, and displaying a first animation and a layer carrying the first animation;
when the user lifting operation is received at the first moment, hiding the layer, and displaying the first interface in a first area of a screen, wherein the size of the first area is smaller than that of a full screen;
when the user lifting operation is received at a second moment, hiding the image layer, and displaying the first interface in a full screen mode, wherein the second moment is earlier than the first moment.
2. The method of claim 1, wherein the terminal device is in a landscape state.
3. The method of claim 1 or 2, wherein displaying the first interface in a first area of a screen comprises:
displaying the first interface in the first area, and displaying a second interface in a second area of the screen.
4. The method of claim 1 or 2, wherein displaying the first interface in a first area of a screen comprises:
displaying a desktop, and displaying the first interface in the first area in the form of a floating window above the desktop.
5. The method of any of claims 1-4, wherein the first animation comprises a plurality of frames of images, each frame of images comprising: a display frame, and an identification of the first application above the display frame.
6. The method of any of claims 1-5, wherein the first control is at a top of the first interface.
7. The method according to any one of claims 1-6, wherein the terminal device comprises a window management service WMS, a layer composition module SF, and a display screen, and when receiving an operation that a user presses the first control and slides, the method further comprises:
the WMS sends a first notification to the SF;
the SF creates the layer and draws the first animation;
the displaying a first animation, comprising:
the display screen displays the first animation;
when the user lifting operation is received at the second moment, hiding the layer, and displaying the first interface in a full screen mode, wherein the method comprises the following steps:
when receiving the operation of lifting the user at a second moment, the WMS sends a second notification to the SF;
the SF hides the layer;
and displaying the first interface on the display screen in a full screen mode.
8. The method of claim 7, wherein hiding the layer and displaying the first interface in a first area of a screen when the user-lifted operation is received at a first time comprises:
when the operation of lifting the user is received at the first moment, the WMS sends a third notification to the SF;
the SF hides the layer;
and the display screen displays the first interface in the first area.
9. The method of claim 7 or 8, wherein the SF hiding the layer comprises:
the SF destroys the layer or marks the layer as invisible.
10. A terminal device, comprising: a processor and a memory;
the memory stores computer-executable instructions;
the processor executing computer-executable instructions stored in the memory to cause the terminal device to perform the method of any one of claims 1-9.
11. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the method according to any one of claims 1-9.
12. A computer program product comprising a computer program which, when run, causes a computer to perform the method of any of claims 1-9.