CN114398016A - Interface display method and device

Interface display method and device

Info

Publication number
CN114398016A
CN114398016A (application CN202210029399.0A)
Authority
CN
China
Prior art keywords
area
interface
touch
display
detected
Prior art date
Legal status
Granted
Application number
CN202210029399.0A
Other languages
Chinese (zh)
Other versions
CN114398016B (en)
Inventor
于林
张志成
Current Assignee
Jinhua Hongzheng Technology Co., Ltd.
Original Assignee
Jinhua Hongzheng Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Jinhua Hongzheng Technology Co., Ltd.
Priority to CN202210029399.0A
Publication of CN114398016A
Application granted
Publication of CN114398016B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1423: controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1431: using a single graphics controller
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus


Abstract

The application provides an interface display method and device, relating to the technical field of the mobile internet. The interface display method comprises the following steps: displaying, on a display of a terminal device, a first interface comprising a first area, wherein the first area comprises a plurality of first touch areas in one-to-one correspondence with a plurality of device types and with a plurality of second interfaces; detecting a touch operation on the plurality of first touch areas and, in response to the detected touch operation, displaying on the display a second interface that corresponds to the detected first touch area and comprises a second area, wherein the second area, or an interface jumped to through the second area, comprises a plurality of second touch areas in one-to-one correspondence with a plurality of working modes and with a plurality of third interfaces; and detecting a touch operation on the plurality of second touch areas and, in response to the detected touch operation, displaying on the display a third interface corresponding to the detected second touch area. The application can improve the user's experience of using an APP.

Description

Interface display method and device
Technical Field
The application relates to the technical field of mobile internet, in particular to an interface display method and device.
Background
With the rapid development of internet technology, on the one hand, a massive number of applications (APPs) have emerged, and more and more users use various APPs, such as news, entertainment, shopping, and office APPs, through terminal devices such as smartphones and tablet computers (PADs). On the other hand, a user often owns multiple terminal devices (e.g., a wearable device, a mobile phone, a tablet computer, a notebook computer, etc.), and technology for sharing screen display content among different terminal devices (i.e., multi-screen interaction technology) can provide a good use experience. However, existing APP designs do not allow multi-screen interaction technology to be applied effectively, which degrades the user experience.
Disclosure of Invention
Embodiments of the application provide an interface display method and device that improve on existing APP designs, so that multi-screen interaction technology can be better applied and the user experience improved.
In a first aspect, an interface display method is provided, which is applied to a terminal device, and includes:
displaying a first interface on a display of the terminal device, wherein the first interface comprises a first area, the first area comprises a plurality of first touch areas, the plurality of first touch areas are in one-to-one correspondence with a plurality of device types, and the plurality of first touch areas are in one-to-one correspondence with a plurality of second interfaces;
detecting a touch operation on the plurality of first touch areas;
in response to the detected touch operation on a first touch area, displaying a second interface corresponding to the detected first touch area on the display of the terminal device, wherein the second interface comprises a second area, the second area or an interface jumped to through the second area comprises a plurality of second touch areas, the plurality of second touch areas are in one-to-one correspondence with a plurality of working modes, and the plurality of second touch areas are in one-to-one correspondence with a plurality of third interfaces;
detecting a touch operation on the plurality of second touch areas;
and in response to the detected touch operation on a second touch area, displaying a third interface corresponding to the detected second touch area on the display of the terminal device.
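The steps of the first aspect can be sketched as a simple two-level dispatch: a touch on a first touch area selects a device-type-specific second interface, and a touch on a second touch area selects a working-mode-specific third interface. The maps, area identifiers, and interface names below are placeholders invented for illustration and are not defined by the patent.

```python
# Hypothetical sketch of the two-level interface selection described above.
# All identifiers and mappings here are illustrative, not from the patent.

def display(interface: str) -> str:
    # Stand-in for rendering an interface on the terminal device's display.
    return interface

def run_method(first_map: dict, second_map: dict, touches: list) -> list:
    """Step 1: show the first interface; then dispatch each detected touch.

    first_map:  first touch area  -> second interface (per device type)
    second_map: second touch area -> third interface (per working mode)
    Returns the sequence of interfaces shown on the display.
    """
    shown = [display("first_interface")]
    for area in touches:
        if area in first_map:        # touch detected on a first touch area
            shown.append(display(first_map[area]))
        elif area in second_map:     # touch detected on a second touch area
            shown.append(display(second_map[area]))
    return shown
```

For example, a touch on area "A1" followed by a touch on area "B1" would show the first interface, then the second interface for A1's device type, then the third interface for B1's working mode.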
In one possible implementation, different second interfaces differ in one or more of the following: page columns, working module layout, and functional modules; different third interfaces differ in one or more of the following: page columns, working module layout, and functional modules.
In a possible implementation manner, the second area, or the interface jumped to through the second area, includes a first adding area, the first adding area being used for adding a working mode, and the method further includes:
detecting an operation on an interface jumped to through the first adding area;
and displaying the added work mode on the display in response to the detected operation on the interface jumped to through the first adding area.
In a possible implementation manner, the second area, or the interface jumped to through the second area, includes a first deletion area, the first deletion area being used for deleting a working mode, and the method further includes:
detecting an operation on a first deletion area;
and deleting the corresponding working mode in response to the detected operation on the first deletion area.
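The adding-area and deletion-area implementations above amount to maintaining a mutable list of working modes. The following is a minimal sketch under that assumption; the class and method names are invented, not taken from the patent.

```python
# Illustrative sketch of working-mode management: an operation on the first
# adding area appends a mode, and an operation on the first deletion area
# removes the corresponding mode. All names here are hypothetical.

class WorkingModes:
    def __init__(self, modes=None):
        self.modes = list(modes or [])

    def add_mode(self, name: str) -> None:
        """Triggered by an operation on the first adding area."""
        if name not in self.modes:
            self.modes.append(name)

    def delete_mode(self, name: str) -> None:
        """Triggered by an operation on the first deletion area."""
        if name in self.modes:
            self.modes.remove(name)
```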
In a possible implementation manner, the second touch area, or the interface jumped to through the second touch area, includes a configuration area, the configuration area being used for configuring the working mode corresponding to the second touch area, and the method further includes:
detecting an operation on a configuration area;
and displaying the updated third interface on the display of the terminal equipment in response to the detected operation on the configuration area.
In a possible implementation manner, the configuration area includes a text display template configuration area, and the text display template configuration area is used for configuring a text display template, and the method further includes:
detecting an operation on a text display template configuration area;
and in response to the detected operation on the text display template configuration area, if a text needs to be displayed, displaying the text on the display of the terminal device according to the configured text display template.
In one possible implementation manner, the configuration area includes a text display template modification area, and the text display template modification area is used for modifying the text display template, and the method further includes:
detecting an operation on a text display template modification area;
and modifying the text display template in response to the detected operation on the text display template modification area.
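The configuration and modification of a text display template described in the two implementations above can be pictured as follows. The template fields (font size, line spacing) are illustrative assumptions, since the patent does not enumerate them.

```python
# Hypothetical sketch of a text display template: the configuration area
# selects a template for rendering text, and the modification area edits the
# template's fields. Field names are invented for illustration.

class TextDisplayTemplate:
    def __init__(self, font_size=14, line_spacing=1.0):
        self.font_size = font_size
        self.line_spacing = line_spacing

    def modify(self, **changes):
        """Triggered by an operation on the template modification area."""
        for key, value in changes.items():
            if hasattr(self, key):
                setattr(self, key, value)

def render_text(text: str, template: TextDisplayTemplate) -> dict:
    """Display the text according to the configured template."""
    return {"text": text,
            "font_size": template.font_size,
            "line_spacing": template.line_spacing}
```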
In a possible implementation manner, the third interface includes a third area, and the third area includes a plurality of working modules, and the method further includes:
detecting operations on a plurality of work modules;
and responding to the detected operation on the first working module, and operating the first working module, wherein the first working module is one of the plurality of working modules.
In a second aspect, a terminal device is provided, which includes: one or more functional modules for implementing any of the methods provided by the first aspect.
In a third aspect, a terminal device is provided, which includes: a memory in which a computer program is stored, and a processor, the computer program, when executed on the processor, causing the processor to perform any of the methods provided by the first aspect.
In a fourth aspect, a chip is provided, comprising a processor and an interface, the processor being coupled to a memory through the interface; when the processor executes the computer-executable program or computer-executable instructions in the memory, any one of the methods provided by the first aspect is performed.
In a fifth aspect, a computer-readable storage medium is provided, which comprises computer-executable instructions, which, when executed on a computer, cause the computer to perform any one of the methods provided in the first aspect.
In a sixth aspect, there is provided a computer program product comprising computer executable instructions which, when run on a computer, cause the computer to perform any one of the methods provided in the first aspect.
According to the method of the application, the same APP can display, on one terminal device, the interfaces corresponding to multiple terminal devices and multiple working modes. First, a user can select the interface suited to the terminal device currently in use: for example, when working on a mobile phone, the interface corresponding to the mobile phone can be adopted, and when working on a computer, the interface corresponding to the computer can be adopted, which improves the user experience. Second, if the interface needs to be shared, it can be adapted to another device: for example, when working on a mobile phone, if the interface needs to be shared to a computer, the interface corresponding to the computer can be selected on the terminal device so as to fit the computer's screen, which also improves the use experience. Third, a working mode can be selected according to the current need, which improves working efficiency: for example, if a conference is to be held, the conference mode can be adopted, and the configuration of the conference mode is better suited to the conference.
Drawings
Fig. 1 is a schematic structural diagram of a terminal device provided in the present application.
Fig. 2 is a schematic flow chart of an interface display method provided in the present application.
Fig. 3 is a schematic diagram of a displayed interface provided in the present application.
Fig. 4 is a schematic diagram of yet another displayed interface provided in the present application.
Fig. 5 is a schematic diagram of yet another displayed interface provided in the present application.
Fig. 6 is a schematic diagram of yet another displayed interface provided in the present application.
Fig. 7 is a schematic diagram of yet another displayed interface provided in the present application.
Fig. 8 is a schematic diagram of yet another displayed interface provided in the present application.
Fig. 9 is a schematic diagram of yet another displayed interface provided in the present application.
Fig. 10 is a schematic diagram of yet another displayed interface provided in the present application.
Fig. 11 is a schematic diagram of yet another displayed interface provided in the present application.
Fig. 12 is a schematic diagram of yet another displayed interface provided in the present application.
Fig. 13 is a schematic diagram of yet another displayed interface provided in the present application.
Fig. 14 is a schematic composition diagram of an interface display device provided in the present application.
Detailed Description
In the description of the present application, "/" indicates an "or" relationship; for example, A/B may indicate A or B, unless otherwise indicated. "And/or" herein merely describes an association between associated objects and means that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In the description of the present application, "at least one" means one or more, and "a plurality" means two or more, unless otherwise specified.
In addition, to facilitate a clear description of the technical solutions of the embodiments of the present application, terms such as "first" and "second" are used in the embodiments to distinguish between identical or similar items having substantially the same functions and effects. Those skilled in the art will appreciate that the terms "first", "second", etc. do not limit quantity or execution order, nor do they denote relative importance.
The method provided by the embodiment of the application can be applied to various terminal devices capable of performing touch (for example, finger touch, stylus touch, mouse touch and the like), and the terminal devices can be wireless terminals or wired terminals. A terminal device may refer to a device that provides voice and/or data connectivity to a user, such as a handheld device having wireless connection capability or other processing device connected to a wireless modem. The terminal device may be a smartphone, a satellite radio, a computer, a Personal Communication Service (PCS) phone, Virtual Reality (VR) glasses, Augmented Reality (AR) glasses, a machine type communication terminal, an internet of things terminal, a communication device mounted on a vehicle, a communication device mounted on an unmanned aerial vehicle, or the like. The terminal device may also be referred to as a User Equipment (UE), a terminal, a subscriber unit (subscriber unit), a subscriber station (subscriber station), a mobile station (mobile), a remote station (remote station), an access point (access point), an access terminal (access terminal), a user terminal (user terminal), a user agent (user agent), and the like.
Fig. 1 is a schematic structural diagram of a terminal device provided in the present application. The terminal device 100 includes: Radio Frequency (RF) circuitry 110, a memory 120, other input devices 130, a display screen 140, sensors 150, audio circuitry 160, an I/O subsystem 170, a processor 180, and a power supply 190. Those skilled in the art will appreciate that the terminal device structure shown in fig. 1 does not constitute a limitation of the terminal device, which may include more or fewer components than shown, combine certain components, split certain components, or have a different arrangement of components. Those skilled in the art will appreciate that the display 140 belongs to a user interface (UI) and that the terminal device 100 may include more or fewer user interfaces than shown. Each constituent element of the terminal device 100 is described below in detail with reference to fig. 1:
the RF circuit 110 may be used for receiving and transmitting signals during a message transmission or a call. Specifically, after receiving downlink information of the access network device (e.g., base station), the processor 180 processes the received downlink information, and sends uplink data to the access network device. In general, the RF circuit 110 includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuitry 110 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), etc.
The memory 120 may be used to store software programs and modules, and the processor 180 executes various functional applications and data processing of the terminal device 100 by running the software programs and modules stored in the memory 120. The memory 120 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal device 100. Further, the memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
Other input devices 130 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of terminal device 100. In particular, other input devices 130 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, a light mouse (a light mouse is a touch-sensitive surface that does not display visual output, or is an extension of a touch-sensitive surface formed by a touch screen), and the like. The other input devices 130 are connected to other input device controllers 171 of the I/O subsystem 170 and interact with the processor 180 under the control of the other input device controllers 171.
The display screen 140 may be used to display information input by or provided to the user and various menus of the terminal device 100, and may also accept user input. Specifically, the display screen 140 may include a display panel 141 and a touch panel 142. The Display panel 141 may be configured by a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. The touch panel 142, also referred to as a touch screen, a touch sensitive screen, etc., may collect contact or non-contact operations (for example, operations performed by a user on or near the touch panel 142 using any suitable object such as a finger, a stylus, etc., and may also include somatosensory operations; the operations include single-point control operations, multi-point control operations, etc.) and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 142 may include two parts, i.e., a touch detection device and a touch controller. The touch detection device detects the touch direction and gesture of a user, detects signals brought by touch operation and transmits the signals to the touch controller; the touch controller receives the touch information from the touch detection device, converts the touch information into information that can be processed by the processor, sends the information to the processor 180, and receives and executes a command sent by the processor 180. In addition, the touch panel 142 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, a surface acoustic wave, and the like, and the touch panel 142 may also be implemented by any technology developed in the future. 
Further, the touch panel 142 may cover the display panel 141, a user may operate on or near the touch panel 142 covered on the display panel 141 according to the content displayed on the display panel 141 (the display content includes, but is not limited to, a soft keyboard, a virtual mouse, virtual keys, icons, etc.), the touch panel 142 detects the operation on or near the touch panel 142, and transmits the operation to the processor 180 through the I/O subsystem 170 to determine a user input, and then the processor 180 provides a corresponding visual output on the display panel 141 through the I/O subsystem 170 according to the user input. Although in fig. 1, the touch panel 142 and the display panel 141 are two independent components to implement the input and output functions of the terminal device 100, in some embodiments, the touch panel 142 and the display panel 141 may be integrated to implement the input and output functions of the terminal device 100.
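As a hedged illustration of the signal path just described (the touch detection device reports a raw signal, the touch controller converts it into information the processor can handle, and the processor produces the visual output on the display panel), the following sketch models each stage as a small Python object. The class names, signal format, and output string are all invented for illustration and are not part of the patent.

```python
# Illustrative model of the touch signal path described above.
# All names and formats here are hypothetical.

class TouchController:
    """Stands in for the touch controller part of the touch panel 142."""
    def convert(self, raw_signal: tuple) -> dict:
        # Convert a raw coordinate signal into processor-readable touch info.
        x, y = raw_signal
        return {"x": x, "y": y, "type": "tap"}

class Processor:
    """Stands in for the processor 180 deciding the visual output."""
    def handle(self, touch_info: dict) -> str:
        return f"highlight element at ({touch_info['x']}, {touch_info['y']})"

def touch_pipeline(raw_signal: tuple) -> str:
    """Raw signal -> touch controller -> processor -> visual output."""
    info = TouchController().convert(raw_signal)
    return Processor().handle(info)
```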
The terminal device 100 may also include at least one sensor 150, such as a light sensor, a motion sensor, and other sensors. In particular, the light sensor may include an ambient light sensor and a proximity sensor. Wherein the ambient light sensor can adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 141 and/or the backlight when the terminal device 100 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in various directions (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used for applications (such as horizontal and vertical screen switching, related games, magnetometer attitude calibration), vibration recognition related functions (such as pedometer, tapping), and the like, for recognizing the attitude of the terminal device. The terminal device 100 may further be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described herein again.
The audio circuitry 160, speaker 161, and microphone 162 may provide an audio interface between a user and the terminal device 100. On the one hand, the audio circuit 160 can convert received audio data into an electrical signal and transmit it to the speaker 161, which converts it into a sound signal for output; on the other hand, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data, and the audio data is then output to the RF circuit 110 for transmission to another device (such as another terminal device), or output to the memory 120 for further processing.
The I/O subsystem 170 controls input and output of external devices, and may include other input device controllers 171, a sensor controller 172, and a display controller 173. Optionally, one or more other input device controllers 171 receive signals from and/or transmit signals to other input devices 130, which may include physical buttons (push buttons, rocker buttons, etc.), dials, slide switches, joysticks, click wheels, mice, and the like. It is noted that the other input device controllers 171 may be connected to any one or more of the above devices. The display controller 173 in the I/O subsystem 170 receives signals from the display screen 140 and/or sends signals to the display screen 140. After the display screen 140 detects a user input, the display controller 173 converts the detected user input into an interaction with a user interface object displayed on the display screen 140, i.e., realizes human-machine interaction. The sensor controller 172 may receive signals from one or more sensors 150 and/or transmit signals to one or more sensors 150.
The processor 180 is a control center of the terminal device 100, connects various parts of the entire terminal device using various interfaces and lines, and performs various functions and data processing of the terminal device 100 by running or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby performing overall monitoring of the terminal device. Alternatively, processor 180 may include one or more processing units. Preferably, the processor 180 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The terminal device 100 also includes a power supply 190 (such as a battery) for powering the various components. Preferably, the power source may be logically connected to the processor 180 through a power management system, so that functions of managing charging, discharging, and power consumption are implemented through the power management system.
Although not shown, the terminal device 100 may further include a camera, a bluetooth module, and the like, which will not be described herein.
In the embodiments of the present application, the processor 180 may execute the software programs and modules in the memory 120 to perform the methods provided by the present application. The method provided by the application can be applied to daily software (such as office software) and an electronic government system, and the application is not limited.
At present, when using an APP, a user may need to share the interface content on a mobile phone to a computer, a television, or a projection screen. On the one hand, the layout of the mobile phone interface may not suit the computer, television, or projection screen, resulting in a poor projection effect; therefore, an interface layout suited to the computer, television, or projection screen needs to be designed. On the other hand, because a computer, television, or projection screen is larger than a mobile phone, some hidden interfaces (for example, interfaces that can only be reached through multiple interface levels on a mobile phone) can be placed directly in the tab interface, improving the user's operating efficiency. Based on these considerations, the present application provides an interface display method; see fig. 2. The method is executed by a terminal device and includes the following steps 201 to 205:
201. Display a first interface on a display of the terminal device, wherein the first interface comprises a first area, the first area comprises a plurality of first touch areas, the first touch areas are in one-to-one correspondence with a plurality of device types, and the first touch areas are in one-to-one correspondence with a plurality of second interfaces.
The first interface may be a tab interface (i.e., an interface corresponding to a tab, which may also be referred to as a primary interface) of a certain APP (hereinafter the first APP), and the terminal device may display the first interface on its display after detecting a touch operation by the user on the touch area where the first APP is located. The first interface may also be a secondary interface (i.e., an interface jumped to through a primary interface), a tertiary interface (i.e., an interface jumped to through a secondary interface), or a lower-level interface of the first APP; this application is not limited thereto.
The device types in this application may include one or more of a mobile phone, a PAD, a computer, a projector, and others; this application is not limited thereto.
The first area in the application can be located at any position in the first interface. The plurality of first touch areas included in the first area are in one-to-one correspondence with a plurality of device types, and the plurality of device types are in one-to-one correspondence with a plurality of second interfaces; after the user performs a touch operation on a first touch area, the terminal device can display the second interface corresponding to that first touch area on the display. For example, assuming that there are three first touch areas in the first area, the correspondence between the first touch areas, the device types, and the second interfaces can be seen in Table 1. After the user performs a touch operation on first touch area 1, the terminal device may display second interface 1 on the display, second interface 1 being the interface corresponding to the mobile phone. After the user performs a touch operation on first touch area 2, the terminal device may display second interface 2 on the display, second interface 2 being the interface corresponding to the PAD. That is to say, the application makes it possible to display, on one terminal device, the interfaces of the first APP as they appear on various terminal devices.
TABLE 1
First touch area     | Device type                      | Second interface
First touch area 1   | Device type 1 (e.g., cell phone) | Second interface 1
First touch area 2   | Device type 2 (e.g., PAD)        | Second interface 2
First touch area 3   | Device type 3 (e.g., notebook)   | Second interface 3
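The correspondence in Table 1 amounts to two lookups: touch area to device type, and device type to second interface. The following is a minimal, hypothetical sketch of that dispatch; all identifiers are illustrative and not taken from the patent.

```python
# Hypothetical sketch of Table 1: each first touch area maps to a device
# type, and each device type maps to the second interface shown for it.
DEVICE_TYPE_BY_TOUCH_AREA = {
    "first_touch_area_1": "cell_phone",
    "first_touch_area_2": "pad",
    "first_touch_area_3": "notebook",
}

SECOND_INTERFACE_BY_DEVICE = {
    "cell_phone": "second_interface_1",
    "pad": "second_interface_2",
    "notebook": "second_interface_3",
}

def on_first_area_touch(touch_area: str) -> str:
    """Return the second interface to display for a touched first touch area."""
    device = DEVICE_TYPE_BY_TOUCH_AREA[touch_area]
    return SECOND_INTERFACE_BY_DEVICE[device]
```

Keeping the two mappings separate mirrors the patent's two one-to-one correspondences, so a new device type can be added without touching the touch-area layout.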
For example, referring to fig. 3, after the user clicks the first APP in (a) of fig. 3, the terminal device may display the tab interface whose tab name is Workbench, shown in (b) of fig. 3; from this interface, other tab interfaces may be jumped to by clicking the tabs named Home, Messages, or My. The first area and the first touch areas are illustrated in (b) of fig. 3, where first touch area 1 may also be referred to as the mobile touch area, and first touch area 2 as the desktop touch area. The second interface corresponding to the mobile touch area is the second interface corresponding to the mobile phone (hereinafter, the mobile interface), and the second interface corresponding to the desktop touch area is the one corresponding to the PAD or notebook (hereinafter, the desktop interface). In (b) of fig. 3, the mobile interface is currently shown.
Optionally, different second interfaces differ in one or more of the following: page columns, work module layout, function modules, and the like. For example, the number, positions, and names of the page columns (i.e., work columns) on the mobile interface may differ from those on the desktop interface; the positions, number, and names of the work modules may differ; the layout of the work modules (e.g., the number of rows and columns) may differ; and the function modules (e.g., modules for setting working modes) may differ.
For example, the mobile interface displayed on the mobile phone can be seen in (a) and (b) of fig. 4, and the desktop interface in fig. 5. Compared with the desktop interface, the mobile interface displayed on the mobile phone has a different work column position, and the desktop interface includes a function module that the mobile interface lacks. It should be noted that, because the desktop interface is designed for the screen size of a PAD or computer, in practice only part of the desktop interface may be displayed on the mobile phone, or the desktop interface may be displayed scaled down; if the desktop interface on the mobile phone is shared with another device, such as a projection screen or a computer, the complete interface may be displayed. In addition, because the screen of a computer, television, or projector is larger than that of a mobile phone, some hidden interfaces (e.g., interfaces reachable on the mobile phone only through multiple interface levels) can be placed directly in a tab interface, improving the user's operating efficiency.
202. Touch operations on the plurality of first touch areas are detected.
In a specific implementation of step 202, the terminal device may detect, through a sensor in the terminal device, whether the user performs a touch operation on each displayed area.
203. In response to a detected touch operation on a first touch area, a second interface corresponding to the detected first touch area is displayed on the display of the terminal device. The second interface includes a second area; the second area, or an interface jumped to through the second area, includes a plurality of second touch areas, where the plurality of second touch areas correspond one-to-one to a plurality of working modes and one-to-one to a plurality of third interfaces.
For example, based on (b) in fig. 3, if the user performs a touch operation on the mobile touch area, the terminal device may display a mobile interface, and if the user performs a touch operation on the desktop touch area, the terminal device may display a desktop interface.
The second area in this application may be located at any position in the second interface. The plurality of second touch areas included in the second area, or in the interface jumped to through the second area, correspond one-to-one to a plurality of working modes, and the plurality of working modes correspond one-to-one to a plurality of third interfaces. After a user performs a touch operation on one second touch area, the terminal device may display the third interface corresponding to that second touch area on the display. For example, assuming there are three second touch areas, the correspondence between the second touch areas, the working modes, and the third interfaces is shown in Table 2. After the user performs a touch operation on second touch area 1, the terminal device may display third interface 1 on the display; after a touch operation on second touch area 2, third interface 2. That is, this application enables the interfaces of the first APP under multiple working modes to be displayed on a single terminal device.
TABLE 2
Second touch area    | Working mode   | Third interface
Second touch area 1  | Working mode 1 | Third interface 1
Second touch area 2  | Working mode 2 | Third interface 2
Second touch area 3  | Working mode 3 | Third interface 3
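Table 2 composes with Table 1: the device-type selection picks a second interface, and within it the mode selection picks a third interface. A hedged sketch of the combined two-level dispatch follows; every name here is an assumption made for illustration.

```python
# Hypothetical two-level dispatch: first the device type is chosen on the
# first interface (Table 1), then the working mode on the second
# interface (Table 2). The nested dict keys stand in for touch areas.
THIRD_INTERFACE = {
    "mobile_touch_area": {          # second interface for the mobile phone
        "second_touch_area_1": "normal_mode_interface",
        "second_touch_area_2": "conference_mode_interface",
    },
    "desktop_touch_area": {         # second interface for the PAD/notebook
        "second_touch_area_1": "desktop_normal_mode_interface",
        "second_touch_area_2": "desktop_conference_mode_interface",
    },
}

def resolve_third_interface(first_area: str, second_area: str) -> str:
    """Return the third interface reached by the two touch selections."""
    return THIRD_INTERFACE[first_area][second_area]
```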
The working modes may include a conference mode, a normal mode, and other working modes; these are merely examples and are not limited in this application.
In one case, referring to (a) in fig. 4, the second interface includes a second area, the second area includes a plurality of second touch areas (i.e., a second touch area 1 and a second touch area 2 in the drawing), the second touch area 1 may be referred to as a normal mode touch area, the second touch area 2 may be referred to as a conference mode touch area, a third interface corresponding to the normal mode touch area may be referred to as a normal mode interface, and a third interface corresponding to the conference mode touch area may be referred to as a conference mode interface. In fig. 4 (a), a normal mode interface is presently shown.
In another case, referring to (b) of fig. 4, the second interface includes a second region therein. Referring to fig. 6 (a), the user performs a touch operation on the second area, and the interface to jump to includes a plurality of second touch areas, referring to fig. 6 (b).
204. Touch operations on the plurality of second touch areas are detected.
205. In response to a detected touch operation on a second touch area, a third interface corresponding to the detected second touch area is displayed on the display of the terminal device.
In one case, referring to (a) in fig. 4, if the user performs a touch operation on the normal-mode touch area, the terminal device may display a normal-mode interface, and if the user performs a touch operation on the conference-mode touch area, the terminal device may display a conference-mode interface.
In another case, referring to fig. 6 (b), if the user performs a touch operation on the enable button in the normal-mode touch area, the terminal device may display the normal-mode interface; if on the close button in the normal-mode touch area, the terminal device may display another mode interface (e.g., a default working-mode interface). Likewise, if the user performs a touch operation on the enable button in the conference-mode touch area, the terminal device may display the conference-mode interface, and if on the close button, another mode interface (e.g., a default working-mode interface). It should be noted that both fig. 6 (b) and fig. 5 expand only the content that can be set in the normal mode; the content that can be set in the conference mode is folded and can be expanded, into a settings page similar to that of the normal mode, by clicking the triangle before the conference mode.
Optionally, one or more of the following information of the different third interfaces are different: page column, work module layout, function module, etc.
For example, a normal mode interface in the mobile interface may refer to fig. 7 (a), and a conference mode interface may refer to fig. 7 (b), where the conference mode interface places the work module related to the conference at a front position compared to the normal mode interface, so that the user may find the work module more conveniently. Under the desktop interface, the difference between the common mode interface and the conference mode interface is similar, and the description is omitted. Of course, the difference between the normal mode and the conference mode is only exemplified here, and the difference between the two modes may be other in practical implementation.
With the above method, the same APP can display, on a single terminal device, the interfaces corresponding to multiple terminal devices and multiple working modes. First, a user can select the interface suited to the terminal device currently in use: for example, the interface corresponding to a mobile phone when working on a mobile phone, and the interface corresponding to a computer when working on a computer, which improves user experience. Second, if the interface needs to be shared, it can be adapted to another device: for example, when working on a mobile phone, if the interface needs to be shared to a computer, the interface corresponding to the computer can be selected on the terminal device so as to fit the computer screen. Third, a working mode can be selected according to the current need, which improves work efficiency: for example, when a conference is to be held, the conference mode can be adopted, whose configuration is better suited to conferences.
To meet users' requirements, the working modes in this application can be added, deleted, or modified by the user; the following description covers these three cases.
Case 1, addition of operating mode
In case 1, optionally, the second area or the interface jumped to by the second area further includes a first adding area, and the first adding area is used for adding the working mode, and the method further includes:
detecting an operation on an interface jumped to through the first adding area;
and displaying the added work mode on the display in response to the detected operation on the interface jumped to through the first adding area.
For example, in the case where the second area further includes the first adding area, the first adding area can be seen in (a) of fig. 4. In the case where the first adding area is included in the interface jumped to through the second area, it can be seen in fig. 5 or (b) of fig. 6. The user jumps to another interface by clicking the first adding area; on that interface, the name and layout of the working mode (e.g., the number of work modules per row) and buttons for text display templates can be set, and after the user finishes the settings, a new working mode is added.
Case 2, deletion of operating mode
In case 2, optionally, the second area or the interface jumped to by the second area includes a first deletion area, and the first deletion area is used to delete the operating mode, and the method further includes:
detecting an operation on a first deletion area;
and deleting the corresponding working mode in response to the detected operation on the first deletion area.
For example, in the case where the second area further includes a first deletion area, a deletion area may be added after each working mode for deleting the corresponding working mode. In the case where the first deletion area is included in the interface jumped to through the second area, it can be seen in fig. 5 or (b) of fig. 6. One working mode may correspond to one first deletion area, and the user deletes the working mode corresponding to a first deletion area by clicking it.
Case 3, modification of operating mode
In case 3, optionally, the second touch area or the interface jumped to by the second touch area includes a configuration area, and the configuration area is used to configure a working mode corresponding to the second touch area, and the method further includes:
detecting an operation on a configuration area;
and displaying the updated third interface on the display of the terminal equipment in response to the detected operation on the configuration area.
For example, in the case where the second touch area further includes a configuration area, the configuration area can be seen in fig. 5 or (b) of fig. 6. The configuration area may also be in the interface jumped to through the second touch area. For example, for the interface jumped to after long-pressing the second touch area in fig. 4, the functions in that interface can refer to the configuration area shown in fig. 5 or fig. 6 (b); the interface jumped to after long-pressing second touch area 1 in fig. 4 can be seen in (a) of fig. 8. The initially displayed third interface may be a default third interface or the third interface last displayed by the terminal device, and the updated third interface is a third interface redisplayed based on the user's operation. The page layout, for example the number of work modules per row or the text display template used, may be configured in the configuration area.
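Cases 1 through 3 describe add, delete, and configure operations on the set of working modes. The following minimal sketch models that set as a registry; the class, field names, and initial modes are assumptions for illustration, not the patent's implementation.

```python
# A minimal, hypothetical registry for user-managed working modes
# (cases 1-3). A real APP would persist these settings and re-render
# the third interface; here a mode is just a name plus a layout config.
class WorkingModeRegistry:
    def __init__(self):
        self.modes = {
            "normal": {"modules_per_row": 4},
            "conference": {"modules_per_row": 3},
        }

    def add(self, name, layout):            # case 1: addition
        self.modes[name] = dict(layout)

    def delete(self, name):                 # case 2: deletion
        self.modes.pop(name, None)

    def configure(self, name, **changes):   # case 3: modification
        self.modes[name].update(changes)

registry = WorkingModeRegistry()
registry.add("review", {"modules_per_row": 2})      # first adding area
registry.configure("conference", modules_per_row=5) # configuration area
registry.delete("normal")                           # first deletion area
```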
With the above method, the user can set a working mode that meets his or her work requirements through a user-defined working mode, which improves work efficiency and user experience.
It should be noted that the terminal device may also directly adjust the layout of the work columns on the third interface. For example, referring to (a) and (b) of fig. 7, the third interface may include a plurality of work columns, and the user may add a work column by clicking the plus sign in the work-column area; the process of adding a work column is similar to the process of adding a work module described below and can be understood by reference, so it is not repeated here. Illustratively, referring to fig. 9, based on the work columns shown in (a) of fig. 7, the user adds a work column named representative contact and adds a work module named representative directory under that column.
In case 3, the text display template in this application may also be configured by the user (for example, whether it is used), or added, deleted, or modified; the following description covers these four cases separately.
Case (1) configuration of text display template
In case (1), optionally, the configuration area includes a text display template configuration area, and the text display template configuration area is used for configuring a text display template, and the method further includes:
detecting an operation on a text display template configuration area;
and responding to the detected operation on the text display template configuration area, if the text needs to be displayed, displaying the text on the display of the terminal equipment according to the detected configured text display template.
For example, the text display template configuration area can be seen in fig. 5 or (b) of fig. 6. More than one text display template may correspond to one working mode. Referring to fig. 5 or (b) of fig. 6, the user may click the use button corresponding to a text display template, so that when a text needs to be displayed, it is displayed on the display of the terminal device according to the detected configured text display template.
Case (2) modification of text display template
In case (2), the configuration area includes a text display template modification area, and the text display template modification area is used for modifying the text display template, and the method further includes:
detecting an operation on a text display template modification area;
and modifying the text display template in response to the detected operation on the text display template modification area.
For example, the text display template modification area can be seen in fig. 5 or (b) of fig. 6. Referring to fig. 5 or (b) of fig. 6, the user may modify a text display template by clicking the modification button corresponding to it, for example to modify information such as word spacing, line spacing, font size, title levels, alignment mode, and chart display mode.
Case (3) deletion of text display template
In case (3), the configuration area includes a text display template deletion area for deleting the text display template, and the method further includes:
detecting an operation on a text display template deleting area;
and deleting the corresponding text display template in response to the detected operation on the text display template deletion area.
For example, the text display template deletion area can be seen in fig. 5 or (b) of fig. 6. Referring to fig. 5 or (b) of fig. 6, the user may delete the corresponding text display template by clicking the delete button corresponding to it.
Case (4) addition of text display template
In case (4), the configuration area includes a text display template adding area, and the text display template adding area is used for adding a text display template, and the method further includes:
detecting an operation of adding an area to the text display template;
and adding the text display template in response to the detected operation on the text display template adding area.
For example, the text display template adding area can be seen in fig. 5, (b) of fig. 6, or (a) of fig. 8. Referring to these figures, the user may add a text display template by clicking the add-new-template button. Illustratively, based on the example shown in (a) of fig. 8, the interface after adding text display template 4 can be seen in (b) of fig. 8.
The text display template in this application may be a display template for a Word document, a PPT, or the like. Since the texts uploaded to the first APP may come from multiple sources, their formats (e.g., font size, first-line indentation, alignment) differ, which makes processing them very inefficient. With the method provided in the embodiments of this application, setting a unified text display template unifies the text format and improves work efficiency. In a specific implementation, before a text is displayed using the text display template, the text content can be extracted into different text streams, the attributes of the text streams converted into the corresponding template items, and the template items applied to the template and displayed on the software interface.
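The template-application step above, extracting text streams and overriding their attributes with the unified template's items, can be sketched as follows. The field names (`font_size`, `line_spacing`, `first_line_indent`) are assumptions chosen to match the formatting attributes the passage mentions.

```python
# Hedged sketch of unified text-display-template application: each
# "text stream" is a dict of content plus source formatting; the
# template's items override the source formatting before rendering.
TEMPLATE = {"font_size": 12, "line_spacing": 1.5, "first_line_indent": 2}

def apply_template(streams, template=TEMPLATE):
    """Return copies of the streams with display attributes unified."""
    unified = []
    for stream in streams:
        item = dict(stream)    # keep the text content and other fields
        item.update(template)  # template items override source format
        unified.append(item)
    return unified

# Two streams from different sources, with inconsistent formatting.
docs = [
    {"text": "Title", "font_size": 18},
    {"text": "Body", "font_size": 10, "line_spacing": 1.0},
]
unified_docs = apply_template(docs)
```

After application, both streams share the template's font size and spacing, which is exactly the uniformity the passage says improves processing efficiency.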
Optionally, the user may also manage the work modules, for example delete, add, or move them. Specifically, a work module can be deleted or moved in manner one below, added or deleted in manner two, and added in manner three.
In the first mode, the third interface comprises a third area, the third area comprises a plurality of working modules, and at the moment, the terminal equipment detects the operation on the plurality of working modules; and responding to the detected operation on the first working module, and operating the first working module, wherein the first working module is one of the plurality of working modules.
For example, referring to fig. 10, the user long-presses a work module; when the work module shakes, the user can drag it to move its position, and if another work module already occupies the target position, that module is moved elsewhere. The user can also click the cross mark at the upper left of a work module to delete it.
In the second mode and the third mode, one or more workspaces correspond to a management button, the management button is used for managing the working modules under the corresponding workspaces, the interface jumped to by the management button comprises adding and deleting buttons of a plurality of working modules, and at the moment, the terminal equipment detects the operation of the adding or deleting buttons of the plurality of working modules; and adding or deleting the second working module in response to the detected operation of the adding or deleting button of the second working module, wherein the second working module is one of the plurality of working modules.
For example, when the user clicks the management button corresponding to the work column named representative contact in (a) of fig. 11, the work modules that can be added or deleted, shown in (b) of fig. 11, may be displayed; the user then clicks add or delete to add or delete the corresponding work module. It should be noted that for work modules that have already been added, only the delete button is valid, and for work modules that have not been added, only the add button is valid. Based on the interface shown in (a) of fig. 11, the interface after adding the work module named double-linked list under the work column named representative contact can be seen in fig. 12.
In a third mode, the work module areas under one or more work columns in the third interface comprise adding icons, the interface jumped to by the adding icons comprises adding buttons of a plurality of work modules, and at the moment, the terminal equipment detects the operation of the adding buttons of the plurality of work modules; and adding the third working module in response to the detected operation of the adding button of the third working module, wherein the third working module is one of the plurality of working modules.
For example, when the user clicks the add icon in the work bar with the name representing the contact in (a) in fig. 13, the work modules that can be added in (b) in fig. 13 may be displayed, and at this time, the user clicks add to add the corresponding work module.
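Manners one through three above amount to move, add, and delete operations on the list of work modules under a work column. A hypothetical sketch, using the column and module names from the figures as illustrative data:

```python
# Hypothetical model of a work column whose modules can be added,
# deleted, or moved (manners one to three). A real APP would trigger
# a re-render of the third interface after each change.
class WorkColumn:
    def __init__(self, name, modules):
        self.name = name
        self.modules = list(modules)

    def add(self, module):               # manners two and three
        if module not in self.modules:
            self.modules.append(module)

    def delete(self, module):            # manners one and two
        if module in self.modules:
            self.modules.remove(module)

    def move(self, module, new_index):   # manner one: drag to reposition
        self.modules.remove(module)
        self.modules.insert(new_index, module)

col = WorkColumn("representative contact", ["representative directory"])
col.add("double-linked list")        # as in fig. 11 (b) / fig. 12
col.move("double-linked list", 0)    # drag it to the front
```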
In the embodiment of the application, the user can conveniently move or delete or add the working module according to the demand, so that the interface can be adjusted according to the demand, the working efficiency is improved, and the user experience is improved.
The touch operation in the above embodiments of the present application may be a single click, a double click, a long press, or the like, and the present application is not limited thereto.
In this application, a developer can use existing syntax to design the first APP so that it implements the above functions. The interfaces shown in the figures of this application are only examples; in an actual implementation, the positions of the modules in these interfaces may differ, or the modules may be located in interfaces of different levels, and this application is not limited in this respect. The above examples of this application mostly take the mobile interface as an example; the operations under the desktop interface, or under an interface corresponding to another terminal type, are similar and can be understood by reference.
The service scenario described in the embodiment of the present application is for more clearly illustrating the technical solution of the embodiment of the present application, and does not constitute a limitation on the technical solution provided in the embodiment of the present application. As can be known to those skilled in the art, with the occurrence of a new service scenario, the technical solution provided in the embodiment of the present application is also applicable to similar technical problems.
The above description has presented the embodiments of the present application primarily from a method perspective. It is to be understood that the terminal device includes at least one of a hardware structure and a software module corresponding to each function in order to implement the above-described functions. Those of skill in the art would readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the terminal device may be divided into the functional units according to the above method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Exemplarily, fig. 14 shows a schematic diagram of a possible structure of an interface display device (denoted as the interface display device 140) according to the above embodiment, where the interface display device 140 includes a display unit 1401, a detection unit 1402, and an execution unit 1403. Wherein:
a display unit 1401, configured to display a first interface on a display of the terminal device, where the first interface includes a first area, the first area includes a plurality of first touch areas, the plurality of first touch areas are in one-to-one correspondence with a plurality of device types, and the plurality of first touch areas are in one-to-one correspondence with a plurality of second interfaces;
a detecting unit 1402, configured to detect touch operations on the plurality of first touch areas;
an execution unit 1403, configured to display, through the display unit 1401, a second interface corresponding to the detected first touch area on the display of the terminal device in response to the touch operation of the detected first touch area, where the second interface includes a second area, and the second area or an interface jumped to through the second area includes a plurality of second touch areas, where the plurality of second touch areas correspond to the plurality of working modes one to one, and the plurality of second touch areas correspond to the plurality of third interfaces one to one;
a detecting unit 1402, further configured to detect touch operations on the plurality of second touch areas;
the execution unit 1403 is further configured to display, through the display unit 1401, a third interface corresponding to the detected second touch area on the display of the terminal device in response to the touch operation of the detected second touch area.
Optionally, the different second interfaces differ in one or more of the following information: the system comprises a page column, a working module layout and a functional module; the different third interfaces differ in one or more of the following information: the system comprises page columns, a working module layout and a functional module.
Optionally, the second area or the interface jumped to through the second area includes a first adding area, the first adding area is used for adding a working mode,
a detecting unit 1402, further configured to detect an operation on an interface jumped to through the first adding area;
an execution unit 1403, further configured to display the added operation mode on the display through the display unit 1401 in response to the detected operation on the interface jumped to through the first adding region.
Optionally, the second area or the interface jumped to through the second area includes a first deletion area, where the first deletion area is used to delete the operating mode,
a detecting unit 1402, further configured to detect an operation on the first deletion area;
an execution unit 1403 is further configured to delete the corresponding operating mode in response to the detected operation on the first deletion area.
Optionally, the second touch area or the interface jumped to through the second touch area includes a configuration area, where the configuration area is used to configure the working mode corresponding to the second touch area,
a detection unit 1402, further configured to detect an operation on the configuration area;
an execution unit 1403, further configured to display the updated third interface on the display of the terminal device through the display unit 1401 in response to the detected operation on the configuration area.
Optionally, the configuration area includes a text display template configuration area, and the text display template configuration area is used for configuring a text display template,
a detecting unit 1402, further configured to detect an operation on the text display template configuration area;
the execution unit 1403 is further configured to, in response to the detected operation on the text display template configuration area, display a text on the display of the terminal device through the display unit 1401 according to the detected configured text display template if the text needs to be displayed.
Optionally, the configuration area includes a text display template modification area, the text display template modification area is used for modifying a text display template,
a detecting unit 1402, further configured to detect an operation on the text display template modification area;
an execution unit 1403 is further configured to modify the text display template in response to the detected operation on the text display template modification area.
Optionally, the third interface includes a third area, the third area includes a plurality of working modules therein,
a detecting unit 1402, further configured to detect operations on the plurality of work modules;
the execution unit 1403 is further configured to operate a first work module in response to the detected operation on the first work module, where the first work module is one of the plurality of work modules.
For example, the interface display device 140 may be a single device or a chip system.
The integrated unit in fig. 14, if implemented in the form of a software functional module and sold or used as a separate product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or make a contribution to the prior art, or all or part of the technical solutions may be implemented in the form of a software product stored in a storage medium, and including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the methods described in the embodiments of the present application. A storage medium storing a computer software product comprising: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Embodiments of the present application further provide a computer-readable storage medium comprising computer-executable instructions which, when executed on a computer, cause the computer to perform any one of the methods described above.
Embodiments of the present application also provide a computer program product comprising computer-executable instructions which, when executed on a computer, cause the computer to perform any one of the methods described above.
An embodiment of the present application further provides a chip, comprising a processor and an interface, the processor being coupled to a memory through the interface; when the processor executes a computer program or instructions in the memory, any one of the methods provided by the above embodiments is performed.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented using software, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)).
While the present application has been described in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a review of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely exemplary of the present application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the present application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application.
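The two-level navigation that the description sets out, where touching a first touch area (one per device type) opens its second interface and touching a second touch area (one per working mode) opens its third interface, can be sketched as nested one-to-one mappings. All identifiers below are illustrative assumptions, not names from the patent:

```python
from typing import Dict

# First touch areas map one-to-one to device types; each second interface
# then maps its second touch areas (working modes) one-to-one to third
# interfaces. The concrete ids here are hypothetical.
second_interfaces: Dict[str, Dict[str, str]] = {
    "device_type_1": {"mode_a": "third_interface_1a", "mode_b": "third_interface_1b"},
    "device_type_2": {"mode_a": "third_interface_2a"},
}

def open_second_interface(first_area: str) -> Dict[str, str]:
    """Display the second interface for the touched first touch area."""
    return second_interfaces[first_area]

def open_third_interface(first_area: str, second_area: str) -> str:
    """Display the third interface for the touched second touch area."""
    return open_second_interface(first_area)[second_area]

print(open_third_interface("device_type_1", "mode_b"))  # third_interface_1b
```

Because each level is a plain one-to-one mapping, adding or deleting a working mode (the adding and deletion areas in the claims) reduces to inserting or removing an entry in the inner dictionary.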

Claims (10)

1. An interface display method, applied to a terminal device, the method comprising:
displaying a first interface on a display of the terminal device, wherein the first interface comprises a first area, the first area comprises a plurality of first touch areas, the plurality of first touch areas correspond one-to-one to a plurality of device types, and the plurality of first touch areas correspond one-to-one to a plurality of second interfaces;
detecting touch operations on the plurality of first touch areas;
in response to a detected touch operation on a first touch area, displaying, on the display of the terminal device, a second interface corresponding to the first touch area, wherein the second interface comprises a second area, the second area or an interface jumped to through the second area comprises a plurality of second touch areas, the plurality of second touch areas correspond one-to-one to a plurality of working modes, and the plurality of second touch areas correspond one-to-one to a plurality of third interfaces;
detecting touch operations on the plurality of second touch areas;
and in response to a detected touch operation on a second touch area, displaying, on the display of the terminal device, a third interface corresponding to the second touch area.
2. The method of claim 1, wherein different second interfaces differ in one or more of the following: page columns, work module layout, and function modules; and different third interfaces differ in one or more of the following: page columns, work module layout, and function modules.
3. The method according to claim 1 or 2, wherein the second area or the interface jumped to through the second area comprises a first adding area, the first adding area being used for adding a working mode, and the method further comprises:
detecting an operation on an interface jumped to through the first adding area;
and displaying the added working mode on the display in response to the detected operation on the interface jumped to through the first adding area.
4. The method according to claim 1 or 2, wherein the second area or the interface jumped to through the second area comprises a first deletion area, the first deletion area being used for deleting a working mode, and the method further comprises:
detecting an operation on the first deletion area;
and deleting the corresponding working mode in response to the detected operation on the first deletion area.
5. The method according to claim 1 or 2, wherein the second touch area or the interface jumped to through the second touch area comprises a configuration area, the configuration area being used for configuring the working mode corresponding to the second touch area, and the method further comprises:
detecting an operation on the configuration area;
and displaying an updated third interface on the display of the terminal device in response to the detected operation on the configuration area.
6. The method of claim 5, wherein the configuration area comprises a text display template configuration area, the text display template configuration area being used for configuring a text display template, and the method further comprises:
detecting an operation on the text display template configuration area;
and in response to the detected operation on the text display template configuration area, when text needs to be displayed, displaying the text on the display of the terminal device according to the configured text display template.
7. The method of claim 5, wherein the configuration area includes a text display template modification area for modifying a text display template, the method further comprising:
detecting an operation on the text display template modification area;
modifying the text display template in response to the detected operation on the text display template modification area.
8. The method of claim 1 or 2, wherein the third interface comprises a third area, the third area comprising a plurality of work modules, and the method further comprises:
detecting operations on the plurality of work modules;
and in response to a detected operation on a first work module, operating the first work module, wherein the first work module is one of the plurality of work modules.
9. A terminal device, comprising: one or more functional modules for implementing the method of any one of claims 1-8.
10. A terminal device, comprising: memory and a processor, the memory having stored therein a computer program which, when run on the processor, causes the processor to perform the method of any of claims 1-8.
CN202210029399.0A 2022-01-12 2022-01-12 Interface display method and device Active CN114398016B (en)

Publications (2)

Publication Number Publication Date
CN114398016A true CN114398016A (en) 2022-04-26
CN114398016B CN114398016B (en) 2024-06-11

Family

ID=81230699


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024113957A1 (en) * 2022-11-28 2024-06-06 荣耀终端有限公司 Game service management method and electronic device

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1622619A (en) * 2004-12-24 2005-06-01 北京中星微电子有限公司 A multi-screen display method and device
US20120240056A1 (en) * 2010-11-17 2012-09-20 Paul Webber Email client mode transitions in a smartpad device
CN104461433A (en) * 2014-12-19 2015-03-25 北京奇艺世纪科技有限公司 Individual interface display method and device
CN108563374A (en) * 2018-03-05 2018-09-21 维沃移动通信有限公司 A kind of interface display method and terminal device
US20180329587A1 (en) * 2017-05-12 2018-11-15 Apple Inc. Context-specific user interfaces
CN109917956A (en) * 2019-02-22 2019-06-21 华为技术有限公司 It is a kind of to control the method and electronic equipment that screen is shown
CN109997348A (en) * 2017-09-25 2019-07-09 华为技术有限公司 A kind of display methods and terminal of terminal interface
CN110308834A (en) * 2019-04-25 2019-10-08 维沃移动通信有限公司 The setting method and terminal of application icon display mode
CN110442297A (en) * 2019-08-08 2019-11-12 Oppo广东移动通信有限公司 Multi-screen display method, split screen display available device and terminal device
CN110688179A (en) * 2019-08-30 2020-01-14 华为技术有限公司 Display method and terminal equipment
CN111190559A (en) * 2019-12-04 2020-05-22 深圳市东向同人科技有限公司 Screen projection control synchronization method, mobile terminal and computer readable storage medium
CN111221450A (en) * 2020-01-02 2020-06-02 杭州网易云音乐科技有限公司 Information display method and device, electronic equipment and storage medium
CN111367456A (en) * 2020-02-28 2020-07-03 青岛海信移动通信技术股份有限公司 Communication terminal and display method in multi-window mode
CN112532869A (en) * 2018-10-15 2021-03-19 华为技术有限公司 Image display method in shooting scene and electronic equipment
KR20210091298A (en) * 2018-11-26 2021-07-21 후아웨이 테크놀러지 컴퍼니 리미티드 Application display method and electronic device
CN113711172A (en) * 2019-04-16 2021-11-26 苹果公司 Systems and methods for interacting with companion display modes of an electronic device with a touch-sensitive display




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant