CN118042036A - Method for displaying graphical user interface and mobile terminal - Google Patents

Method for displaying graphical user interface and mobile terminal

Info

Publication number
CN118042036A
Authority
CN
China
Prior art keywords
application
screen area
mobile terminal
interface
touch
Prior art date
Legal status
Pending
Application number
CN202410029388.1A
Other languages
Chinese (zh)
Inventor
严斌
薛康乐
马冬
尹帮实
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN202410029388.1A
Publication of CN118042036A


Classifications

    • H04M 1/724 — User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72484 — User interfaces specially adapted for cordless or mobile telephones wherein functions are triggered by incoming communication events
    • H04M 1/02 — Constructional features of telephone sets
    • G06F 3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons
    • G06F 3/04817 — Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F 3/0488 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science
  • General Engineering & Computer Science
  • Theoretical Computer Science
  • Human Computer Interaction
  • Physics & Mathematics
  • General Physics & Mathematics
  • Computer Networks & Wireless Communication
  • Signal Processing
  • User Interface Of Digital Computer
  • Telephone Function

Abstract

The present application provides a method for displaying a graphical user interface, and a mobile terminal comprising a touch screen and a camera. The mobile terminal displays an application interface of a first application in a first screen area; when a second application has a notification, the mobile terminal displays the notification in a second screen area; and when a first touch operation performed on the notification is detected, the mobile terminal continues to display the application interface of the first application in the first screen area and displays an application interface of the second application related to the notification on top of it. The application thereby enriches the interaction functions of the mobile terminal, simplifies the user's operation flow, and improves the user experience.

Description

Method for displaying graphical user interface and mobile terminal
Technical Field
The present application relates to the field of human-computer interaction, and in particular, to a method for displaying a graphical user interface and a mobile terminal.
Background
With the ongoing development of terminal devices such as smartphones, the functions a terminal can provide keep expanding. Consumers are no longer satisfied with basic call functions, and their expectations for the display keep rising. From feature phones with numeric keypads to smartphones with resistive and then capacitive screens, display sizes have grown to 5.5 inches and beyond, and large-screen phones are increasingly favored by consumers. As competition among smart terminals intensifies, the concepts of ultra-high screen-to-body ratio and even full-screen (bezel-less) phones have emerged; given that a phone must remain portable, the industry aims to increase the screen-to-body ratio rather than the overall size. As display structures and sizes diversify to raise the screen-to-body ratio, providing richer interaction functions within the display has become key to improving the user experience.
Disclosure of Invention
To address the above technical problem, embodiments of the present application provide several methods for displaying a graphical user interface, and a mobile terminal, aiming to enrich the interaction functions of the mobile terminal, simplify the user's operation flow, and improve the user experience.
In a first aspect, an embodiment of the present application provides a method for displaying a graphical user interface on a mobile terminal. The method includes the following steps: first, the mobile terminal displays an application interface of a first application in a first screen area; then, when a second application has a notification, the mobile terminal displays the notification in a second screen area; and then, when a first touch operation performed on the notification is detected, the mobile terminal continues to display the application interface of the first application in the first screen area and displays an application interface of the second application related to the notification on top of it. This method makes full use of the display area of the mobile terminal. The first screen area shows the interface content of the first application, such as a video call, video or audio playback (a movie, TV show, animation, etc.), or a game; when the second application has a notification, the mobile terminal displays information related to the notification in the second screen area. On the one hand, compared with popping up a prompt within the first screen area, this neither occupies the first screen area nor disturbs its displayed content; on the other hand, the notification content can be displayed persistently in the second screen area, avoiding the problem of the notification disappearing before the user has read it. The method also provides a convenient way for the user to quickly handle the second application's notification and then continue viewing the interface content of the first application.
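The first-aspect flow can be modeled, purely for illustration, as a small state machine; the class, method, and application names below are hypothetical and are not taken from the patent:

```python
class DualAreaScreen:
    """Toy model of a touch screen split into a rectangular first (main)
    screen area and a second screen area around the camera."""

    def __init__(self):
        self.first_area = None    # application interface shown in the main area
        self.second_area = None   # (app, text) notification shown in the strip
        self.overlay = None       # interface floated on top of the main area

    def launch(self, app):
        # Step 1: display the first application's interface in the first area.
        self.first_area = app

    def notify(self, app, text):
        # Step 2: a notification from the second application appears in the
        # second screen area; the first area is left untouched.
        self.second_area = (app, text)

    def touch_notification(self):
        # Step 3: a first touch operation on the notification keeps the first
        # application's interface and overlays the second application's
        # notification-related interface on top of it.
        app, _ = self.second_area
        self.overlay = app


screen = DualAreaScreen()
screen.launch("video_player")
screen.notify("messenger", "New message")
screen.touch_notification()
assert screen.first_area == "video_player"  # still displayed underneath
assert screen.overlay == "messenger"        # floated on top
```

The point of the model is that the notification never preempts the first screen area: only an explicit touch promotes it to an overlay.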
In some possible implementations, the mobile terminal displays the application interface of the second application on a partial area of the application interface of the first application.
In some possible implementations, the mobile terminal displays the application interface of the second application on the application interface of the first application in a transparent or semi-transparent manner, so as to improve the display effect on the touch screen.
In some possible implementations, while displaying the application interface of the second application related to the notification, when the mobile terminal detects a second touch operation performed on that interface, the mobile terminal processes the notification, enabling the user to continue viewing the interface content of the first application while quickly handling the second application's notification, for example by viewing or replying to it.
In some possible implementations, after processing the notification, the mobile terminal continues to display the application interface of the first application in the first screen area and no longer displays the application interface of the second application. That is, after the notification is processed, the second application's interface overlaid on the first screen area is closed; the handling flow is concise and smooth and matches the user's habits.
In some possible implementations, while displaying the application interface of the second application related to the notification, when the mobile terminal detects a third touch operation performed on the first screen area in an area outside the application interface of the second application, the mobile terminal executes an instruction corresponding to the third touch operation in the second application and continues to display, in the first screen area, the application interface of the second application related to the notification. That is, when the interfaces of the first application and the second application are both shown on the first screen, the terminal can decide, according to the position where the touch operation is detected, whether to execute an instruction of the first application or of the second application.
In some possible implementations, while the application interface of the second application related to the notification is displayed, when a fourth touch operation is detected, the mobile terminal continues to display the application interface of the first application in the first screen area and no longer displays the application interface of the second application. In other words, the mobile terminal can exit the second application's interface shown in the first screen area in response to the corresponding fourth touch operation, which may be determined by a preset. Detecting the fourth touch operation includes any one of the following: the mobile terminal detects a touch operation on the notification corresponding to the second application in the second screen area; the mobile terminal detects a touch operation on the application interface of the second application; or the mobile terminal detects a touch operation on an area outside both the application interface of the first application and the application interface of the second application.
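The preset dismissal rules can be sketched as a predicate over where the fourth touch operation lands; the target labels below are hypothetical stand-ins for hit-test results, not identifiers from the patent:

```python
def is_dismiss_gesture(target):
    """Return True when a touch should close the overlaid second-app
    interface, matching the three preset options listed above (toy model)."""
    return target in {
        "strip_notification",       # the notification in the second screen area
        "overlay_interface",        # the second application's overlaid interface
        "outside_both_interfaces",  # outside both applications' interfaces
    }


assert is_dismiss_gesture("overlay_interface")
assert not is_dismiss_gesture("first_app_interface")
```

In a real implementation the labels would come from hit-testing the touch coordinates against the overlay and screen-area rectangles, and the preset would select which of the three gestures is active.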
In some possible implementations, the mobile terminal displays an icon of at least one second application in the second screen area.
In some possible implementations, the method further includes: and when a first touch operation performed on the icon of the second application is detected, displaying an application interface of the second application in the first screen area, and displaying the icon of the first application in the second screen area.
In some possible implementations, in combination with the foregoing embodiment, the method further includes: when the application interface of the second application is not displayed in the first screen area, displaying the icon of the second application in the second screen area.
The second application may be a system-level application or an application provided by a third party. In some embodiments, the mobile terminal displays an icon of at least one second application in the second screen area and, in response to detecting a touch operation on that icon, quickly enters the second application. This provides functions for quickly entering frequently used applications and quickly returning to the interface of the previously accessed application, simplifying the user's operation flow and improving the user experience.
In some possible implementations, the mobile terminal presents the notification by displaying its content in the second screen area.
In some possible implementations, the mobile terminal presents the notification by displaying the sender information of the notification in the second screen area, where the sender information includes any one or several of a sender image, sender identity information, and the sending time.
In some possible implementations, the mobile terminal presents the notification by changing the display manner, in the second screen area, of the icon of the second application that the notification corresponds to.
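The three presentation modes above can be sketched as a single dispatch function; the mode names and field names (`text`, `avatar`, and so on) are invented for the example:

```python
def build_prompt(mode, notification):
    """Compose what the second screen area shows for a notification under
    one of the three prompting modes described above (illustrative only)."""
    if mode == "content":
        return notification["text"]
    if mode == "sender":
        # any one or several of: sender image, identity, sending time
        parts = (notification.get(k) for k in ("avatar", "sender", "time"))
        return " ".join(p for p in parts if p)
    if mode == "icon":
        # change how the corresponding app icon is rendered, e.g. add a badge
        return notification["app_icon"] + "*"
    raise ValueError(f"unknown prompt mode: {mode}")


note = {"text": "Hi!", "avatar": "[img]", "sender": "Alice",
        "time": "10:02", "app_icon": "msg"}
assert build_prompt("content", note) == "Hi!"
assert build_prompt("sender", note) == "[img] Alice 10:02"
assert build_prompt("icon", note) == "msg*"
```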
In some possible implementations, the method further includes: the mobile terminal displays a control in the second screen area; and when a third touch operation performed on the control is detected, the mobile terminal hides the icon of the at least one second application and/or the notification of the second application. Based on the touch operation, the mobile terminal may choose to turn off the second screen area, either to save power or to reduce interference from its content with what is shown in the first screen area; or the mobile terminal may, based on the touch operation, have the second screen area and the first screen area jointly display one continuous piece of interface content, improving the overall display effect and thereby providing diversified display modes and functions.
In some possible implementations, when the mobile terminal displays the notification in the second screen area, the content of the notification is displayed in a scrolling manner, so that it fits the size of the second screen area while the complete content can still be shown.
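Scrolling long content through a narrow strip can be sketched as a simple marquee window; this is an illustrative function, not the patent's algorithm:

```python
def marquee_frame(text, width, step):
    """Return the `width`-character window of `text` visible at scroll
    offset `step`, wrapping around so that long notification content can
    be shown completely in a narrow second screen area."""
    if len(text) <= width:
        return text                 # short content needs no scrolling
    padded = text + "   "           # gap between wrap-arounds
    i = step % len(padded)
    return (padded + padded)[i:i + width]


assert marquee_frame("Hi", 8, 3) == "Hi"
assert marquee_frame("New message from Alice", 8, 0) == "New mess"
assert marquee_frame("New message from Alice", 8, 4) == "message "
```

Advancing `step` on a timer produces the scrolling effect; every character of the notification eventually passes through the visible window.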
In a second aspect, the present application provides a method for displaying a graphical user interface on a mobile terminal, where the mobile terminal includes a touch screen and a camera, the touch screen includes a first screen area and a second screen area, the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera. The method performed on the mobile terminal includes the following steps: first, the mobile terminal displays an application interface of a first application in the first screen area and displays an icon of a second application in the second screen area; then, when a first touch operation performed on the icon of the second application is detected, the mobile terminal displays an application interface of the second application in the first screen area and displays an icon of the first application in the second screen area. The mobile terminal displays icons of one or more second applications, such as frequently used applications, in the second screen area; when a touch operation on one of these icons is detected, the terminal quickly enters that application and switches the first screen area to its interface. This saves the user from first having to exit the first application's interface, provides quick entry to frequently used applications, simplifies the interaction flow, and improves the user experience.
In some possible implementations, after the application interface of the second application is displayed in the first screen area, when that interface is no longer displayed in the first screen area, an icon of the second application is displayed in the second screen area. In response to detecting a touch operation on the icon of the first application, the application interface of the first application is switched back into the first screen area, providing a quick return to the previously accessed application with a simple, convenient, and fast interaction.
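The second-aspect switching behavior can be sketched as a swap between the foreground interface and the icon strip; the class and application names are hypothetical:

```python
class AppSwitcher:
    """Toy model of the second aspect: tapping an icon in the second screen
    area brings that application to the first screen area and parks the
    previous application's icon in the strip for a quick return."""

    def __init__(self, foreground, strip_icons):
        self.foreground = foreground      # app shown in the first screen area
        self.strip = list(strip_icons)    # icons shown in the second screen area

    def tap_icon(self, app):
        # First touch operation on an icon: swap it with the foreground app.
        if app in self.strip:
            self.strip.remove(app)
            self.strip.append(self.foreground)
            self.foreground = app


sw = AppSwitcher("video_player", ["messenger", "mail"])
sw.tap_icon("messenger")
assert sw.foreground == "messenger"
assert "video_player" in sw.strip   # one tap returns to the previous app
```

Because the swap is symmetric, tapping the parked icon restores the previous state, which is exactly the quick-return behavior described above.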
In a third aspect, the present application provides a display method for a mobile terminal, where the mobile terminal includes a touch screen and a camera, the touch screen includes a first screen area and a second screen area, the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera. The method performed on the mobile terminal includes: first, the mobile terminal starts the camera to shoot; then, the mobile terminal displays the image captured by the camera in the first screen area and raises the display brightness of all or part of the second screen area above a threshold, so as to illuminate the photographed subject. In this method, the second screen area serves as a fill light while the front camera is shooting, so the photographed subject (for example, a face) can be illuminated without a dedicated fill-light device. Even when the ambient light around the user is dim, an image with better brightness can be obtained, providing a fill-light effect in applications such as photographing, makeup, and face recognition and improving image quality in low-light environments. The mobile terminal thus offers more diverse interaction functions, better meets the user's needs in shooting scenarios, and improves the user experience.
In some possible embodiments, the mobile terminal displays a preset color in all or at least a partial area of the second screen area to illuminate the photographed subject. In some embodiments, the mobile terminal sets the second screen area to a single color, for example white, yellow, or blue, which can improve the color rendering of the photographed subject, enhance the imaging effect, and improve the user's shooting experience.
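The fill-light behavior of the third aspect can be sketched as raising each lit region's brightness above the threshold and selecting one fill color; the region names, 0-255 brightness levels, and threshold value are assumptions made for the example:

```python
def fill_light(region_brightness, threshold, color="white"):
    """Raise the brightness of every chosen part of the second screen area
    above `threshold` and pick a single fill color (illustrative model of
    the screen-as-fill-light idea; units and names are hypothetical)."""
    lit = {region: max(level, threshold + 1)
           for region, level in region_brightness.items()}
    return lit, color


lit, color = fill_light({"left": 30, "top": 120, "right": 10},
                        threshold=200, color="yellow")
assert all(level > 200 for level in lit.values())
assert color == "yellow"
```

Regions already brighter than the threshold are left as they are; dim regions are boosted, so the whole strip exceeds the threshold while shooting.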
In some possible embodiments, the method further comprises: and the mobile terminal also displays a prompt pattern pointing to the position of the camera in the second screen area.
In a fourth aspect, the present application provides a mobile terminal, wherein the mobile terminal includes: a camera; a touch screen, the touch screen comprising a first screen area and a second screen area, where the first screen area is a rectangular area and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera; one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory and comprise instructions that, when executed by the mobile terminal, cause the mobile terminal to perform the following steps: first, displaying an application interface of a first application in the first screen area; then, when a second application has a notification, displaying the notification in the second screen area; and then, when a first touch operation performed on the notification is detected, continuing to display the application interface of the first application in the first screen area and displaying an application interface of the second application related to the notification on the application interface of the first application.
In some possible implementations, in the step of displaying an application interface of the second application related to the notification, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: and displaying the application interface of the second application on a partial area of the application interface of the first application.
In some possible implementations, in the step of displaying an application interface of the second application related to the notification, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: displaying the application interface of the second application on the application interface of the first application in a transparent or semitransparent mode.
In some possible implementations, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: and processing the notification when a second touch operation performed on an application interface of the second application is detected.
In some possible implementations, after processing the notification, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of: and continuously displaying the application interface of the first application in the first screen area, and not displaying the application interface of the second application.
In some possible implementations, while the mobile terminal displays the application interface of the second application related to the notification, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the following steps: when a third touch operation performed on the first screen area in an area outside the application interface of the second application is detected, executing an instruction corresponding to the third touch operation in the second application, and continuing to display, in the first screen area, the application interface of the second application related to the notification.
In some possible implementations, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: and when the fourth touch operation is detected, continuing to display the application interface of the first application in the first screen area, and not displaying the application interface of the second application.
For detecting the fourth touch operation, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform any one of the following steps: detecting a touch operation on the notification corresponding to the second application in the second screen area; detecting a touch operation on the application interface of the second application; or detecting a touch operation on an area of the application interface of the first application outside the application interface of the second application.
In some possible implementations, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of: and displaying at least one icon of the second application in the second screen area.
In some possible implementations, in the step of presenting the notification in the second screen area, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform at least any one of the following steps: displaying the content of the notification in the second screen area; displaying sender information of the notification in the second screen area, where the sender information includes any one or several of a sender image, sender identity information, and the sending time; and changing the display manner, in the second screen area, of the icon of the second application corresponding to the notification.
In some possible implementations, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of: displaying a control in the second screen area; and hiding the icon of the at least one second application and/or the notification of the second application when a third touch operation performed on the control is detected.
In some possible implementations, in displaying the notification in the second screen area, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: and displaying the content of the notification in a scrolling manner in the second screen area.
In a fifth aspect, the present application provides a mobile terminal, wherein the mobile terminal includes: a camera; the touch screen comprises a first screen area and a second screen area, wherein the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera; one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions that, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: firstly, displaying an application interface of a first application in the first screen area, and displaying an icon of a second application in the second screen area; and then when a first touch operation performed on the icon of the second application is detected, displaying an application interface of the second application in the first screen area, and displaying the icon of the first application in the second screen area.
In some possible implementations, after the first screen area displays the application interface of the second application, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the following step: when the application interface of the second application is not displayed in the first screen area, displaying the icon of the second application in the second screen area.
In a sixth aspect, the present application provides a mobile terminal, wherein the mobile terminal includes: a camera; a touch screen, the touch screen comprising a first screen area and a second screen area, where the first screen area is a rectangular area and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera; one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory and comprise instructions that, when executed by the mobile terminal, cause the mobile terminal to perform the following steps: starting the camera to shoot; and then displaying the image captured by the camera in the first screen area and raising the display brightness of all or part of the second screen area above a threshold, so as to illuminate the photographed subject.
In some possible implementations, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of: and displaying a preset color in all or at least one partial area of the second screen area so as to supplement light to the shot object.
In some possible implementations, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of: and displaying a prompt pattern pointing to the position of the camera in the second screen area.
In a seventh aspect, the present application provides a graphical user interface (GUI) stored in an electronic device, the electronic device comprising a touch screen, a memory, and one or more processors, the touch screen comprising a first screen area and a second screen area, the first screen area being a rectangular area, the second screen area being an area of the touch screen extending from the first screen area to the periphery of the camera, and the one or more processors being configured to execute one or more computer programs stored in the memory, wherein the graphical user interface comprises: a first GUI displayed in the first screen area, the first GUI including an application interface of a first application; a second GUI displayed in the second screen area in response to detecting that a second application has a notification, the second GUI including the notification; and, in response to detecting a first touch operation performed on the notification, the first GUI continuing to be displayed in the first screen area and a third GUI being displayed on the first GUI, the third GUI including an application interface of the second application related to the notification.
In some possible embodiments, the first GUI further comprises: at least one icon of the second application.
In an eighth aspect, the present application provides a computer program product, wherein the computer program product, when run on a mobile terminal, causes the mobile terminal to perform the method according to any one of the first to third aspects and embodiments thereof.
In a ninth aspect, the present application provides a computer readable storage medium comprising instructions, wherein the instructions, when run on a mobile terminal, cause the mobile terminal to perform the method of any one of the first to third aspects and embodiments thereof.
In a tenth aspect, the present application provides a mobile terminal having a touch screen, the mobile terminal comprising: a display unit, a detection unit, and a control unit, wherein the display unit is configured to display an application interface of a first application in a first screen area of the touch screen; the detection unit is configured to detect whether a second application has a notification and to detect whether a touch operation is performed on the touch screen; and the control unit is configured to, when the detection unit detects that the second application has a notification, cause the display unit to display the notification in a second screen area of the touch screen, and, when the detection unit detects a first touch operation performed on the notification, continue displaying the application interface of the first application in the first screen area and display an application interface of the second application related to the notification on the application interface of the first application. The mobile terminal further comprises a camera; the touch screen comprises the first screen area and the second screen area, the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera.
In some possible embodiments, the display unit is further configured to display an icon of at least one second application in the second screen area.
In some possible implementations, the control unit is further configured to, when the detection unit detects a first touch operation performed on the icon of the second application, cause the display unit to display an application interface of the second application in the first screen area, and display the icon of the first application in the second screen area.
In some possible implementations, in combination with the foregoing embodiment, the display unit is further configured to display an icon of the second application in the second screen area when the second screen area does not display an application interface of the second application.
The second application may be a system-level application or an application provided by a third party.
In some possible implementations, the display unit is configured to display a control in the second screen area; the control unit is further configured to, when the detection unit detects a third touch operation performed on the control, cause the display unit to hide the icon of the at least one second application and/or the notification of the second application.
In some possible implementations, the display unit is configured to display an application interface of the second application on a partial area of the application interface of the first application.
In some possible implementations, the control unit is further configured to process the notification when the detection unit detects a second touch operation performed on an application interface of the second application.
In some possible implementations, after the control unit processes the notification, the control unit is further configured to cause the display unit to continue displaying the application interface of the first application in the first screen area, and not display the application interface of the second application.
In some possible implementations, when the display unit displays the application interface related to the notification of the second application, the control unit is configured to, when the detection unit detects a third touch operation performed on an area that is in the first screen area and outside the application interface of the second application, execute an instruction corresponding to the third touch operation in the second application, and cause the display unit to continue displaying, in the first screen area, the application interface related to the notification in the second application.
In some possible implementations, when the display unit displays the application interface related to the notification of the second application, the control unit is configured to, when the detection unit detects a fourth touch operation, continue displaying the application interface of the first application in the first screen area without displaying the application interface of the second application; that is, the control unit exits the application interface of the second application displayed in the first screen area in response to the fourth touch operation. The fourth touch operation may be determined according to a preset, and detecting the fourth touch operation includes any one of the following: the mobile terminal detects a touch operation performed on the notification corresponding to the second application in the second screen area; the mobile terminal detects a touch operation performed on the application interface of the second application; or the mobile terminal detects a touch operation performed on an area outside the application interface of the first application and the application interface of the second application.
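The preset classification of the "fourth touch operation" above can be expressed as a simple predicate. The target names below are illustrative placeholders, not from the patent:

```python
# Hypothetical classification of the "fourth touch operation" that exits
# the second application's interface in the first screen area.

DISMISS_TARGETS = {
    "notification_in_second_area",   # touch on the corresponding notification
    "second_app_interface",          # touch on the second app's interface
    "outside_both_interfaces",       # touch outside both app interfaces
}


def is_fourth_touch(touch_target: str) -> bool:
    # Per the preset, any of the three listed targets dismisses the overlay.
    return touch_target in DISMISS_TARGETS
```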
In some possible implementations, the display unit prompts the notification by displaying the content of the notification in the second screen area.
In some possible implementations, the display unit prompts the notification by displaying sender information of the notification in the second screen area, where the sender information includes any one or more of a sender image, sender identity information, and the sending time.
In some possible implementations, the display unit prompts the notification by changing a display manner of an icon of the second application corresponding to the notification in the second screen area.
In some possible implementations, the display unit displays the content of the notification in a scrolling manner in the second screen area when the notification is displayed in the second screen area.
In an eleventh aspect, the present application provides a mobile terminal having a touch screen, the mobile terminal comprising: a display unit, a detection unit, and a control unit, wherein the display unit is configured to display an application interface of a first application in a first screen area of the touch screen and to display an icon of a second application in a second screen area of the touch screen; the detection unit is configured to detect whether a touch operation is performed on the touch screen; and the control unit is configured to, when the detection unit detects a first touch operation performed on the icon of the second application, cause the display unit to display an application interface of the second application in the first screen area and display an icon of the first application in the second screen area.
In some possible implementations, the display unit is further configured to, after the application interface of the second application is displayed in the first screen area, display an icon of the second application in the second screen area when the second screen area does not display the application interface of the second application.
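The application swap in the eleventh aspect can be sketched as follows; the function and variable names are assumptions for illustration only:

```python
# Sketch of the eleventh-aspect swap: tapping a second application's icon
# moves that application's interface into the first screen area, and the
# first application's icon appears in the second screen area.

def switch_apps(first_area_app, second_area_icons, tapped_icon):
    if tapped_icon not in second_area_icons:
        raise ValueError("icon not shown in second screen area")
    remaining = [icon for icon in second_area_icons if icon != tapped_icon]
    remaining.append(first_area_app)  # first app's icon moves to second area
    return tapped_icon, remaining
```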
In a twelfth aspect, the present application provides a mobile terminal having a touch screen, the mobile terminal comprising: a shooting unit, a display unit, and a control unit, wherein the shooting unit is configured to capture images; the display unit is configured to display the image acquired by the shooting unit in a first screen area of the touch screen; and the control unit is configured to control the display unit to raise the display brightness of all or part of a second screen area of the touch screen above a threshold value so as to supplement light to the photographed object.
In some possible implementations, the display unit is further configured to display a preset color in all or at least one partial area of the second screen area, so as to supplement light to the photographed object.
In some possible implementations, the display unit is further configured to display a prompt pattern pointing to the camera position in the second screen area.
Drawings
FIG. 1 is a schematic diagram of a hardware structure of a mobile terminal provided in accordance with some embodiments of the present application;
FIG. 2 is a schematic diagram of part of the hardware and software of a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 3A and 3B are schematic diagrams provided in accordance with some embodiments of the present application;
FIG. 4 is a graphical user interface schematic of a mobile terminal provided in accordance with some embodiments of the present application;
FIG. 5 is a flow chart of a GUI display process in a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 6A and 6B are schematic diagrams of GUI display processes in a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 7A and 7B are schematic diagrams of GUI display processes in a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 8A and 8B are schematic diagrams of GUI display processes in a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 9A, 9B, and 9C are schematic diagrams of GUI display processes in a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 10A and 10B are schematic diagrams of GUI display processes in a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 11A and 11B are schematic diagrams of GUI display processes in a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 12A, 12B, and 12C are schematic diagrams of GUI display processes in a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 13A and 13B are schematic diagrams of GUI display processes in a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 14A and 14B are schematic diagrams of GUIs displayed in a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 15A and 15B are schematic diagrams of GUI display processes in a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 16A and 16B are schematic diagrams of GUI display processes in a mobile terminal provided in accordance with some embodiments of the present application;
FIGS. 17A, 17B, and 17C are schematic diagrams of GUI display processes in a mobile terminal provided in accordance with some embodiments of the present application;
FIG. 18 is a schematic diagram of a GUI display process in a mobile terminal provided in accordance with some embodiments of the present application.
Detailed Description
The application is further described below with reference to the drawings and examples.
Reference will now be made in detail to the embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the application. It will be apparent, however, to one skilled in the art that the application may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will be further understood that, although the terms "first," "second," etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element. For example, a first contact may be named a second contact, and similarly, a second contact may be named a first contact without departing from the scope of the application. The first contact and the second contact are both contacts, but they may not be the same contact, or may be the same contact in some scenarios.
The terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates to the contrary. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term "if" may be interpreted to mean "when" or "after" or "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrase "if [a stated condition or event] is identified" or "if [a stated condition or event] is detected" may be interpreted to mean "upon identifying [the stated condition or event]" or "in response to identifying [the stated condition or event]" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]".
Embodiments of mobile terminals, user interfaces for such mobile terminals, and associated processes for using such mobile terminals are described below. In some embodiments, the mobile terminal is a portable mobile terminal, such as a cell phone or tablet computer, that also includes other functions, such as personal digital assistant and/or music player functions. Exemplary embodiments of portable mobile terminals include, but are not limited to, portable mobile terminals equipped with various operating systems. Other portable mobile terminals may also be used, such as a laptop computer or tablet computer having a touch-sensitive surface (e.g., a touch screen display and/or a touch pad).
In the following discussion, a mobile terminal is presented that includes a display and a touch-sensitive surface. It should be understood, however, that the mobile terminal may include one or more other physical user interface devices, such as a physical keyboard, mouse, and/or joystick.
Mobile terminals typically support a variety of applications, such as one or more of the following: drawing applications, word processing applications, web page creation applications, spreadsheet applications, gaming applications, telephony applications, video conferencing applications, email applications, instant messaging applications, workout support applications, photo management applications, gallery applications, digital video camera applications, web browsing applications, digital music player applications, and/or digital video player applications.
Various applications executable on the mobile terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the mobile terminal may be adjusted and/or changed from one application to the next and/or within the corresponding application. In this way, a common physical architecture of the mobile terminal (such as a touch-sensitive surface) may support various applications with a user interface that is intuitively clear to the user.
Attention is now directed to embodiments of a portable mobile terminal having a touch-sensitive display. FIG. 1 is a block diagram illustrating a portable mobile terminal (illustrated as handset 100) having a touch-sensitive display 115 according to some embodiments. The touch-sensitive display 115 is sometimes referred to as a "touch screen", and may also be referred to as a touch-sensitive display system, or as a display system having a touch-sensitive surface and a display screen (display).
It should be understood that the handset 100 shown in fig. 1 is only one example of the mobile terminal described above, and that the handset 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
As shown in FIG. 1, the mobile phone 100 may include: a processor 101, radio frequency circuitry 102, a memory 103, an input/output subsystem 104, a Bluetooth device 105, sensors 106, a Wi-Fi device 107, a positioning device 108, audio circuitry 109, a peripheral interface 110, and a power system 111. The components may communicate via one or more communication buses or signal lines. Those skilled in the art will appreciate that the hardware architecture shown in FIG. 1 is not limiting, and that more or fewer components than shown may be included, certain components may be combined, or the components may be arranged differently.
The following describes the various components of the handset 100 in detail with reference to fig. 1:
The processor 101 is the control center of the mobile phone 100: it connects the various parts of the mobile phone 100 using various interfaces and lines, and performs the various functions and data processing of the mobile phone 100 by running or executing applications (Apps) stored in the memory 103 and calling data and instructions stored in the memory 103. In some embodiments, the processor 101 may include one or more processing units; the processor 101 may also integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 101. For example, the processor 101 may be a Kirin 960 chip manufactured by Huawei Technologies Co., Ltd. In some embodiments of the present application, the processor 101 may further include a fingerprint verification chip for verifying a collected fingerprint.
The radio frequency circuit 102 may be used for receiving and transmitting wireless signals during a call or while sending and receiving messages. Specifically, the radio frequency circuit 102 may receive downlink data from a base station and forward it to the processor 101 for processing, and it transmits uplink data to the base station. Typically, the radio frequency circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency circuit 102 may also communicate with other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile Communications, General Packet Radio Service, Code Division Multiple Access, Wideband Code Division Multiple Access, Long Term Evolution, email, and Short Message Service.
The memory 103 is used to store application programs, software, and data, and the processor 101 performs the various functions and data processing of the mobile phone 100 by running the application programs and using the data stored in the memory. The memory 103 mainly includes a program storage area and a data storage area: the program storage area can store the operating system and the application programs required by at least one function (such as a sound playing function or an image management function), while the data storage area can store data created through the use of the handset (such as audio data, phonebooks, and calendar events). In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
The input/output subsystem (I/O subsystem) 104 couples input/output devices on the handset 100, such as the touch-sensitive display 115 and other input control devices 116, to the processor 101. The I/O subsystem 104 may include a display controller 117 and one or more input controllers 118 for the other input control devices 116. The one or more input controllers 118 receive electrical signals from, and send electrical signals to, the other input control devices 116. The other input control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and the like. In other embodiments, the input controller 118 may be coupled to (or not coupled to) any of the following: a keyboard, an infrared port, a USB port, and a pointing device such as a mouse. The one or more buttons may include up/down buttons for volume control of the speaker 112 and/or the microphone 113. The one or more buttons may include a push button.
The touch sensitive display 115 provides an input interface and an output interface between the mobile terminal and the user. The display controller 117 receives electrical signals from the touch sensitive display 115 and/or transmits electrical signals to the touch sensitive display 115. The touch sensitive display 115 displays visual output to a user. Visual output may include graphics, text, icons, video, and any combination thereof (collectively, "graphics"). In some embodiments, some or all of the visual output may correspond to interface elements. Touch-sensitive display 115 is sometimes referred to for convenience as a "touch screen," and may also be referred to as a touch-sensitive display system, and may also be referred to as a display system having a touch-sensitive surface and a display screen.
The touch-sensitive display 115 has a touch-sensitive surface, display, sensor, or set of sensors that receives input from a user based on haptic and/or tactile contact. The touch-sensitive display 115 and the display controller 117 (along with any associated modules and/or sets of instructions in the memory 103) detect contact (and any movement or interruption of the contact) on the touch-sensitive display 115, converting the detected contact into interaction with interface elements (e.g., one or more soft keys, icons, web pages, or images) displayed on the touch-sensitive display 115. In an exemplary embodiment, the point of contact between the touch sensitive display 115 and the user corresponds to a finger of the user.
The touch-sensitive surface (e.g., a touch panel) may collect touch events performed by the user of the mobile phone 100 on or near it (such as the user's manipulation on or near the touch-sensitive surface using a finger, a stylus, or any other suitable object) and send the collected touch information to another device, such as the processor 101. A touch event performed by a user near the touch-sensitive surface may be referred to as a hover touch; a hover touch means that the user does not need to directly contact the touch pad in order to select, move, or drag an object (e.g., an icon), but only needs to be located near the mobile terminal in order to perform the desired function. In the context of hover touch, the terms "touch", "contact", and the like do not imply direct contact with the touch screen, but rather contact near or in proximity to it. A touch-sensitive surface capable of hover touch may be implemented using capacitive, infrared light sensing, ultrasonic, and similar technologies.

The touch-sensitive surface may comprise two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, and transmits the coordinates to the processor 101, and it can also receive and execute instructions sent by the processor 101. In addition, the touch-sensitive surface may be implemented using various types of technology, such as resistive, capacitive, infrared, and surface acoustic wave. The display (also referred to as a display screen) may be used to display information entered by or provided to the user, as well as the various menus of the mobile phone 100.
The display may be configured in the form of a liquid crystal display, an organic light-emitting diode display, or the like. The touch-sensitive surface may be overlaid on top of the display; upon detecting a touch event on or near it, the touch-sensitive surface transmits the event to the processor 101 to determine its type, whereupon the processor 101 may provide a corresponding visual output on the display depending on the type of touch event. Although in FIG. 1 the touch-sensitive surface and the display are shown as two separate components implementing the input and output functions of the mobile phone 100, in some embodiments the touch-sensitive surface may be integrated with the display to implement the input and output functions of the mobile phone 100. It will be appreciated that the touch screen 115 is formed by stacking multiple layers of material; only the touch-sensitive surface (layer) and the display screen (layer) are shown in the embodiments of the present application, and the other layers are not described here. In addition, in other embodiments of the present application, the touch-sensitive surface may cover the display and be larger than the display, so that the display is entirely covered beneath the touch-sensitive surface; or the touch-sensitive surface may be configured on the front of the mobile phone 100 as a full panel, that is, any touch by the user on the front of the mobile phone 100 can be perceived by the phone, achieving a full touch experience on the front of the phone. In other embodiments, the touch-sensitive surface is configured on the front of the mobile phone 100 as a full panel, and the display may also be configured on the front of the mobile phone 100 as a full panel, enabling a bezel-free structure on the front of the phone.
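The touch pipeline described above (detection device → touch controller → processor → visual output) can be sketched as follows. The scaling factor, event names, and function names are assumptions for illustration, not details from the patent:

```python
# Illustrative pipeline: the detection device reports a raw sensor reading,
# the touch controller converts it into touch-point coordinates, and the
# processor chooses a visual output for the touch event type.

def touch_controller(raw_signal):
    # Convert a raw (row, col) sensor reading into screen coordinates
    # (assumed 2x scaling from sensor grid to pixels).
    row, col = raw_signal
    return {"x": col * 2, "y": row * 2}


def handle_touch(event_type, coords):
    # The processor provides a visual output matching the event type.
    if event_type == "tap":
        return f"highlight element at ({coords['x']}, {coords['y']})"
    return "no-op"
```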
The touch-sensitive display 115 may use LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, but other display technologies may be used in other embodiments. The touch sensitive display 115 and the display controller 117 may detect contact and any movement or interruption thereof using any of a variety of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch sensitive display 115. In an exemplary embodiment, a projected mutual capacitance sensing technique is used.
The touch sensitive display 115 may have a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of about 160 dpi. The user may contact the touch-sensitive display 115 using any suitable object or appendage, such as a stylus, finger, or the like. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which may be less accurate than stylus-based inputs due to a larger contact area of a finger on the touch screen. In some embodiments, the device translates the finger-based coarse input into a precise pointer/cursor position or command to perform the action desired by the user.
In some embodiments, the touch screen may include a sensor group having pressure sensing.
In other embodiments, the cell phone 100 may include a touch pad (not shown) for activating or deactivating a specific function in addition to the touch screen. In some embodiments, the touch pad is a touch sensitive area of the mobile terminal that, unlike the touch screen, does not display visual output. The touch pad may be a touch sensitive surface separate from the touch sensitive display 115 or an extension of the touch sensitive surface formed by the touch screen.
The handset 100 may also include one or more light sensors 119. FIG. 1 shows a light sensor 119 coupled to a light sensor controller 120 in the I/O subsystem 104. The light sensor 119 may include a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensor 119 receives light projected from the external environment through one or more lenses and converts the light into data representing an image. In conjunction with the camera module 213, the light sensor 119 may capture still images or video. In some embodiments, one or more light sensors are located on the front and/or rear of the handset 100, opposite the touch-sensitive display 115 on the front of the device, so that the touch-sensitive display 115 can be used as a viewfinder for still image and/or video image acquisition. In some embodiments, another one or more light sensors are located on the front of the device so that a user can obtain an image of himself or herself for a video conference while watching the other video conference participants on the touch screen display. The light sensor 119 may be a camera.
In various embodiments of the present application, the mobile phone 100 may also have a fingerprint recognition function. For example, the fingerprint identifier may be configured on the back side of the mobile phone 100 (e.g., below the rear camera) or on the front side of the mobile phone 100 (e.g., below the touch screen 115). In addition, the fingerprint recognition function may also be implemented by configuring a fingerprint recognizer in the touch screen 115, i.e., the fingerprint recognizer may be integrated with the touch screen 115 to implement the fingerprint recognition function of the mobile phone 100. In this case, the fingerprint identifier may be configured in the touch screen 115, may be a part of the touch screen 115, or may be otherwise configured in the touch screen 115. In addition, the fingerprint sensor may be implemented as a full panel fingerprint sensor, and thus the touch screen 115 may be considered as a panel where fingerprint collection can be performed anywhere. The fingerprint identifier may send the acquired fingerprint to the processor 101 for processing (e.g., fingerprint verification, etc.) of the fingerprint by the processor 101. The primary component of the fingerprint identifier in embodiments of the present application is a fingerprint sensor that may employ any type of sensing technology, including but not limited to optical, capacitive, piezoelectric, or ultrasonic sensing technologies, and the like.
In addition, for a specific technical solution of integrating a fingerprint acquisition device into a touch screen in an embodiment of the present application, reference may be made to the patent application published by the United States Patent and Trademark Office as US 2015/0036065 A1, entitled "fingerprint sensor in a mobile terminal", the entire contents of which are incorporated by reference in the various embodiments of the present application.
The bluetooth device 105 is configured to establish a wireless link with other mobile terminals (e.g., smart watch, tablet computer, etc.) through a bluetooth communication protocol, and perform data interaction.
The handset 100 may also include at least one sensor 106, such as the light sensor 119 described above, a motion sensor, and other sensors. Specifically, the light sensor 119 may include an ambient light sensor and a proximity sensor: the ambient light sensor may adjust the brightness of the display panel of the touch-sensitive display 115 according to the brightness of the ambient light, and the proximity sensor may turn off the power of the display panel when the mobile phone 100 is moved to the ear. As one type of motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally along three axes) and can detect the magnitude and direction of gravity when stationary; it can be used in applications that recognize the attitude of the mobile terminal (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and in vibration-recognition-related functions (such as a pedometer or tap detection). Other sensors that may be configured in the mobile phone 100, such as a gyroscope, barometer, hygrometer, thermometer, and infrared sensor, are not described here.
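For illustration, the sketch below shows one way accelerometer readings could drive the landscape/portrait switching mentioned above, by comparing the gravity vector against the screen axes. The function name, threshold, and classification scheme are assumptions for illustration, not values from this application.

```python
import math

def detect_orientation(ax, ay, az, threshold_deg=30.0):
    """Classify device attitude from a three-axis accelerometer reading
    (any consistent unit, e.g. m/s^2).

    Returns "portrait", "landscape", or "flat" (gravity mostly along the
    z axis, i.e. the device lies face-up or face-down).
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        return "flat"
    # Tilt of the screen plane relative to the gravity direction.
    tilt = math.degrees(math.acos(min(1.0, abs(az) / g)))
    if tilt < threshold_deg:
        return "flat"
    # Gravity mostly along the y axis -> device held upright.
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

A real implementation would additionally filter jitter and apply hysteresis so the UI does not flip back and forth near the threshold.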
The Wi-Fi device 107 provides the mobile phone 100 with Wi-Fi network access. Through the Wi-Fi device 107, the mobile phone 100 can help the user send and receive e-mail, browse web pages, and access streaming media, providing the user with wireless broadband Internet access.
The positioning device 108 is configured to provide the mobile phone 100 with geographic location information, which may indicate the current geographic location of the mobile phone. It is understood that the positioning device 108 may be a receiver of a positioning communication system such as the Global Positioning System (GPS), the BeiDou satellite navigation system, or the Russian GLONASS system. After receiving the geographic location information sent by the positioning communication system, the positioning device 108 sends the information to the processor 101 for further processing, or to the memory 103 for storage.
Audio circuitry 109, speaker 112, and microphone 113 may provide an audio interface between the user and the handset 100. On one hand, the audio circuit 109 may convert received audio data into an electrical signal and transmit it to the speaker 112, which converts the electrical signal into a sound signal for output; the speaker 112 may include a power amplifier speaker, a receiver, and the like. On the other hand, the microphone 113 converts a collected sound signal into an electrical signal, which the audio circuit 109 receives and converts into audio data; the audio data is then output to the radio frequency circuit 102 for transmission to, for example, another cellular phone, or to the memory 103 for further processing.
The handset 100 may also include a power system 111 (including power failure detection circuitry, a power converter or inverter, a power status indicator, a battery, and a power management chip) that provides power to the various components. The battery may be logically connected to the processor 101 through a power management chip to manage charging, discharging, and power consumption through the power system 111.
The handset 100 may also include a peripheral interface 110 for providing an interface to external input/output devices (e.g., mouse, keyboard, external display, external memory, etc.). Peripheral interface 110 may be used to couple input and output peripheral devices of the device to processor 101 and memory 103. The processor 101 runs or executes various applications and/or instruction sets stored in the memory 103 to perform various functions of the handset 100 and to process data. In some embodiments, the processor 101 may include an image signal processor and a dual-core or multi-core processor.
Although not shown, the handset 100 may also include a camera (front and/or rear), a Subscriber Identity Module (SIM) card slot, a flash, a micro-projection device, a Near Field Communication (NFC) device, etc.
In some embodiments, as shown in fig. 2, the software stored in memory 103 may include an operating system 201, a communication module (or instruction set) 202, a contact/movement module (or instruction set) 203, a graphics module (or instruction set) 204, a text input module (or instruction set) 205, a positioning module (or instruction set) 206, a Wi-Fi module (or instruction set) 207, and an application program (or instruction set) 208. Furthermore, in some embodiments, memory 103 may also store a device/global internal state 209, as shown in fig. 1 and 2. The device/global internal state 209 may include at least one of the following states: an active application state, indicating which applications (if any) are currently active; a display state, indicating what applications, views, or other information occupy various areas of the touch screen 115; a sensor state, including information obtained from the various sensors and the peripheral interface 110 of the mobile terminal; and information about the geographic location and attitude of the mobile terminal.
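The device/global internal state 209 described above can be sketched as a simple data structure holding the four kinds of state; the field names and value shapes below are illustrative assumptions, not identifiers from this application.

```python
from dataclasses import dataclass, field

@dataclass
class DeviceGlobalState:
    """Illustrative sketch of device/global internal state 209."""
    # Active application state: which applications (if any) are active.
    active_applications: list = field(default_factory=list)
    # Display state: what occupies various areas of the touch screen, e.g.
    # {"first_screen_area": "home_screen", "second_screen_area": "status_bar"}.
    display_state: dict = field(default_factory=dict)
    # Sensor state: latest readings from sensors and the peripheral interface.
    sensor_state: dict = field(default_factory=dict)
    # Geographic location (e.g. latitude/longitude) and attitude of the terminal.
    location: tuple = None
    attitude: tuple = None
```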
Operating system 201 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, ANDROID, or an embedded operating system such as VxWorks) includes various software and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between the various hardware and software components. Furthermore, in some embodiments, memory 103 may also store a gallery 210.
The communication module 202 may communicate with other mobile terminals through the peripheral interface 110 and may also include various software for processing data received by the radio frequency circuitry 102 and/or the peripheral interface 110. The peripheral interface (e.g., Universal Serial Bus (USB), FireWire, etc.) is adapted to be coupled directly to other mobile terminals or indirectly through a network (e.g., the Internet, a wireless LAN, etc.). In some embodiments, the peripheral interface 110 may also be a multi-pin (e.g., 30-pin) connector that is the same as or similar to and/or compatible with the 30-pin connector used on iPod (trademark of Apple Inc.) devices.
The contact/movement module 203 may detect contact with the touch screen 115 and other touch-sensitive devices (e.g., a touch pad or physical click wheel). The contact/movement module 203 may include various software for performing operations related to contact detection, such as determining whether contact has occurred (e.g., detecting a finger-down event), determining whether the contact is moving and tracking the movement across the touch panel (e.g., detecting one or more finger-drag events), and determining whether contact has terminated (e.g., detecting a finger-up event or a break in contact). The contact/movement module 203 receives contact data from the touch panel. Determining movement of the point of contact, which is represented by a series of contact data, may include determining a speed (magnitude), a velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one-finger contacts) or to multiple simultaneous contacts (e.g., "multi-touch"/multi-finger contacts).
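As a minimal sketch of determining velocity from the series of contact data mentioned above, the function below differences the two most recent timestamped samples; the sample format and function name are illustrative assumptions.

```python
def contact_velocity(samples):
    """Estimate the velocity (magnitude and direction) of a contact point
    from a series of (t, x, y) contact samples, t in seconds and x, y in
    pixels.

    Returns (vx, vy) in pixels per second for the most recent pair of
    samples, or (0.0, 0.0) if fewer than two samples exist.
    """
    if len(samples) < 2:
        return (0.0, 0.0)
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    dt = t1 - t0
    if dt <= 0:  # guard against out-of-order or duplicate timestamps
        return (0.0, 0.0)
    return ((x1 - x0) / dt, (y1 - y0) / dt)
```

A production implementation would typically smooth over several samples rather than just the last two.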
The contact/movement module 203 may also detect gesture inputs of the user. Different types of gestures on the touch-sensitive surface have different contact patterns, so the type of gesture may be detected by detecting a specific contact pattern. For example, detecting a single-finger tap gesture includes detecting a finger-down event and then detecting a finger-up (lift-off) event at the same location (or substantially the same location) as the finger-down event (e.g., at an icon location). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event, then detecting one or more finger-drag events, and then detecting a finger-up (lift-off) event.
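The two contact patterns just described (finger-down then finger-up at substantially the same location versus finger-down, drag(s), finger-up) can be sketched as a minimal classifier. The event format and the slop threshold are illustrative assumptions, not values from this application.

```python
def classify_gesture(events):
    """Classify a touch event sequence as "tap" or "swipe".

    Each event is a (kind, x, y) tuple with kind in {"down", "drag", "up"}.
    """
    SLOP = 10  # max movement (pixels) still counted as "same location"
    if not events or events[0][0] != "down" or events[-1][0] != "up":
        return "unknown"
    _, x0, y0 = events[0]
    _, x1, y1 = events[-1]
    moved = abs(x1 - x0) > SLOP or abs(y1 - y0) > SLOP
    has_drag = any(kind == "drag" for kind, _, _ in events)
    if has_drag and moved:
        return "swipe"  # finger-down, drag event(s), finger-up elsewhere
    return "tap"        # finger-up at (substantially) the same location
```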
The types of gestures may vary. For example:
● Flick gesture: a single finger taps the touch-sensitive surface of the touch-sensitive display 115, swipes quickly, and then leaves the touch-sensitive surface, e.g., to scroll the screen up and down or switch pictures left and right;
● Slide gesture: a single finger taps the touch-sensitive surface and then maintains contact while moving, e.g., slide-to-unlock;
● Swipe gesture: multiple fingers contact the touch-sensitive surface and then maintain contact while moving, e.g., a three-finger swipe to return to the main interface;
● Tap gesture: a single finger taps the touch-sensitive surface and then immediately leaves it;
● Double Tap gesture: two taps at the same location within a very short time;
● Touch & Hold gesture: a finger taps the touch-sensitive surface and remains stationary;
● Drag gesture: a finger taps the touch-sensitive surface and then moves slowly without leaving it (typically toward a definite target position, such as dragging a file onto a trash can to delete it);
● Pinch gesture: two fingers (typically the thumb and index finger) close together on the touch-sensitive surface;
● Unpinch gesture: two fingers (typically the thumb and index finger) spread apart on the touch-sensitive surface.
It will be appreciated that gestures other than those listed above are possible, and the type of gesture is not limited in this embodiment.
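Of the gestures above, Pinch and Unpinch can be distinguished purely by whether the distance between the two contacts shrinks or grows. The sketch below is a hedged illustration; the function name and jitter threshold are assumptions.

```python
import math

def classify_pinch(p1_start, p2_start, p1_end, p2_end, min_change=20.0):
    """Classify a two-finger gesture as "pinch" (fingers close together)
    or "unpinch" (fingers spread apart) from the start and end (x, y)
    positions of the two contacts. min_change (pixels) ignores jitter.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    delta = dist(p1_end, p2_end) - dist(p1_start, p2_start)
    if delta <= -min_change:
        return "pinch"      # contacts moved toward each other
    if delta >= min_change:
        return "unpinch"    # contacts moved apart
    return "none"           # movement too small to classify
```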
Graphics module 204 includes a number of known software for rendering and displaying graphics on touch-sensitive display 115 or other displays, including means for changing the intensity of the graphics displayed. As used herein, the term "graphic" includes any object that may be displayed to a user, including without limitation text, web pages, icons (such as user interface objects including soft keys), digital images, video, animation, and the like.
In some embodiments, the graphics module 204 stores data representing the graphics to be used. Each graphic may be assigned a corresponding code. The graphics module 204 receives, from an application program or the like, one or more codes designating the graphics to be displayed, together with coordinate data and other graphic attribute data if necessary, and then generates screen image data to output to the display controller 117.
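The code-to-graphic mapping just described can be sketched as follows; class and method names are illustrative assumptions, and "screen image data" is simplified to a draw list handed to the display controller.

```python
class GraphicsModule:
    """Illustrative sketch of the graphics module's code assignment:
    each registered graphic receives a code, and callers request display
    by code plus coordinate data."""

    def __init__(self):
        self._graphics = {}  # code -> graphic data
        self._next_code = 1

    def register(self, graphic):
        """Assign a corresponding code to a graphic and return it."""
        code = self._next_code
        self._next_code += 1
        self._graphics[code] = graphic
        return code

    def render(self, requests):
        """requests: iterable of (code, x, y) tuples from applications.
        Returns a draw list standing in for the screen image data that
        would be output to the display controller 117."""
        return [(self._graphics[code], x, y) for code, x, y in requests]
```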
Text input module 205, which may be a component of graphics module 204, provides a soft keyboard for entering text in a variety of applications 208 (e.g., contact 208-1, browser 208-6, and any other application requiring text input).
The location module 206 is used to determine the geographic location of the mobile terminal and provide this information for use in various applications 208 (e.g., to the telephone module 208-2 for geographic-location-based dialing, to the camera module 208-3 as picture/video metadata, and to other applications providing geographic-location-based services, such as the weather applet 211-1 and the map/navigation applet 211-2).
The Wi-Fi module 207 is used to run the various instructions required by the Wi-Fi device 107.
The application 208 may include the following modules (or instruction sets) or a subset or superset thereof:
● a contacts module 208-1 (sometimes also referred to as an address book or contact list) for managing stored contacts;
● a telephone module 208-2;
● a camera module 208-3 for still and/or video images, which receives instructions from the user for digital imaging (photographing) with the light sensor 119;
● an image management module 208-4 for editing, deleting, moving, renaming, etc. the gallery 210 stored in the memory 103;
● an exercise support module 208-5;
● a browser module 208-6;
● desktop applet modules 211, which may include one or more of the following: weather applet 211-1, map/navigation applet 211-2, other applets obtained by the user, and user-defined applet 211-3;
● a multimedia player module (i.e., video and music player module) 208-7, which may be composed of a video player module and a music player module;
● a word processing module 208-8;
● a videoconference module 208-9;
● an email module 208-10;
● an instant message module 208-11;
● a notification module 208-12;
● a map module 208-13;
● a calendar module 208-14; and/or
● an application store module 208-15.
Examples of other applications 208 that may be stored in memory 103 include other word processing applications, other image editing applications, drawing applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with the touch-sensitive display 115, the contact/movement module 203, the graphics module 204, and the text input module 205, the contacts module 208-1 may be used to manage an address book or contact list (e.g., stored in the application internal state of the contacts module 208-1 in memory 103), including: adding names to the address book; deleting names from the address book; associating a telephone number, email address, home address, or other information with a name; associating an image with a name; classifying and categorizing names; providing a telephone number or email address to initiate and/or facilitate communication via the telephone module 208-2 or email; and so forth.
In conjunction with the radio frequency circuitry 102, the audio circuitry 109, the speaker 112, the microphone 113, the touch-sensitive display 115, the display controller 117, the movement/contact module 203, the graphics module 204, and the text input module 205, the telephony module 208-2 may be used to input a sequence of characters corresponding to a telephone number, access one or more telephone numbers in the contact module 208-1, modify an already entered telephone number, dial a corresponding telephone number, make a call, and disconnect or hang up when the call is completed. As described above, wireless communication may use any of a number of communication standards, protocols, and techniques.
In conjunction with radio frequency circuitry 102, audio circuitry 109, speaker 112, microphone 113, touch-sensitive display 115, display controller 117, light sensor 119, light sensor controller 120, movement/contact module 203, graphics module 204, text input module 205, contacts module 208-1, and telephony module 208-2, videoconferencing module 208-9 includes executable instructions for initiating, conducting, and ending a videoconference between a user and one or more other participants according to user instructions.
In conjunction with the radio frequency circuitry 102, the touch sensitive display 115, the display controller 117, the movement/contact module 203, the graphics module 204, and the text input module 205, the email client module 208-10 includes executable instructions for creating, sending, receiving, and managing emails in response to user instructions. In conjunction with the image management module 208-4, the email client module 208-10 makes it very easy to create and send emails with still or video images captured by the camera module 208-3.
In conjunction with the radio frequency circuit 102, the touch sensitive display 115, the display controller 117, the contact module 203, the graphics module 204, and the text input module 205, the instant message module 208-11 may include executable instructions for inputting a sequence of characters corresponding to an instant message, modifying previously entered characters, transmitting a corresponding instant message (e.g., using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for phone-based instant messages or using XMPP, SIMPLE, or IMPS for internet-based instant messages), receiving an instant message, and viewing a received instant message. In some embodiments, the transmitted and/or received instant messages may include graphics, photos, audio files, video files, and/or other attachments supported in an MMS and/or Enhanced Message Service (EMS). As used herein, "instant message" refers to both telephone-based messages (e.g., messages sent using SMS or MMS) and internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
In conjunction with radio frequency circuitry 102, touch-sensitive display 115, display controller 117, movement/contact module 203, graphics module 204, text input module 205, positioning module 206, map module 208-13, and multimedia player module 208-7, workout support module 208-5 includes executable instructions for creating a workout (e.g., with time, distance, and/or calorie consumption goals); communication with an exercise sensor (sports device); receiving exercise sensor data; calibrating a sensor for monitoring exercise; selecting and playing music for an exercise; and displaying, storing and transmitting the exercise data.
In conjunction with the touch sensitive display 115, the display controller 117, the light sensor 119, the light sensor controller 120, the movement/contact module 203, the graphics module 204, and the image management module 208-4, the camera module 208-3 includes executable instructions for capturing still images or video (including video streams) and storing them in the memory 103 (e.g., in the gallery 210), modifying characteristics of the still images or video, or deleting still images or video from the memory 103 (e.g., from the gallery 210).
In conjunction with the touch-sensitive display 115, the display controller 117, the movement/contact module 203, the graphics module 204, the text input module 205, and the camera module 208-3, the image management module 208-4 includes executable instructions for arranging, modifying (e.g., editing), or otherwise manipulating, tagging, deleting, presenting (e.g., in a digital slide or album), and storing still images and/or video images (including still images and/or video images stored in the gallery 210).
In conjunction with the radio frequency circuitry 102, the touch sensitive display 115, the display controller 117, the movement/contact module 203, the graphics module 204, and the text input module 205, the browser module 208-6 includes executable instructions for browsing the Internet (including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages) according to user instructions.
In conjunction with the radio frequency circuitry 102, the touch sensitive display 115, the display controller 117, the movement/contact module 203, the graphics module 204, the text input module 205, the email client module 208-10, and the browser module 208-6, the calendar module 208-14 includes executable instructions for creating, displaying, modifying, and storing a calendar and data associated with the calendar (e.g., calendar entries, to-do task lists, etc.) according to user instructions.
In conjunction with the radio frequency circuitry 102, the touch sensitive display 115, the display controller 117, the movement/contact module 203, the graphics module 204, the text input module 205, and the browser module 208-6, the desktop applet modules 211 are mini-applications that may be downloaded and used by a user (e.g., weather applet 211-1, map/navigation applet 211-2) or created by a user (e.g., user-defined applet 211-3). In some embodiments, a desktop applet includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a desktop applet includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! widgets).
In conjunction with the touch-sensitive display 115, the display controller 117, the movement/contact module 203, the graphics module 204, the audio circuit 109, the speaker 112, the radio frequency circuit 102, and the browser module 208-6, the multimedia player module 208-7 includes executable instructions that allow a user to download and playback recorded music and other sound files stored in one or more file formats (such as MP3 or AAC files), as well as executable instructions for displaying, presenting, or otherwise playing back video (e.g., on the touch-sensitive display 115 or on an external display connected via the peripheral interface 110). In some embodiments, the handset 100 may include the functionality of an MP3 player.
In conjunction with the radio frequency circuitry 102, the touch sensitive display 115, the display controller 117, the movement/contact module 203, the graphics module 204, the text input module 205, the location module 206, and the browser module 208-6, the map module 208-13 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving routes; data of stores and other points of interest at or near a particular geographic location; and other geographic location-based data) according to user instructions.
In conjunction with the radio frequency circuitry 102, the touch sensitive display 115, the display controller 117, the movement/contact module 203, the graphics module 204, the text input module 205, and the browser module 208-6, the application store module 208-15 may be used to receive and display data related to the application store, such as prices and content, according to user instructions.
In conjunction with the touch-sensitive display 115, the display controller 117, the movement/contact module 203, and the graphics module 204, the notification module 208-12 includes executable instructions that display notifications or alerts (such as incoming messages or incoming calls, calendar event reminders, application events, and the like) on the touch-sensitive display 115.
Each of the modules and applications described above corresponds to a set of executable instructions for performing one or more of the functions described above, as well as the methods described in the present disclosure (e.g., computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments. In some embodiments, memory 103 may store a subset of the modules and data structures described above. Furthermore, the memory 103 may store additional modules and data structures not described above.
Each of the above identified elements in fig. 2 may be stored in one or more of the aforementioned memories 103. Each of the identified modules corresponds to a set of instructions for performing the functions described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments. In some embodiments, memory 103 may store subsets of the modules and data structures described above. In addition, the memory 103 may store additional modules and data structures not described above.
In some embodiments, the mobile terminal is a device on which operation of a predefined set of functions is performed exclusively by the touch screen and/or touch pad. By using a touch screen and/or a touch pad as the primary input control device for operation of the mobile terminal, the number of physical input control devices (such as push buttons, dials, etc.) on the mobile terminal may be reduced.
The predefined set of functions that may be performed by the touch screen and/or the touch pad include navigation between user interfaces. In some embodiments, the mobile terminal is navigated to a main menu, or a root menu from any graphical user interface that may be displayed on the mobile terminal when the touch pad is touched by a user. In such embodiments, the touch pad may be referred to as a "menu button". In some other embodiments, the menu buttons may be physical push buttons or other physical input control devices, rather than a touch pad.
The following embodiments may be implemented in a mobile terminal (e.g., handset 100) having the hardware described above.
Fig. 3A and 3B illustrate a mobile terminal (exemplified by the handset 100) having a touch screen according to some embodiments. The mobile terminal 100 may further include physical devices such as a camera 306, an earpiece 307, and an ambient light sensor (not shown). The camera may include one or more front cameras and one or more rear cameras: the front cameras are disposed on the same surface as the display screen, and the rear cameras are disposed on the back of the display screen. A front camera may be disposed on the surface of the mobile phone, closely arranged with the display screen. For example, as shown in fig. 3A, the front camera 306 is located near the center of the top of the mobile phone, with the earpiece 307 above it; or, as shown in fig. 3B, the front camera 306 is located near the top of the mobile terminal 100 and offset to one side, with the earpiece 307 directly above it. The touch screen 115 is an irregularly shaped (non-rectangular) touch screen: an irregular portion of the screen area of the touch screen 115 extends into the areas on both sides of the front camera 306 at the top of the mobile terminal 100. This provides working space for devices such as the camera 306 and the earpiece 307 while greatly increasing the screen-to-body ratio and improving the visual effect. In the following embodiments, the front camera 306 is disposed at the top center of the mobile terminal 100.
The touch screen 115 may display one or more graphics within a Graphical User Interface (GUI). In this embodiment, as well as in other embodiments described below, the mobile terminal may detect a user's operation of selecting one or more of these graphics by making a gesture on the graphics, for example with one or more fingers 301 (not drawn to scale in the figures) or with one or more styluses (not shown). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the gesture may include one or more taps, one or more swipes (left to right, right to left, up and/or down), and/or a rolling of a finger (right to left, left to right, up and/or down) that has made contact with the mobile terminal 100. In some embodiments, inadvertent contact with a graphic does not select the graphic. For example, when the gesture corresponding to selection is a tap, a swipe gesture that passes over an application icon does not select the corresponding application.
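The rule that a tap inside an icon's bounds selects it while a swipe that merely passes over it does not can be sketched as a small hit-test. The icon layout, names, and function signature below are illustrative assumptions.

```python
def select_icon(gesture, point, icons):
    """Return the name of the icon selected by a gesture, or None.

    Only the gesture corresponding to selection (here, a tap) can select;
    an inadvertent swipe over an icon selects nothing.
    icons: {name: (x, y, w, h)} bounding boxes in pixels.
    """
    if gesture != "tap":
        return None
    px, py = point
    for name, (x, y, w, h) in icons.items():
        if x <= px < x + w and y <= py < y + h:
            return name
    return None
```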
The mobile terminal 100 may also include one or more physical buttons, such as a home screen button 303, a menu button 304, a back button 305. As previously described, menu button 304 may be used to navigate to any application 208 in a set of applications that may be running on mobile terminal 100. In other embodiments, the buttons may each be implemented as virtual keys in a Graphical User Interface (GUI) displayed on the touch screen 115.
In some embodiments, the mobile terminal 100 may also include a push button for powering the device on and off and locking the device, volume adjustment button(s), a Subscriber Identity Module (SIM) card slot, a headset jack, and a docking/charging external port, among others. The push button may be used to power the device on or off by pressing it and holding it in the depressed state for a predefined time interval; to lock the device by pressing it and releasing it before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlocking process. In an alternative embodiment, the device 100 may also accept voice input through the microphone 113 for activating or deactivating certain functions.
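The push-button behavior above reduces to a comparison of hold duration against the predefined interval. The sketch below is illustrative only; the 3-second interval and the action names are assumptions, not values from this application.

```python
def handle_power_button(hold_seconds, device_on, predefined_interval=3.0):
    """Dispatch a push-button press: holding for at least the predefined
    interval toggles power; a shorter press locks the device (if on) or
    begins the unlocking process (if locked/off-screen)."""
    if hold_seconds >= predefined_interval:
        return "power_off" if device_on else "power_on"
    return "lock" if device_on else "start_unlock"
```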
Attention is now directed to an embodiment of a Graphical User Interface (GUI) that may be implemented on the mobile terminal 100.
As shown in fig. 4, the mobile terminal provided in the embodiment of the present application has a display screen and a camera. The display screen may be a touch screen, and the touch screen includes a first screen area 308 and a second screen area 309. The first screen area 308 is a rectangular area, and the second screen area 309 is the area where the touch screen extends from the first screen area 308 to the periphery of the front camera. Therefore, unlike the regular rectangular touch screen in the prior art, the display area of the touch screen 115 extends into the areas on both sides of the front camera 306 at the top of the mobile terminal 100, which provides working space for devices such as the camera 306 and the earpiece 307 while greatly increasing the screen-to-body ratio of the touch screen 115 and improving the visual effect. In some of the following embodiments, the front camera 306 is disposed at the top center of the mobile terminal 100, as illustrated in the drawings.
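For illustration, the geometry of the two screen areas can be sketched as a pixel-to-region mapping: the top strip on either side of the camera cutout belongs to the second screen area 309, and everything below belongs to the rectangular first screen area 308. All dimensions below are illustrative assumptions, not values from this application.

```python
def screen_region(x, y, notch_h=80, cutout=(440, 0, 200, 80)):
    """Map a pixel (x, y), with y measured down from the top edge, to the
    screen areas described above (illustrative dimensions):
    - the camera cutout itself has no display;
    - the strip beside the cutout (y < notch_h) is second screen area 309;
    - the regular rectangle below is first screen area 308.
    cutout is an (x, y, width, height) bounding box for the front camera.
    """
    cx, cy, cw, ch = cutout
    if cx <= x < cx + cw and cy <= y < cy + ch:
        return "camera cutout"
    if y < notch_h:
        return "second screen area 309"
    return "first screen area 308"
```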
In an embodiment of the present application, the first screen area 308 and the second screen area 309 may be controlled by the same display controller 117.
In some embodiments, upon lighting the display screen or detecting that the user clicks the home screen button 303, the mobile terminal displays a series of exemplary graphical user interfaces (Graphical User Interface, GUI) on the touch screen, including a home screen in the first screen area 308 and device status icons such as the Wi-Fi signal, communication signal, battery level, and time in the second screen area 309; application icons such as text message, mail, camera, and music may also be displayed in the second screen area.
The home screen may include a plurality of sub-screens, each of which may include different content. Taking fig. 4 as an example, the GUI may be a first sub-screen in the home screen of the mobile terminal. The first sub-screen is displayed on the touch screen of the mobile terminal and may include a home screen indicator 403 and icons of various Apps. The home screen indicator 403 is used to prompt the user as to which sub-screen is currently displayed. Illustratively, the first sub-screen includes App icons arranged in a plurality of rows and columns; when the mobile terminal detects that a finger (or a stylus or the like) of the user touches the position of a certain App icon on the touch screen, the mobile terminal can open the graphical user interface of the corresponding App in response to the touch event. It will be appreciated that in other embodiments, the home screen may further include a Dock, which may contain icons of frequently used Apps, and so on, which will not be described again.
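The home screen indicator's behavior (highlighting which of several sub-screens is currently displayed) can be sketched in one line; the dot rendering is an illustrative assumption about how such an indicator is commonly drawn, not a detail from this application.

```python
def home_screen_indicator(current, total):
    """Render a home-screen indicator as a string of dots, with a filled
    dot for the sub-screen currently displayed. `current` is 0-based."""
    return "".join("●" if i == current else "○" for i in range(total))
```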
In connection with fig. 4, in some embodiments, the graphical user interface 400 illustratively includes the following elements, or a subset or superset thereof:
Displayed in the first screen area 308:
● Home screen indicator 403;
● Home screen button 303;
● Menu button 304;
● Return button 305;
● Icons of applications, such as:
○ Email 404;
○ Calendar 405;
○ Settings 406, which provides access to settings of the device 100 and its various applications 208;
○ Camera 407;
○ Phone 408;
○ Contacts 409;
○ Short message 410;
○ Gallery 411;
○ Weather 412;
○ Reading 413;
○ Application store 414;
○ Map 415;
○ Browser 416;
○ Video and music player 417; and
○ Instant messaging application (WeChat) 418.
Displayed in the second screen area 309:
● Wireless communication signal strength indicator 421;
● Time indicator 402;
● Battery status indicator 401;
● Wi-Fi signal strength indicator 419;
● Alarm clock indicator 420;
● Icon 422 of an application; and the like.
Further, while the following examples are primarily presented with reference to finger inputs (e.g., single-finger contacts, single-finger flick gestures, single-finger swipe gestures), it should be understood that in some embodiments one or more of these finger inputs are replaced by input from another input device (e.g., stylus input).
Attention is now directed to embodiments of a user interface ("UI") and associated processes that may be implemented on a mobile terminal, such as the portable mobile terminal 100, having a display and a touch-sensitive surface. The mobile terminal 100 of the present application aims to make full use of the discontinuous second screen area 309 of the screen 115, provide various interactive functions, and improve the user experience.
Herein, unless specifically stated otherwise, the gesture of the user is flexible and may be a click, double click, swipe, circle, line, single finger touch, or multi-finger touch, among others. Those of ordinary skill in the art will appreciate that the selection of a particular gesture is flexible so long as substantially the same effect is achieved. In this context, unless otherwise specified, the location or area where the gesture of the user acts on the touch-sensitive surface is also flexible, and may be an area of or near an application interface element displayed by the display screen, a blank area where the application interface element is not displayed by the display screen, an area where a function is set by the display screen, and so on. Those of ordinary skill in the art will appreciate that the specific location or area of the gesture on the touch-sensitive surface may be flexibly set so long as substantially the same effect is achieved.
In some embodiments, the mobile terminal 100 may determine whether a gesture is intended to operate on an application icon or on a non-application-icon area by detecting attributes of the finger 301 such as its pressure, duration, contact area, location, distance from one or more application icons, or distance from a boundary of the touch-sensitive surface, and respond to the gesture accordingly, for example by launching an application or flipping to another sub-screen.
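The gesture-disambiguation logic described above can be sketched as follows. This is a minimal illustration only: the `classify_touch` helper, its field names, and the 300 ms tap threshold are assumptions for the sketch and do not appear in the embodiment.

```python
def classify_touch(touch, icon_bounds, tap_max_duration_ms=300):
    """Decide whether a touch targets an application icon or an empty area.

    `touch` is a dict with keys x, y, duration_ms; `icon_bounds` maps icon
    names to (left, top, right, bottom) rectangles. All names and
    thresholds here are illustrative assumptions.
    """
    for name, (left, top, right, bottom) in icon_bounds.items():
        if left <= touch["x"] <= right and top <= touch["y"] <= bottom:
            if touch["duration_ms"] <= tap_max_duration_ms:
                return ("launch_app", name)   # short tap on an icon
            return ("icon_menu", name)        # long press on an icon
    return ("flip_sub_screen", None)          # gesture on a non-icon area
```

A real implementation would also weigh pressure and contact area, but the dispatch structure — match the contact against icon rectangles, then branch on the gesture attributes — stays the same.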
Fig. 5 illustrates a flow diagram of a method of displaying a graphical user interface provided in accordance with some embodiments. The method is performed on a mobile terminal having a display, such as the mobile terminal 100 described in the following embodiments. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in the method may be combined and/or the order of some operations may be changed.
As shown in fig. 5, the method includes steps 501-503. Firstly, in step 501, the mobile terminal displays an application interface of a first application in the first screen area; then, in step 502, when the second application has a notification, the mobile terminal displays the notification in the second screen area; in step 503, when a first touch operation performed on the notification is detected, the mobile terminal continues to display an application interface of the first application in the first screen area, and displays an application interface related to the notification of the second application on the application interface of the first application.
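Steps 501-503 can be modeled as a small state sketch: the first screen area keeps the first application's interface throughout, while the notification and its overlay live alongside it. All class and method names below are illustrative assumptions, not part of the claimed method.

```python
class DualAreaDisplay:
    """Minimal model of steps 501-503: the first screen area shows the
    first application; notifications of a second application appear in
    the second screen area; touching a notification overlays the second
    application's interface without leaving the first application."""

    def __init__(self):
        self.first_area = None   # content of first screen area 308
        self.second_area = []    # notifications in second screen area 309
        self.overlay = None      # interface overlaid on the first area

    def show_app(self, app):                           # step 501
        self.first_area = f"{app} interface"

    def notify(self, app, text):                       # step 502
        self.second_area.append((app, text))

    def touch_notification(self, index):               # step 503
        app, text = self.second_area[index]
        self.overlay = f"{app} interface for: {text}"  # first area unchanged
```

The key invariant — `first_area` is never replaced by the second application — is exactly what distinguishes this method from a conventional full-screen application switch.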
The following detailed description is provided in connection with certain embodiments and various figures.
As shown in fig. 6A, a finger 301 may be detected on the touch-sensitive surface of the touch screen 115. In response to the finger 301 clicking the application icon 418 in the first screen area 308, the mobile terminal 100 launches the corresponding application. Thereafter, the graphical user interface of the mobile terminal 100 transitions to fig. 6B, displaying the interface content of that application in the first screen area 308.
In some embodiments, the mobile terminal 100 may prompt the user, explicitly or implicitly, that the application corresponding to the application icon 418 can be launched, for example by vibrating.
In some embodiments, the application icon 418 may be unresponsive to the finger 301. The mobile terminal 100 may likewise prompt the user, explicitly or implicitly, that the application icon 418 is not responding to the finger 301.
In other embodiments, when the touch screen displays the home screen interface shown in fig. 4, the mobile terminal 100, in response to the finger 301 sliding in a non-application-icon area of the first screen area 308, displays a transition from one sub-screen to another in the first screen area 308.
In the above embodiments, when the mobile terminal 100 changes the display content of the first screen area 308 in response to an operation of the finger 301 performed in the first screen area 308, the second screen area 309 may keep displaying the same content. For example, after the transition from fig. 6A to fig. 6B, the mobile terminal 100 still displays some device status icons and application icons in the second screen area 309. Even when the first screen area 308 of the mobile terminal 100 is not lit, some device status icons and application icons may still be displayed in the second screen area 309, so that the user can obtain some device-related information without fully powering up the mobile terminal 100, saving power and improving the user experience.
Of course, the mobile terminal 100 may also show or hide the display content of the second screen area 309 in response to an operation of the finger 301.
As shown in fig. 7A, the mobile terminal 100 has a show/hide icon 423 in the second screen area 309. In the shown state, when the mobile terminal 100 detects an operation of the finger 301 on the show/hide icon 423 (e.g., a single click, double click, long press, or drag), the mobile terminal 100 may transition from displaying the device status icons and application icons in the second screen area 309 as shown in fig. 7A to hiding them as shown in fig. 7B, leaving only the show/hide icon 423. In the hidden state, when the mobile terminal 100 detects an operation of the finger 301 on the show/hide icon 423, the mobile terminal 100 may transition from fig. 7B, in which only the show/hide icon 423 remains, back to displaying the device status icons and application icons in the second screen area 309 as shown in fig. 7A.
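The two-state toggle in figs. 7A and 7B can be sketched as a single function; the function name, the state dict, and the icon labels are illustrative assumptions for the sketch.

```python
def toggle_second_area(icons, state):
    """Toggle between showing all status/application icons and hiding
    everything except the show/hide icon 423, as in figs. 7A and 7B.

    `icons` is the list of currently hideable icons; `state` holds one
    boolean flag. Returns what the second screen area should display."""
    if state["visible"]:
        state["visible"] = False
        return ["show/hide icon"]          # hidden: only icon 423 remains
    state["visible"] = True
    return ["show/hide icon"] + icons      # shown: all icons restored
```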
After hiding, the mobile terminal 100 may turn off the second screen area 309 to save power and reduce the interference of its display content with the content of the first screen area 308, or may display continuous interface content across the first screen area 308 and the second screen area 309 together, improving the overall display effect.
In some embodiments, the mobile terminal 100 may also provide a pressure-sensing function in place of, or in combination with, the show/hide icon 423. When the mobile terminal 100 detects an input pressing the touch screen or the device frame whose pressure reaches a certain threshold, the mobile terminal 100 may show or hide the above-mentioned icons displayed in the second screen area 309, similar to figs. 8A and 8B.
In some embodiments, after no operation on the second screen area 309 has been detected for a period of time, the mobile terminal 100 may decrease the brightness or transparency of the second screen area 309, thereby reducing the interference of its display content with the user's viewing of the first screen area 308.
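The inactivity dimming just described amounts to a simple timeout rule. The 10-second timeout and 0.3 dim factor below are illustrative assumptions; the embodiment does not specify concrete values.

```python
def second_area_brightness(idle_seconds, base_brightness=1.0,
                           dim_after=10.0, dim_factor=0.3):
    """Return the brightness of the second screen area given the time
    since the last touch on it. Timeout and dim factor are assumed
    values, not from the patent."""
    if idle_seconds >= dim_after:
        return base_brightness * dim_factor   # dim to reduce interference
    return base_brightness
```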
As shown in fig. 8A and 8B, in some embodiments, when the mobile terminal 100 detects an operation of the finger 301 on the icon 422 of the application program of the second screen area 309 (e.g., clicking the icon 422 of the application program), the mobile terminal 100 may run the application program and display interface contents of the corresponding application in the first screen area 308.
In some embodiments, the mobile terminal 100 may display icons of certain designated applications, such as commonly used applications or applications set by the user, in the second screen area 309, and the second screen area 309 may cooperate with the first screen area 308 to quickly enter the application interface of a commonly used application, simplifying the user's operation. For example, as shown in fig. 9A, when the mobile terminal 100 detects the finger 301 clicking the icon 406 of the settings application in the first screen area 308, the settings application interface is displayed in the first screen area 308, as shown in fig. 9B. Then, when the finger 301 is detected clicking the icon 422 of the WeChat application in the second screen area 309, the mobile terminal 100 displays the application interface of the WeChat application in the first screen area 308, as shown in fig. 9C, and displays the icon 406a of the settings application in the second screen area 309. Subsequently, upon detecting a touch operation 301a in which the finger 301 clicks the icon 406a of the settings application in the second screen area 309, the mobile terminal 100 displays the application interface of the settings application in the first screen area 308 as shown in fig. 9B, while displaying the icon 422 of the WeChat application in the second screen area 309; or, upon detecting a touch operation 301b in which the finger 301 clicks the return control 425 shown in fig. 9C, the mobile terminal 100 displays the main interface in the first screen area 308 as shown in fig. 9A and displays the icon 422 of the WeChat application in the second screen area 309.
Compared with the prior art, in which the user must return from the application interface of the settings application to the home screen before re-entering the WeChat application, or must open the background task list to reach the application interface of the WeChat application, the method of this embodiment of the application provides functions for quickly entering a commonly used application and quickly returning to the application interface of the previously accessed application, simplifying the user's operation flow and improving the user experience.
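The fig. 9A-9C flow is, at its core, a swap between the foreground application and the shortcut held in the second screen area. The class below is a sketch under that reading; its names are assumptions for illustration.

```python
class QuickSwitcher:
    """Model of the fig. 9A-9C flow: the second screen area holds the
    icon of the most recently backgrounded application, so one touch
    swaps the foreground application and the shortcut icon."""

    def __init__(self, foreground, shortcut):
        self.foreground = foreground   # app shown in first screen area 308
        self.shortcut = shortcut       # icon shown in second screen area 309

    def tap_shortcut(self):
        """Touch operation 301a: bring the shortcut app to the foreground
        and park the previous foreground app as the new shortcut."""
        self.foreground, self.shortcut = self.shortcut, self.foreground
        return self.foreground
```

Because each tap swaps the pair, two taps return to the original state, which matches the back-and-forth between figs. 9B and 9C.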
Steps 501 to 503 of the method will be further described below with reference to the drawings and the foregoing embodiments.
In step 501, the mobile terminal 100 displays the application interface of the first application in the first screen area 308; for example, as shown in fig. 10A, the mobile terminal 100 displays the interface of an instant messaging application during a video call in the first screen area 308.
In some embodiments, the mobile terminal 100 may further display an icon 422 of at least one second application in the second screen area 309; which second applications are displayed, and how many, may be determined and adjusted according to settings.
Next, in step 502, when there is a notification of the second application, the mobile terminal 100 displays the content of the notification in the second screen area 309 and continues to display the application interface of the first application in the first screen area.
In some embodiments, when the mobile terminal 100 detects the notification of the second application, the mobile terminal 100 may flash the icon corresponding to the second application in the second screen area 309, display the number of messages on that icon, or the like. For social or communication applications, the mobile terminal 100 may also change the application icon to an image of the message source, i.e., the message sender, to help the user identify the other party.
The notification may be an instant message received by the mobile terminal 100 through the radio frequency circuit 102, or content generated by the second application, running in the background, upon determining that a specific condition (e.g., a specific time, place, or temperature) is satisfied, such as a timed reminder notification. The specific content of the notification may include, but is not limited to, text, images, animation, and the like, and the prompt information may also include information about the sender of the notification, such as an image of the sender (e.g., an avatar or icon), identity information of the sender (e.g., a name or number), the sending time, and so on.
In connection with fig. 10B, the mobile terminal 100 displays a notification 424 beside the icon 422 (e.g., a letter) of the second application in the second screen area 309, the notification 424 including the sender's avatar and the text content of the notification ("where…").
In some embodiments, when the mobile terminal 100 detects the notification of the second application, the mobile terminal may automatically adjust the positions of the icons of other applications in the second screen area 309 according to the content length of the notification, may display only the content of the notification across the whole second screen area 309, may scroll the content when it is long, and may flash the icon corresponding to the second application, change its color, or display the number of messages beside it, so that the user perceives the notification. In some embodiments, the mobile terminal 100 may display the notification persistently in the second screen area, so that the user can view its content at any time; when detecting that the user chooses to close it (by pressing the screen, sliding, or clicking or double-clicking the icon or message) or triggers the second application, the mobile terminal 100 may close the prompt information of the notification and return to the initial state shown in fig. 10A, achieving effective notification of the message.
In some embodiments, when the mobile terminal 100 displays the notification 424, the icons of some other applications may be hidden due to the limited size of the second screen area 309. For example, as shown in fig. 10B, the mobile terminal 100 displays the notification 424 next to the icon 422 of the second application, the icons on the right side of the notification 424 are shifted to the right as a whole, and the icons "moved out" of the second screen area 309 are no longer displayed.
In some embodiments, referring to figs. 11A and 11B, the mobile terminal 100 may, in response to a sliding operation of the finger 301 on the second screen area 309, shift the icons so that those moved out of the second screen area 309 come back into view, allowing the user to reveal hidden icons when needed.
In other embodiments, the mobile terminal 100 may fix the display size of the notification 424 and display the message content within that area, for example by scrolling it from right to left or from top to bottom.
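The layout behavior of figs. 10B and 11A/11B — the notification takes a slot, remaining icons fill the rest, overflow icons are hidden until a slide reveals them — can be sketched as below. The slot-based model and the `width` parameter are assumptions for illustration.

```python
def layout_second_area(icons, notification, width=4):
    """Lay out the second screen area when a notification arrives:
    the notification occupies one fixed slot, the remaining slots are
    filled left-to-right with icons, and icons that no longer fit are
    hidden (revealable by sliding, as in figs. 11A/11B)."""
    visible = [notification] + icons[: width - 1]
    hidden = icons[width - 1:]
    return visible, hidden
```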
Then, in step 503, when the mobile terminal 100 responds to the first operation performed by the finger 301 on the notification displayed by the second screen area 309, the application interface of the first application is continuously displayed in the first screen area 308, and the application interface 430 related to the second application is displayed on all or part of the area of the application interface of the first application.
The application interface 430 may be an application interface related to the notification, or may be an application interface that invokes other functions of the second application.
As shown in fig. 12A, when the mobile terminal 100 detects the finger 301 dragging the notification 424 from the second screen area 309 to the first screen area 308, it continues to display the interface of the running instant messaging application (the ongoing video call) and displays the application interface 430 on the application interface of the instant messaging application; the application interface 430 may include an application interface related to the specific content of the notification, as shown in fig. 12B.
In some embodiments, in conjunction with figs. 15A and 15B, the application interface 430 may further include application interfaces for other related functions of the second application. For example, when the mobile terminal 100 detects the user clicking or double-clicking the icon 422 of the second application, a contact list may be displayed in the first screen area 308; the list may be scrolled in response to a sliding input, a contact may be selected upon detecting a sliding or input operation, and the display interface 431 of that contact may then be entered.
In some embodiments, the method may further include step 504. In step 504, when the mobile terminal 100 detects an operation of the finger 301 on the interactive interface of the touch screen (for example, clicking "reply"), the application interface 430 displayed by the mobile terminal 100 transitions from fig. 12B to the input interface shown in fig. 12C. The mobile terminal 100 then acquires a second operation performed by the finger 301 in the input area of the application interface 430 to obtain a reply message, and when the finger 301 is detected clicking "send", the second application sends the reply message to the corresponding receiver (i.e., the sender described above).
The application interface 430 displayed by the mobile terminal 100 then transitions back from the input interface shown in fig. 12C to the interface shown in fig. 12B, while the mobile terminal continues to display the interface of the running instant messaging application and the application interface 430 remains displayed over it in response to the notification, without interrupting the ongoing video call.
During this time, the mobile terminal 100 still continues to run the first application, and the application interface of the first application is still displayed in real time in the first screen area 308. For example, in figs. 12A to 12C, the mobile terminal 100 still carries on the video call in real time in the first screen area 308, that is, it continues to display the dynamic images of both parties and to transmit their voices in real time. The application interface 430 may be displayed in a semi-transparent or opaque manner, where the semi-transparent manner allows the user to continue viewing the interface content of the first application.
In some embodiments, when the mobile terminal 100, while displaying the application interface related to the notification of the second application, detects a third touch operation performed on an area of the first screen area 308 outside the application interface of the second application, the mobile terminal 100 executes the instruction corresponding to the third touch operation and continues to display, in the first screen area, the application interface related to the notification of the second application.
As shown in fig. 12C, the input interface 430 of the second application may include a virtual keyboard, which may be configured as a full keyboard or a nine-grid keyboard. As shown in fig. 14B, the input interface 430 of the second application may also be split into left and right parts, leaving the middle part of the interface content of the first application clear; for example, during a game, the central visual area is important, and keeping the middle clear avoids interfering with the game.
In some embodiments, the application interface of the second application may further integrate voice input into the input method; after the mobile terminal 100 converts the user's voice input into text, the second application sends the reply message, further reducing the interference of the keyboard with the display of the first application.
In other embodiments, when the mobile terminal 100 detects the finger 301 dragging the notification 424 from the second screen area 309 to the first screen area 308, the content displayed by the mobile terminal 100 may transition from fig. 13A to the application interface 430 shown in fig. 13B; that is, the notification content displayed in the second screen area 309 is switched to the content being input, and the input interface is displayed in the first screen area 308. In this embodiment, the method processes the notification of the second application by coordinating the first screen area 308 and the second screen area 309, further simplifying the interaction process and the interaction interface of the second application displayed in the first screen area 308. This helps the user quickly process the notification of the second application, shortens the time during which the application interface of the first application is affected, lets the user return as soon as possible to, for example, watching a movie, playing a game, or making a video call, and improves the user experience.
In some embodiments, after the mobile terminal 100 has processed the notification, it continues to display the application interface of the first application in the first screen area and no longer displays the application interface of the second application; for example, after the mobile terminal 100 detects the finger 301 clicking "send", the mobile terminal 100 exits the application interface of the second application, as shown in fig. 13A.
In some embodiments, when detecting a third touch operation performed by the finger 301 on an area of the first screen area 308 outside the application interface of the second application, the mobile terminal 100 may execute the instruction corresponding to the third touch operation. For example, when the first application is a game application and a notification of the second application is being processed, if an enemy approaches in the game, the user can immediately operate on the interface of the first application, and the mobile terminal may continue to display, in the first screen area, the application interface related to the notification of the second application.
In some embodiments, the mobile terminal 100 may, according to the user's operation, continue to display the application interface of the first application in the first screen area 308 and exit the application interface of the second application. The manner of exiting may be, but is not limited to: the mobile terminal 100 detecting the finger 301 clicking the icon corresponding to the second application in the second screen area 309; the mobile terminal 100 detecting the finger 301 long-pressing the application interface of the second application; or the mobile terminal detecting the finger 301 double-clicking an area within the application interface of the first application and outside the application interface of the second application, and so on.
Therefore, when the mobile terminal 100 is displaying the interface content of the first application, such as a video call, audio-visual playback (a movie, television, animation, etc.), or a game, in the first screen area 308 and a notification of the second application arrives, the mobile terminal 100 may display the notification-related information in the second screen area 309. On one hand, compared with popping up a prompt interface in the first screen area 308, this does not occupy the first screen area 308 and does not affect its display content; on the other hand, the notification-related information can be displayed persistently in the second screen area 309, unlike a pop-up banner that is quickly withdrawn, avoiding the problem of the notification disappearing before the user can read it clearly.
When the mobile terminal 100 detects a first operation of the user on the notification, it displays the application interface 430 of the second application over the interface content of the first application while continuing to display that interface content in the first screen area 308, and the application interface 430 can process the notification in response to a second operation performed on it by the user, including viewing the content of the notification, processing it, or replying to it. Thus, the mobile terminal 100 does not need to jump from the interface content of the first application being displayed in the first screen area 308 to the interface content of the second application, so the user can still continue to watch the interface content of the first application while the application interface 430 provides a function for quickly processing the notification of the second application.
Thus, in some specific usage scenarios, the user experience is greatly improved. For example: when watching a movie, the user can process the notification without pausing the movie, exiting the player application in the first screen area, or switching to another application; during a game, the user does not need to jump out of the game application, avoiding the risk of being judged by the game server to have forfeited or fled the match, and can still process the notification; and during a video call, the user can process the notification without closing the video call interface, avoiding any impact on the call with the other party.
The mobile terminal 100 may detect the user clicking a message or an icon, or pressing the screen or frame somewhere, to enter a message input mode, in which part or all of the first screen area serves as an input area. The display of the original application, such as a game, movie, or reading application, is not blocked in the input area; for example, the message content, input keyboard, handwriting input track, input method, and so on are superimposed on the input area in a semi-transparent or opaque manner, and the input message is acquired and then sent.
In some embodiments, when the mobile terminal 100 detects, using an accelerometer, a gravity sensor, or a gyroscope, that the mobile terminal 100 has been rotated, the icons in the second screen area 309 may also rotate accordingly, for example as shown in fig. 14A, to present a display orientation with a proper viewing angle to the user; the application interface 430 displayed in the first screen area 308 is likewise presented in a proper display orientation.
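One way to realize the icon rotation of fig. 14A is to counter-rotate the icons against the measured device rotation, snapped to the nearest quarter turn. The snapping rule below is an illustrative assumption, not specified in the embodiment.

```python
def icon_rotation(device_rotation_deg):
    """Counter-rotate second-screen-area icons so they stay upright when
    the device is rotated, snapping to the nearest 90 degrees.
    Returns the rotation (in degrees) to apply to the icons."""
    snapped = round(device_rotation_deg / 90.0) * 90 % 360
    return (360 - snapped) % 360   # icons rotate opposite to the device
```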
According to another aspect of the present application, a method for displaying a graphical user interface is provided, the method comprising steps 601 to 602.
In step 601, the mobile terminal 100 starts the front camera function. As shown in fig. 16A, the mobile terminal 100 detects an operation of the finger 301 clicking the camera icon 407 and starts the camera function; it then detects an operation of the finger 301 clicking the front/rear camera switch control 425 and starts the front camera for shooting.
Then, in step 602, the mobile terminal 100 displays the image acquired by the front camera in the first screen area 308 and puts the second screen area 309 into a light-supplementing mode. The light-supplementing mode may include increasing the display brightness of all or part of the second screen area beyond a threshold to supplement light to the subject; for example, as shown in fig. 16B, the display brightness of the second screen area 309, that is, its backlight brightness, is increased above a certain threshold, even to the maximum value.
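The brightness rule of the light-supplementing mode can be sketched as follows. The 50-lux darkness cutoff and 0.8 minimum threshold are illustrative assumptions; the embodiment only requires that brightness be raised beyond some threshold.

```python
def light_supplement_brightness(ambient_lux, current, threshold=0.8):
    """Raise the second screen area's backlight to at least `threshold`
    (up to the maximum 1.0) when entering the light-supplementing mode.
    Cutoff and threshold values are assumed for illustration."""
    if ambient_lux < 50:               # dark environment: full fill light
        return 1.0
    return max(current, threshold)     # otherwise just ensure the minimum
```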
In some embodiments, the mobile terminal 100 may further switch the second screen area 309 to a single-color mode, such as white, yellow, or blue, to supplement light to the photographed object; this can also help improve the imaging color and effect for the photographed object and improve the user's photographing experience.
Therefore, in this method, the mobile terminal turns the second screen area 309 into a light-supplementing bar during shooting with the front camera to supplement light to the photographed object (for example, a human face), so that an image with better brightness can still be obtained when the ambient light is dim; for example, the mobile terminal can still take selfies and make video calls in a dark environment. When the front camera is started for shooting, the mobile terminal 100 uses the second screen area as a fill light, lighting up the discontinuous screen area, thereby providing a fill-light effect in applications such as photographing, makeup, and face recognition, and improving the quality of the acquired image in dim environments. The backlight brightness of the discontinuous screen area can be enhanced to obtain a better light-supplementing effect. Combined with the above embodiments, the display screen gains more diverse functions, meeting users' needs in various scenarios and improving the user experience.
In some embodiments, in step 602, while the second screen area 309 is in the light-supplementing mode, the mobile terminal 100 may supplement light from one or more partial areas of the second screen area 309. As shown in figs. 17A, 17B, and 17C, the front camera enters a 3D depth mode: when the finger 301 is detected clicking the shutter, the front camera continuously takes three photos, lighting areas 426a, 426b, and 426c respectively for each shot, and the processor 101 of the mobile terminal 100 then combines the three pictures, using the light-and-shadow changes together with the dual front cameras, to obtain high-precision 3D depth information. Supplementing light from different angles in this way produces a higher-precision image and further improves the imaging effect.
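The capture sequence of figs. 17A-17C pairs each shot with a different lit zone of the second screen area. The sketch below follows the zone labels in the figures; the returned data structure is an illustrative assumption.

```python
def depth_capture_sequence(light_zones=("426a", "426b", "426c")):
    """Sketch of the 3D-depth capture in figs. 17A-17C: one photo is
    taken per fill-light zone, lighting a different part of the second
    screen area each time; the shots are later combined using the
    light-and-shadow differences between them."""
    shots = []
    for zone in light_zones:
        # In a real device this would set the zone's backlight, trigger
        # the front camera, and store the frame; here we record the pairing.
        shots.append({"lit_zone": zone, "photo": f"frame_{zone}"})
    return shots
```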
When the mobile terminal implements an iris recognition function, the front camera and the iris scanning camera are started, the photographed human eyes are displayed in the first screen area 308, and infrared light is added to the visible light displayed in the second screen area 309, providing an infrared light-supplementing effect for iris recognition or face recognition and improving the iris recognition function.
In this way, the mobile terminal uses the second screen area as a soft fill light, improving the shooting results without affecting the displayed application content. The present application thus provides a solution that uses a discontinuous screen area as a fill light. Compared with the prior art, the fill-light effect can be achieved in a design with a large screen-to-body ratio without occupying extra space or adding extra openings on the device panel.
In some embodiments, as shown in fig. 18, the mobile terminal 100 may further display, in the second screen area 309, a prompt pattern 435 pointing at the position of the front camera 306, so as to improve the shooting effect.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, over a wired connection (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or a wireless connection (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device, such as a server or a data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)).
In the embodiments of the present application, the device may be divided into functional modules according to the above method examples; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware or as a software functional module. The division of modules in the embodiments of the present application is illustrative and is merely a logical division of functions; other division modes may be adopted in actual implementations.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional modules is illustrated as an example. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the mobile device may be divided into different functional modules to perform all or part of the functions described above. For the specific working processes of the mobile device and of the units in the system described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In summary, the above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the above embodiments, those of ordinary skill in the art should understand that the technical solutions described in the above embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (17)

1. A method for displaying a graphical user interface on an electronic device, the electronic device comprising a touch screen, the touch screen comprising a first screen area for displaying a main interface of the electronic device and a second screen area located on both sides of a front camera on top of the electronic device, the method comprising:
Displaying an application interface of a first application in the first screen area;
and when receiving a notification message of a second application, displaying message content in the second screen area, wherein the message content is the content of the notification message.
2. The method according to claim 1, wherein the method further comprises:
Receiving a first operation of a user on the notification message;
And displaying an interface related to the notification message on an application interface of the first application, wherein the content displayed in the interface comprises the message content.
3. The method of claim 2, wherein displaying an interface associated with the notification message on an application interface of the first application comprises:
And suspending and displaying an interface related to the notification message on a partial area of the application interface of the first application.
4. A method according to any one of claims 1-3, wherein the second screen area further displays an icon corresponding to a third application, and when the electronic device receives the notification message of the second application, the message content is displayed in the second screen area, and the display position of the icon corresponding to the third application is changed.
5. The method of any of claims 1-4, further comprising: receiving a second operation performed by the user on the notification message, wherein the second operation comprises at least one of clicking, sliding, or pressing; and stopping displaying the notification message.
6. The method of any one of claims 1-5, wherein the message content comprises at least one of text, images, numbers, and animations.
7. The method according to any one of claims 1-6, further comprising:
And displaying prompt information in the second screen area while displaying the message content in the second screen area, wherein the prompt information comprises at least one of an image of a sender of the notification message, identity information of the sender and sending time of the sender.
8. The method of any of claims 1-7, wherein the second application is an application running in the background.
9. The method of any of claims 1-8, wherein the notification message is dynamically displayed in the second screen area.
10. The method according to any one of claims 1-9, characterized in that the method further comprises:
detecting a third operation executed by a user on the notification message, and displaying a message reply interface, wherein the message reply interface comprises a virtual keyboard or a handwriting input area;
Receiving a reply message input by a user, and sending the reply message to a sender of the notification message.
11. The method of claim 10, wherein the message reply interface includes a first portion and a second portion, the first portion and the second portion being unconnected.
12. The method according to any one of claims 1-11, further comprising:
displaying a first control in the second screen area;
Detecting a preset operation of a user on the first control;
and hiding a status icon and/or a notification message on the second screen area in response to the preset operation, wherein the status icon comprises at least one of Wi-Fi signal, battery level, time and equipment status icon.
13. The method according to claim 12, wherein the method further comprises: and after hiding the status icons and/or the notification messages on the second screen area, extinguishing the second screen area.
14. The method of claim 2, wherein the second screen area further displays an icon corresponding to the second application;
receiving a fourth operation of a user on an icon corresponding to the second application or the first operation of the notification message;
And responding to the fourth operation or the first operation, displaying an application interface of the second application in the first screen area, and displaying an icon corresponding to the first application in the second screen area.
15. An electronic device, wherein the electronic device comprises:
a camera;
A touch screen;
One or more processors;
A memory;
And one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions, which when executed by the electronic device, cause the electronic device to perform the method of any of claims 1-14.
16. A computer program product, wherein the computer program product, when run on a computer, causes the computer to perform the method of any of claims 1-14.
17. A computer readable storage medium comprising instructions, wherein the instructions, when run on a computer, cause the computer to perform the method of any of claims 1-14.
CN202410029388.1A 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal Pending CN118042036A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410029388.1A CN118042036A (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN202410029388.1A CN118042036A (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal
PCT/CN2017/091283 WO2019000437A1 (en) 2017-06-30 2017-06-30 Method of displaying graphic user interface and mobile terminal
CN201780091258.9A CN110663016B (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201780091258.9A Division CN110663016B (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal

Publications (1)

Publication Number Publication Date
CN118042036A true CN118042036A (en) 2024-05-14

Family

ID=64742786

Family Applications (4)

Application Number Title Priority Date Filing Date
CN202410023427.7A Pending CN118042035A (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal
CN202211291655.XA Active CN116055610B (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal
CN201780091258.9A Active CN110663016B (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal
CN202410029388.1A Pending CN118042036A (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal

Family Applications Before (3)

Application Number Title Priority Date Filing Date
CN202410023427.7A Pending CN118042035A (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal
CN202211291655.XA Active CN116055610B (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal
CN201780091258.9A Active CN110663016B (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal

Country Status (2)

Country Link
CN (4) CN118042035A (en)
WO (1) WO2019000437A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109886000B (en) * 2019-02-01 2024-03-01 维沃移动通信有限公司 Image encryption method and mobile terminal
CN110196667B (en) * 2019-05-30 2021-02-09 维沃移动通信有限公司 Notification message processing method and terminal
CN114816209A (en) * 2019-06-25 2022-07-29 华为技术有限公司 Full screen display method and device of mobile terminal
CN112583957A (en) * 2019-09-30 2021-03-30 华为技术有限公司 Display method of electronic device, electronic device and computer-readable storage medium
CN112714236B (en) * 2019-10-24 2024-05-10 中兴通讯股份有限公司 Terminal, shooting method, storage medium and electronic device
CN112162807A (en) * 2020-09-24 2021-01-01 维沃移动通信有限公司 Function execution method and device
CN112291412B (en) * 2020-10-29 2022-04-22 维沃移动通信(杭州)有限公司 Application program control method and device and electronic equipment
CN112380014A (en) * 2020-11-17 2021-02-19 莫雪华 System resource allocation system and method based on big data
CN114691256A (en) * 2020-12-30 2022-07-01 星络智能科技有限公司 Interface configuration method, intelligent panel and computer readable storage medium
CN115695599A (en) * 2021-07-28 2023-02-03 华为技术有限公司 Method for prompting camera state and electronic equipment
CN116088716B (en) * 2022-06-13 2023-12-08 荣耀终端有限公司 Window management method and terminal equipment
CN115150550A (en) * 2022-06-20 2022-10-04 湖北星纪时代科技有限公司 Photographing processing method and device for terminal, electronic equipment and storage medium

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9619143B2 (en) * 2008-01-06 2017-04-11 Apple Inc. Device, method, and graphical user interface for viewing application launch icons
US8723823B2 (en) * 2011-02-07 2014-05-13 Qualcomm Incorporated System and method for providing notifications on a mobile computing device
KR20120126161A (en) * 2011-05-11 2012-11-21 삼성전자주식회사 Mobile terminal and method for controlling screen using the same
KR102129795B1 (en) * 2013-05-15 2020-07-03 엘지전자 주식회사 Mobile terminal and method for controlling thereof
KR20160097393A (en) * 2013-06-07 2016-08-18 삼성전자주식회사 Method for invoking program and electronic device therof
CN103500079A (en) * 2013-09-17 2014-01-08 小米科技有限责任公司 Notification message display method and device and electronic equipment
CN104750381A (en) * 2013-12-31 2015-07-01 中兴通讯股份有限公司 Method, device and terminal for operating application items quickly
US9887949B2 (en) * 2014-05-31 2018-02-06 Apple Inc. Displaying interactive notifications on touch sensitive devices
CN104267950A (en) * 2014-09-25 2015-01-07 北京金山安全软件有限公司 Setting method and device of terminal application program and mobile terminal
CN106302095B (en) * 2015-06-04 2021-03-02 深圳市腾讯计算机系统有限公司 Message display control method, device and terminal
CN105162959A (en) * 2015-08-04 2015-12-16 广东欧珀移动通信有限公司 Message notification processing method and device
CN105511719A (en) * 2015-11-27 2016-04-20 努比亚技术有限公司 Notification information display method and device
CN105335056A (en) * 2015-12-10 2016-02-17 魅族科技(中国)有限公司 Application control method and device and terminal
CN105955573A (en) * 2016-04-27 2016-09-21 上海斐讯数据通信技术有限公司 Mobile terminal application switching method and system
CN106231187A (en) * 2016-07-29 2016-12-14 维沃移动通信有限公司 A kind of method shooting image and mobile terminal
CN106547446A (en) * 2016-10-31 2017-03-29 努比亚技术有限公司 Using switching device and method
CN106843654B (en) * 2017-01-24 2019-01-29 维沃移动通信有限公司 A kind of method and mobile terminal of terminal multi-job operation
CN106657485B (en) * 2017-03-07 2020-05-12 Oppo广东移动通信有限公司 Mobile terminal

Also Published As

Publication number Publication date
WO2019000437A1 (en) 2019-01-03
CN116055610A (en) 2023-05-02
CN118042035A (en) 2024-05-14
CN116055610B (en) 2023-12-08
CN110663016B (en) 2024-01-16
CN110663016A (en) 2020-01-07

Similar Documents

Publication Publication Date Title
CN116055610B (en) Method for displaying graphical user interface and mobile terminal
US10867059B2 (en) Device, method, and graphical user interface for accessing an application in a locked device
US10209877B2 (en) Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor
CN114764298B (en) Cross-device object dragging method and device
CN108701001B (en) Method for displaying graphical user interface and electronic equipment
CN105955607B (en) Content sharing method and device
CN108776568B (en) Webpage display method, device, terminal and storage medium
KR101749235B1 (en) Device, method, and graphical user interface for managing concurrently open software applications
JP5658765B2 (en) Apparatus and method having multiple application display modes, including a mode with display resolution of another apparatus
KR101170877B1 (en) Portable electronic device for photo management
US20130326415A1 (en) Mobile terminal and control method thereof
WO2019024700A1 (en) Emoji display method and device, and computer readable storage medium
CN112214138B (en) Method for displaying graphical user interface based on gestures and electronic equipment
US20210165670A1 (en) Method, apparatus for adding shortcut plug-in, and intelligent device
EP4125274A1 (en) Method and apparatus for playing videos
WO2018133200A1 (en) Icon arrangement method and terminal
CN113220203B (en) Activity entry display method, device, terminal and storage medium
WO2023072233A1 (en) Page switching method, page switching apparatus, electronic device and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination