CN110663016A - Method for displaying graphical user interface and mobile terminal


Info

Publication number
CN110663016A
Authority
CN
China
Prior art keywords
application
mobile terminal
screen area
screen
notification
Prior art date
Legal status
Granted
Application number
CN201780091258.9A
Other languages
Chinese (zh)
Other versions
CN110663016B (en)
Inventor
严斌
薛康乐
马冬
尹帮实
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202211291655.XA (CN116055610B)
Publication of CN110663016A
Application granted
Publication of CN110663016B
Legal status: Active
Anticipated expiration

Classifications

    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H04M1/72484: User interfaces specially adapted for cordless or mobile telephones wherein functions are triggered by incoming communication events
    • H04M1/02: Constructional features of telephone sets

Abstract

The application provides a method for displaying a graphical user interface and a mobile terminal comprising a touch screen and a camera. The touch screen of the mobile terminal comprises a first screen area and a second screen area, where the first screen area is a rectangular area and the second screen area is an area extending from the first screen area to the periphery of the camera. The mobile terminal displays an application interface of a first application in the first screen area; when a second application has a notification, the mobile terminal displays the notification in the second screen area; and when a first touch operation performed on the notification is detected, the mobile terminal continues to display the application interface of the first application in the first screen area and displays the application interface of the second application related to the notification on the application interface of the first application. The method and the device enrich the interaction functions of the mobile terminal, simplify the user's operation flow, and improve the user experience.

Description

Method for displaying graphical user interface and mobile terminal
Technical Field
The present application relates to the field of human-computer interaction, and in particular, to a method for displaying a graphical user interface and a mobile terminal.
Background
As terminal devices such as smartphones continue to evolve, the functions a terminal can provide keep expanding, and consumers' expectations of terminal devices, particularly of the display, keep rising. From feature phones with numeric keypads to smartphones with resistive and then capacitive screens, display sizes have grown to 5.5 inches and beyond, and large-screen phones have become increasingly popular with consumers. As competition among smart terminals intensifies, concepts such as an ultra-high screen-to-body ratio and even a full-screen phone have emerged, and the industry is working to increase the screen-to-body ratio of the phone while keeping its size portable. With the variety of display structures and sizes created to increase the screen-to-body ratio, how to provide more diverse interactive functions on the display has become key to improving the user experience.
Disclosure of Invention
In order to solve the above technical problem, embodiments of the present application provide methods for displaying a graphical user interface and mobile terminals, which aim to enrich the interaction functions of the mobile terminal, simplify the user's operations, and improve the user experience.
In a first aspect, an embodiment of the present application provides a method for displaying a graphical user interface on a mobile terminal, where the mobile terminal includes a touch screen and a camera, the touch screen includes a first screen area and a second screen area, the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera. The method includes the following steps: first, the mobile terminal displays an application interface of a first application in the first screen area; then, when a second application has a notification, the mobile terminal displays the notification in the second screen area; then, when a first touch operation performed on the notification is detected, the mobile terminal continues to display the application interface of the first application in the first screen area and displays the application interface of the second application related to the notification on the application interface of the first application. The method makes full use of the display area of the mobile terminal: while the first screen area displays the interface content of the first application, such as a video call, video or audio playback (a movie, a TV show, an animation, etc.), or a game, and the second application has a notification, the mobile terminal can display the information related to the notification in the second screen area. On one hand, compared with popping up a prompt window in the first screen area, this does not occupy the first screen area and does not affect its display content; on the other hand, the content of the notification can be displayed persistently in the second screen area, which avoids the notification disappearing before the user has read it clearly, and provides a convenient way for the user to quickly handle the notification of the second application while continuing to view the interface content of the first application.
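By way of illustration only, the following Kotlin sketch shows one way an Android-style implementation might derive the second screen area from the display cutout around a front camera and show a notification there while the first application's interface stays in the rectangular first screen area. The layout, the view id (second_screen_strip), and the way the notification text arrives are assumptions for this sketch, not part of this application.

```kotlin
import android.app.Activity
import android.os.Build
import android.os.Bundle
import android.view.View
import android.widget.TextView

class CutoutNotificationActivity : Activity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)  // hypothetical layout: first screen area plus a strip at the top

        val strip = findViewById<TextView>(R.id.second_screen_strip)  // hypothetical view for the second screen area

        // Derive the height of the "second screen area" from the display cutout around the camera (API 28+).
        window.decorView.setOnApplyWindowInsetsListener { v, insets ->
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
                insets.displayCutout?.let { cutout ->
                    strip.layoutParams = strip.layoutParams.apply { height = cutout.safeInsetTop }
                }
            }
            v.onApplyWindowInsets(insets)
        }
    }

    // Called when the second application has a notification (delivery mechanism omitted);
    // the first application's interface in the first screen area is left untouched.
    fun showNotificationInSecondScreenArea(text: CharSequence) {
        findViewById<TextView>(R.id.second_screen_strip).apply {
            this.text = text
            visibility = View.VISIBLE
        }
    }
}
```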
In some possible implementations, the mobile terminal displays the application interface of the second application on a partial area of the application interface of the first application.
In some possible implementation manners, when displaying the application interface of the second application on a partial area of the application interface of the first application, the mobile terminal displays it in a transparent or semi-transparent manner, so as to improve the display effect of the mobile terminal on the touch screen.
In some possible implementations, when the mobile terminal displays the application interface of the second application related to the notification, and detects a second touch operation performed on the application interface of the second application, the mobile terminal processes the notification, so that the user can continue to view the interface content of the first application and quickly process the notification of the second application, including viewing, replying, and the like.
In some possible implementations, after the mobile terminal processes the notification, the mobile terminal continues to display the application interface of the first application in the first screen area and no longer displays the application interface of the second application. That is, after processing the notification, the mobile terminal closes the application interface of the second application displayed on the first screen area; the flow is simple and smooth and suits the user's habits.
In some possible implementations, when the mobile terminal displays the application interface of the second application related to the notification, and a third touch operation performed on the first screen area in an area outside the application interface of the second application is detected, the mobile terminal executes, in the second application, an instruction corresponding to the third touch operation, and continues to display the application interface of the second application related to the notification in the first screen area. In other words, when the application interface of the second application and the application interface of the first application are both displayed on the first screen, whether to execute an instruction related to the first application or an instruction related to the second application can be selected according to the position at which the touch operation in the first screen area is detected.
In some possible implementations, when the mobile terminal displays the application interface of the second application related to the notification and detects a fourth touch operation, the mobile terminal continues to display the application interface of the first application in the first screen area and does not display the application interface of the second application. That is, the mobile terminal may exit the application interface of the second application displayed in the first screen area in response to the fourth touch operation. The fourth touch operation may be determined according to a preset, and detecting the fourth touch operation by the mobile terminal includes any one of the following: the mobile terminal detects a touch operation on the notification corresponding to the second application in the second screen area; the mobile terminal detects a touch operation on the application interface of the second application; or the mobile terminal detects a touch operation performed on an area within the application interface of the first application and outside the application interface of the second application.
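As a hedged sketch of the touch routing described above, the helper below hit-tests a touch in the first screen area against the bounds of the overlaid second-application interface and, for a touch outside it, applies one of the listed options (dismissing the overlay). The overlay view reference and the onDismissOverlay callback are hypothetical.

```kotlin
import android.graphics.Rect
import android.view.MotionEvent
import android.view.View

// Routes touches in the first screen area: touches inside the overlay go to the
// second application's interface; touches outside it are treated here as the
// "fourth touch operation" that dismisses the overlay (one of the options above).
class OverlayTouchRouter(
    private val overlayView: View,            // interface of the second application
    private val onDismissOverlay: () -> Unit  // hypothetical callback
) {
    fun onTouch(event: MotionEvent): Boolean {
        if (event.actionMasked != MotionEvent.ACTION_UP) return false

        val bounds = Rect()
        overlayView.getGlobalVisibleRect(bounds)

        return if (bounds.contains(event.rawX.toInt(), event.rawY.toInt())) {
            false                              // let the overlay (second application) handle it
        } else {
            onDismissOverlay()                 // keep showing only the first application
            true
        }
    }
}
```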
In some possible implementations, the mobile terminal displays at least one icon of the second application in the second screen area.
In some possible implementations, the method further includes: when a first touch operation performed on the icon of the second application is detected, displaying an application interface of the second application in the first screen area, and displaying the icon of the first application in the second screen area.
In some possible implementations, with reference to the embodiment described in the previous paragraph, the method further includes: when the mobile terminal no longer displays the application interface of the second application in the first screen area, displaying the icon of the second application in the second screen area.
The second application may be a system-level application or an application provided by a third-party service provider. In some embodiments, in response to detecting a touch operation performed by the user on an icon of the second application displayed in the second screen area, the mobile terminal may quickly enter the second application. This provides a function of quickly entering a frequently used application and quickly returning to the application interface of the previously accessed application, simplifies the user's operation flow, and improves the user experience.
In some possible implementations, the mobile terminal prompts the notification by displaying content of the notification in the second screen area.
In some possible implementations, the mobile terminal prompts the notification by displaying sender information of the notification in the second screen area, where the sender information includes any one or more of a sender image, sender identity information, and the sending time.
In some possible implementation manners, the mobile terminal prompts the notification by changing a display manner of an icon of the second application corresponding to the notification in the second screen area.
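One illustrative, non-prescribed way to change the display manner of the second application's icon when a notification arrives is to tint the icon view in the second screen area and pulse its transparency; the icon view passed in is hypothetical.

```kotlin
import android.graphics.Color
import android.widget.ImageView

// Prompt a notification by changing how the second application's icon is displayed
// in the second screen area, e.g. tinting it red and briefly pulsing its alpha.
fun promptNotificationOnIcon(icon: ImageView) {
    icon.setColorFilter(Color.RED)  // simple visual change: red tint
    icon.animate()
        .alpha(0.4f)
        .setDuration(400)
        .withEndAction { icon.animate().alpha(1f).setDuration(400).start() }
        .start()
}
```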
In some possible implementations, the method further includes: the mobile terminal displays a control in the second screen area; when a third touch operation performed on the control is detected, the mobile terminal hides the icon of the at least one second application and/or the notification of the second application. Based on the touch operation, the mobile terminal may selectively turn off the second screen area to save power consumption or to reduce interference of the content displayed in the second screen area with the content displayed in the first screen area; or, based on the touch operation, the mobile terminal may use the second screen area and the first screen area together to display one continuous piece of interface content, so that the unified display improves the display effect and provides diversified display modes and functions.
In some possible implementations, when the mobile terminal displays the notification in the second screen area, the content of the notification is displayed in the second screen area in a scrolling manner, so as to adapt to the size of the second screen area and completely display the content of the notification.
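A minimal sketch of scrolling the notification content within the narrow second screen area, using a standard marquee TextView; the strip view is assumed to be the view occupying the second screen area and is not mandated by this application.

```kotlin
import android.text.TextUtils
import android.widget.TextView

// Display the notification content in a scrolling (marquee) manner so that it fits
// the limited width of the second screen area while remaining fully readable.
fun showScrollingNotification(strip: TextView, content: CharSequence) {
    strip.text = content
    strip.setSingleLine(true)
    strip.ellipsize = TextUtils.TruncateAt.MARQUEE
    strip.marqueeRepeatLimit = -1  // scroll indefinitely
    strip.isSelected = true        // marquee only runs on a selected view
}
```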
In a second aspect, the present application provides a method for displaying a graphical user interface on a mobile terminal, where the mobile terminal includes a touch screen and a camera, the touch screen includes a first screen area and a second screen area, the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera. The method performed on the mobile terminal includes the following steps: first, the mobile terminal displays an application interface of a first application in the first screen area and displays an icon of a second application in the second screen area; then, when a first touch operation performed on the icon of the second application is detected, the mobile terminal displays an application interface of the second application in the first screen area and displays the icon of the first application in the second screen area. The mobile terminal displays at least one icon of a second application, such as a frequently used application, in the second screen area; when a touch operation performed on the icon of the second application is detected, the mobile terminal quickly enters the second application and switches the first screen area to the application interface of the second application. This saves the touch operations otherwise needed to first exit the application interface of the first application, provides a function of quickly entering frequently used applications, simplifies the interaction flow, and improves the user experience.
In some possible implementations, after the mobile terminal displays the application interface of the second application in the first screen area, when the mobile terminal no longer displays the application interface of the second application in the first screen area, the mobile terminal displays the icon of the second application in the second screen area. In addition, in response to a detected touch operation on the icon of the first application displayed in the second screen area, the mobile terminal switches the first screen area back to the application interface of the first application, thereby providing a function of quickly returning to the previously accessed application with a simple, convenient, and quick interaction.
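As an illustration of the quick-entry and quick-return behavior above, the sketch below binds a tap on the second application's icon in the second screen area to launching that application and swapping in the icon of the application being left. The package names and the strip icon view are hypothetical; the launch-intent lookup is the standard framework call.

```kotlin
import android.content.Context
import android.content.Intent
import android.widget.ImageView

// Switch to the second application when its icon in the second screen area is tapped,
// and replace that icon with the icon of the application being left, so the user
// can return just as quickly.
fun bindQuickSwitchIcon(
    context: Context,
    iconView: ImageView,
    secondAppPackage: String,  // e.g. "com.example.messenger" (hypothetical)
    firstAppPackage: String
) {
    iconView.setOnClickListener {
        val launch = context.packageManager.getLaunchIntentForPackage(secondAppPackage)
        if (launch != null) {
            launch.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
            context.startActivity(launch)  // show the second application in the first screen area
            iconView.setImageDrawable(
                context.packageManager.getApplicationIcon(firstAppPackage)  // leave the first app's icon behind
            )
        }
    }
}
```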
In a third aspect, the present application provides a method for displaying on a mobile terminal, where the mobile terminal includes a touch screen and a camera, the touch screen includes a first screen area and a second screen area, the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera. The method executed on the mobile terminal includes: first, the mobile terminal starts the camera to shoot; then, the mobile terminal displays the image acquired by the camera in the first screen area and raises the display brightness of all or part of the second screen area above a threshold so as to supplement light for the photographed object. According to the method, during shooting with the front camera the mobile terminal can use the second screen area as a fill light for the photographed object (for example, a human face) without an additional fill-light device, so that the user can still obtain an image with good brightness when the ambient light is dim. The effect of a fill light is thus provided in applications such as photographing, makeup, and face recognition, improving the quality of the acquired image in a dark environment. Therefore, the mobile terminal provides more diverse interactive functions, better meets the user's needs in shooting scenarios, and improves the user experience.
In some possible embodiments, the mobile terminal displays a preset color in all or at least one local area of the second screen area to supplement light for the photographed object. In some embodiments, the mobile terminal adjusts the second screen area to a monochrome mode, such as white, yellow, or blue, so as to supplement light for the photographed object; this also helps improve the imaging color of the photographed object, improves the imaging effect, and enhances the user's shooting experience.
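For the fill-light behavior, a minimal sketch under the assumption of a standard Android window: raise the window brightness to its maximum and paint the second screen area (here a hypothetical strip view) in a preset color such as white.

```kotlin
import android.app.Activity
import android.graphics.Color
import android.view.View
import android.view.WindowManager

// Use the second screen area as a fill light while the front camera captures:
// push the window brightness to maximum and fill the strip with a preset color.
fun enableFillLight(activity: Activity, secondScreenStrip: View) {
    val attrs = activity.window.attributes
    attrs.screenBrightness = WindowManager.LayoutParams.BRIGHTNESS_OVERRIDE_FULL  // 1.0f
    activity.window.attributes = attrs

    secondScreenStrip.setBackgroundColor(Color.WHITE)  // preset color; yellow or blue would also work
    secondScreenStrip.visibility = View.VISIBLE
}
```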
In some possible embodiments, the method further comprises: and the mobile terminal also displays a prompt pattern pointing to the position of the camera in a second screen area.
In a fourth aspect, the present application provides a mobile terminal, wherein the mobile terminal includes: a camera; the touch screen comprises a first screen area and a second screen area, wherein the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera; one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: firstly, displaying an application interface of a first application in the first screen area; when the second application has a notification, displaying the notification in the second screen area; then, when a first touch operation performed on the notification is detected, the application interface of the first application is continuously displayed in the first screen area, and the application interface of the second application related to the notification is displayed on the application interface of the first application.
In some possible embodiments, in the step of displaying the application interface of the second application related to the notification, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: displaying the application interface of the second application on a partial area of the application interface of the first application.
In some possible embodiments, in the step of displaying the application interface of the second application related to the notification, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: displaying the application interface of the second application on the application interface of the first application in a transparent or semi-transparent mode.
In some possible embodiments, when displaying the application interface of the second application related to the notification, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: and processing the notification when a second touch operation executed on the application interface of the second application is detected.
In some possible embodiments, after processing the notification, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of: and continuously displaying the application interface of the first application in the first screen area, and not displaying the application interface of the second application.
In some possible embodiments, when the mobile terminal displays the application interface of the second application related to the notification, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: when a third touch operation performed on an area outside the application interface of the second application is detected, executing, in the second application, an instruction corresponding to the third touch operation, and continuing to display the application interface of the second application related to the notification in the first screen area.
In some possible embodiments, when displaying the application interface of the second application related to the notification, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: and when a fourth touch operation is detected, continuously displaying the application interface of the first application in the first screen area, and not displaying the application interface of the second application.
Wherein, in the step of detecting the fourth touch operation, the instruction, when executed by the mobile terminal, causes the mobile terminal to execute any one of the following steps: detecting a touch operation on a notification corresponding to the second application in the second screen area; detecting a touch operation on an application interface of the second application; a touch operation performed on an area in an application interface of the first application and outside an application interface of the second application is detected.
In some possible embodiments, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of: and displaying at least one icon of the second application in the second screen area.
In some possible embodiments, in the step of prompting the notification in the second screen area, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform at least any one of the following steps: displaying the content of the notification in the second screen area; displaying sender information of the notification in the second screen area, wherein the sender information comprises any one or more of a sender image, sender identity information, and the sending time; and changing the display manner of the icon of the second application corresponding to the notification in the second screen area.
In some possible embodiments, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of: displaying a control in the second screen area; when a third touch operation performed on the control is detected, hiding the at least one icon of the second application and/or the notification of the second application.
In some possible embodiments, in displaying the notification in the second screen region, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: and displaying the content of the notification in a scrolling mode in the second screen area.
In a fifth aspect, the present application provides a mobile terminal, wherein the mobile terminal includes: a camera; the touch screen comprises a first screen area and a second screen area, wherein the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera; one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: firstly, displaying an application interface of a first application in the first screen area, and displaying an icon of a second application in the second screen area; and then when a first touch operation performed on the icon of the second application is detected, displaying an application interface of the second application in the first screen area, and displaying the icon of the first application in the second screen area.
In some possible embodiments, after the mobile terminal displays the application interface of the second application in the first screen area, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of: when the mobile terminal no longer displays the application interface of the second application in the first screen area, displaying the icon of the second application in the second screen area.
In a sixth aspect, the present application provides a mobile terminal, wherein the mobile terminal comprises: a camera; a touch screen, where the touch screen comprises a first screen area and a second screen area, the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera; one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the mobile terminal, cause the mobile terminal to perform the steps of: starting the camera to shoot; and then displaying the image acquired by the camera in the first screen area, and raising the display brightness of all or part of the second screen area above a threshold so as to supplement light for the photographed object.
In some possible embodiments, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of: and displaying a preset color in all or at least one local area of the second screen area to supplement light for the shot object.
In some possible embodiments, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of: and displaying a prompt pattern pointing to the position of the camera in the second screen area.
In a seventh aspect, the present application provides a Graphical User Interface (GUI) stored in an electronic device, the electronic device comprising a touch screen, a camera, a memory, and one or more processors, the touch screen comprising a first screen area and a second screen area, the first screen area being a rectangular area, the second screen area being an area of the touch screen extending from the first screen area to the periphery of the camera, the one or more processors being configured to execute one or more computer programs stored in the memory, the GUI comprising: a first GUI displayed in the first screen area, the first GUI comprising an application interface of a first application; a second GUI displayed in the second screen area in response to detecting a notification of a second application, the second GUI including the notification; and, in response to detecting a first touch operation performed on the notification, the first GUI continuing to be displayed in the first screen area and a third GUI being displayed on the first GUI, the third GUI including an application interface of the second application related to the notification.
In some possible embodiments, the first GUI further comprises: at least one icon of the second application.
In an eighth aspect, the present application provides a computer program product, which, when run on a mobile terminal, causes the mobile terminal to perform the method according to any one of the first to third aspects and embodiments thereof.
In a ninth aspect, the present application provides a computer readable storage medium comprising instructions, wherein the instructions, when executed on a mobile terminal, cause the mobile terminal to perform the method of any of the first to third aspects and embodiments thereof.
In a tenth aspect, the present application provides a mobile terminal having a touch screen, the mobile terminal including a display unit, a detection unit, and a control unit. The display unit is configured to display an application interface of a first application in a first screen area of the touch screen; the detection unit is configured to detect whether a second application has a notification and whether a touch operation is performed on the touch screen; and the control unit is configured to, when the detection unit detects that the second application has a notification, cause the display unit to display the notification in a second screen area of the touch screen, and, when the detection unit detects a first touch operation performed on the notification, cause the display unit to continue to display the application interface of the first application in the first screen area and display the application interface of the second application related to the notification on the application interface of the first application. The mobile terminal further comprises a camera, the touch screen comprises the first screen area and the second screen area, the first screen area is a rectangular area, and the second screen area is an area extending from the first screen area to the periphery of the camera.
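Purely as a structural illustration, the display unit / detection unit / control unit decomposition described above could be modeled as plain Kotlin interfaces wired together by a control class; the names mirror the claim language and are not a prescribed API.

```kotlin
// Hypothetical decomposition mirroring the display / detection / control units above.
interface DisplayUnit {
    fun showAppInterfaceInFirstArea(appId: String)
    fun showNotificationInSecondArea(text: CharSequence)
    fun showOverlayForApp(appId: String)
}

interface DetectionUnit {
    fun onNotification(handler: (appId: String, text: CharSequence) -> Unit)
    fun onTouchOnNotification(handler: () -> Unit)
}

class ControlUnit(
    private val display: DisplayUnit,
    private val detection: DetectionUnit
) {
    fun start(firstAppId: String) {
        display.showAppInterfaceInFirstArea(firstAppId)
        detection.onNotification { appId, text ->
            display.showNotificationInSecondArea(text)
            detection.onTouchOnNotification {
                // Keep the first application in the first screen area and overlay
                // the notifying application's interface on top of it.
                display.showOverlayForApp(appId)
            }
        }
    }
}
```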
In some possible embodiments, the display unit is further configured to display at least one icon of the second application in the second screen area.
In some possible implementation manners, the control unit is further configured to, when the detection unit detects a first touch operation performed on an icon of the second application, cause the display unit to display an application interface of the second application in the first screen area, and display the icon of the first application in the second screen area.
In some possible implementation manners, with reference to the foregoing embodiment, the display unit is further configured to display the icon of the second application in the second screen area when the application interface of the second application is no longer displayed in the first screen area.
The second application may be a system-level application or an application provided by a third-party service provider.
In some possible embodiments, the display unit is configured to display a control in the second screen region; the control unit is further configured to cause the display unit to hide the at least one icon of the second application and/or the notification of the second application when the detection unit detects a third touch operation performed on the control.
In some possible implementations, the display unit is configured to display the application interface of the second application on a partial area of the application interface of the first application.
In some possible implementations, the control unit is further configured to process the notification when the detection unit detects a second touch operation performed on an application interface of the second application.
In some possible implementations, after the control unit finishes processing the notification, the control unit is further configured to cause the display unit to continue to display the application interface of the first application in the first screen area and not display the application interface of the second application.
In some possible implementations, when the display unit displays the application interface of the second application related to the notification, the control unit is configured to, when the detection unit detects a third touch operation performed on an area in the first screen area and outside the application interface of the second application, execute, in the second application, an instruction corresponding to the third touch operation, and cause the display unit to continue to display the application interface of the second application related to the notification in the first screen area.
In some possible implementations, when the display unit displays the application interface of the second application related to the notification, the control unit is configured to, when the detection unit detects a fourth touch operation, cause the display unit to continue displaying the application interface of the first application in the first screen area and not display the application interface of the second application. Further, the control unit exits the application interface of the second application displayed in the first screen area according to the fourth touch operation. The fourth touch operation may be determined according to a preset, and detecting the fourth touch operation by the detection unit includes any one of the following: the detection unit detects a touch operation on the notification corresponding to the second application in the second screen area; the detection unit detects a touch operation on the application interface of the second application; or the detection unit detects a touch operation performed on an area within the application interface of the first application and outside the application interface of the second application.
In some possible implementations, the display unit prompts the notification by displaying content of the notification in the second screen region.
In some possible implementations, the display unit prompts the notification by displaying sender information of the notification in the second screen area, where the sender information includes any one or more of a sender image, sender identity information, and the sending time.
In some possible implementations, the display unit prompts the notification by changing a display manner of an icon of the second application corresponding to the notification in the second screen region.
In some possible implementations, the display unit is further configured to display a control in the second screen region; the control unit is further configured to cause the display unit to hide the at least one icon of the second application and/or the notification of the second application when the detection unit detects a third touch operation performed on the control.
In some possible implementations, the display unit displays the content of the notification in a scroll display manner in the second screen region while the notification is displayed in the second screen region.
In an eleventh aspect, the present application provides a mobile terminal having a touch screen, the mobile terminal including a display unit, a detection unit, and a control unit. The display unit is configured to display an application interface of a first application in a first screen area of the touch screen and display an icon of a second application in a second screen area of the touch screen; the detection unit is configured to detect whether a touch operation is performed on the touch screen; and the control unit is configured to, when the detection unit detects a first touch operation performed on the icon of the second application, cause the display unit to display the application interface of the second application in the first screen area and display the icon of the first application in the second screen area.
In some possible implementations, the display unit is further configured to display the icon of the second application in the second screen area when, after the application interface of the second application has been displayed in the first screen area, the application interface of the second application is no longer displayed in the first screen area.
In a twelfth aspect, the present application provides a mobile terminal having a touch screen, the mobile terminal including a shooting unit, a display unit, and a control unit. The shooting unit is configured to shoot images; the display unit is configured to display the image acquired by the camera in a first screen area of the touch screen; and the control unit is configured to control the display unit to raise the display brightness of all or part of a second screen area of the touch screen above a threshold so as to supplement light for the photographed object.
In some possible embodiments, the display unit is further configured to display a preset color in all or at least one local area of the second screen area to fill light into the object to be photographed.
In some possible embodiments, the display unit is further configured to further display a prompt pattern pointing to the camera position in the second screen area.
Drawings
FIG. 1 is a schematic hardware configuration diagram of a mobile terminal provided according to some embodiments of the present application;
FIG. 2 is a schematic diagram of a portion of the hardware and software of a mobile terminal provided according to some embodiments of the present application;
FIGS. 3A and 3B are schematic diagrams provided according to some embodiments of the present application;
FIG. 4 is a graphical user interface schematic diagram of a mobile terminal provided according to some embodiments of the present application;
FIG. 5 is a flow chart of a GUI display process in a mobile terminal according to some embodiments of the present application;
FIGS. 6A and 6B are schematic diagrams of GUI display processes in a mobile terminal provided according to some embodiments of the present application;
FIGS. 7A and 7B are schematic diagrams of GUI display processes in a mobile terminal according to some embodiments of the present application;
FIGS. 8A and 8B are schematic diagrams of GUI display processes in a mobile terminal according to some embodiments of the present application;
FIGS. 9A, 9B and 9C are schematic diagrams of GUI display processes in mobile terminals according to some embodiments of the present application;
FIGS. 10A and 10B are schematic diagrams of GUI display processes in a mobile terminal according to some embodiments of the present application;
FIGS. 11A and 11B are schematic diagrams of GUI display processes in a mobile terminal provided according to some embodiments of the present application;
FIGS. 12A, 12B and 12C are schematic diagrams of GUI display processes in mobile terminals according to some embodiments of the present application;
FIGS. 13A and 13B are schematic diagrams of GUI display processes in a mobile terminal provided according to some embodiments of the present application;
FIGS. 14A and 14B are schematic diagrams of a GUI displayed in a mobile terminal provided according to some embodiments of the present application;
FIGS. 15A and 15B are schematic diagrams of GUI display processes in a mobile terminal provided according to some embodiments of the present application;
FIGS. 16A and 16B are schematic diagrams of GUI display processes in a mobile terminal according to some embodiments of the present application;
FIGS. 17A, 17B and 17C are schematic diagrams of GUI display processes in mobile terminals according to some embodiments of the present application;
FIG. 18 is a schematic diagram of a GUI display process in a mobile terminal according to some embodiments of the present application.
Detailed Description
The present application is further described below with reference to the drawings and examples.
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, it will be apparent to one skilled in the art that the present application may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact may be named a second contact, and similarly, a second contact may be named a first contact, without departing from the scope of the present application. The first contact and the second contact are both contacts, but they are not the same contact unless the context clearly indicates otherwise.
The terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term "if" may be interpreted to mean "when", "after", "in response to a determination", or "in response to a detection", depending on the context. Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" may be interpreted to mean "upon determining" or "in response to determining", depending on the context.
Embodiments of a mobile terminal, a user interface for such a mobile terminal, and associated processes for using such a mobile terminal are described below. In some embodiments, the mobile terminal is a portable mobile terminal, such as a cell phone or a tablet computer, that also contains other functions, such as personal digital assistant and/or music player functions. Exemplary embodiments of the portable mobile terminal include, but are not limited to, portable mobile terminals running the operating systems shown as embedded images in the original document (image placeholders PCTCN2017091283-APPB-000001 and PCTCN2017091283-APPB-000002) or other operating systems. Other portable mobile terminals may also be used, such as a laptop or tablet computer with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad).
In the following discussion, a mobile terminal including a display and a touch-sensitive surface is presented. It should be understood, however, that the mobile terminal may include one or more other physical user interface devices, such as a physical keyboard, mouse, and/or joystick.
Mobile terminals typically support a variety of applications, such as one or more of the following: a drawing application, a word processing application, a web page creation application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a gallery application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications executable on the mobile terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and the corresponding information displayed on the mobile terminal may be adjusted and/or varied from one application to the next and/or within a corresponding application. In this way, a common physical architecture of the mobile terminal (such as the touch-sensitive surface) may support various applications with a user interface that is intuitive and clear to the user.
Attention is now directed to embodiments of portable mobile terminals having touch-sensitive displays. FIG. 1 is a block diagram illustrating a portable mobile terminal (taking the cell phone 100 as an example) having a touch-sensitive display 115 in accordance with some embodiments. The touch-sensitive display 115 is sometimes referred to as a "touch screen" for convenience, and may also be referred to as a touch-sensitive display system, or as a display system having a touch-sensitive surface and a display screen (display).
It should be understood that the handset 100 shown in fig. 1 is only one example of the mobile terminal described above, and that the handset 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
As shown in fig. 1, the handset 100 may include: processor 101, radio frequency circuitry 102, memory 103, input/output subsystem 104, bluetooth device 105, sensor 106, Wi-Fi device 107, positioning device 108, audio circuitry 109, peripheral interface 110, and power system 111. These components may communicate over one or more communication buses or signal lines. Those skilled in the art will appreciate that the hardware configuration shown in fig. 1 does not constitute a limitation of the mobile terminal, and may include more or less components than those shown, or some components in combination, or a different arrangement of components.
The following describes the components of the handset 100 in detail with reference to fig. 1:
the processor 101 is the control center of the mobile phone 100; it connects the various parts of the mobile phone 100 through various interfaces and lines, and performs the various functions of the mobile phone 100 and processes data by running or executing applications (Apps) stored in the memory 103 and calling data and instructions stored in the memory 103. In some embodiments, the processor 101 may include one or more control units; the processor 101 may also integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 101. For example, the processor 101 may be a Kirin 960 chip manufactured by Huawei Technologies Co., Ltd. In some embodiments of the present application, the processor 101 may further include a fingerprint verification chip for verifying an acquired fingerprint.
The RF circuit 102 may be used for receiving and transmitting wireless signals during the transmission and reception of information or during a call. In particular, the RF circuit 102 may receive downlink data from a base station and deliver it to the processor 101 for processing, and it transmits uplink data to the base station. Typically, the RF circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the RF circuit 102 may also communicate with other devices via wireless communication. The wireless communication may use any communication standard or protocol, including, but not limited to, Global System for Mobile Communications, General Packet Radio Service, Code Division Multiple Access, Wideband Code Division Multiple Access, Long Term Evolution, email, Short Message Service, and the like.
The memory 103 is used for storing applications, software, and data, and the processor 101 performs the various functions and data processing of the mobile phone 100 by running the applications and using the data stored in the memory 103. The memory 103 mainly includes a program storage area and a data storage area: the program storage area can store the operating system and the application programs required by at least one function (such as a sound playing function and an image management function), and the data storage area can store data created during use of the handset (e.g., audio data, a phone book, calendar events, etc.). Further, the memory may include high-speed random access memory, and may also include non-volatile memory, such as a magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The input/output subsystem (I/O subsystem) 104 couples input/output devices on the handset 100, such as the touch-sensitive display 115 and other input control devices 116, to the processor 101. The I/O subsystem 104 may include a display controller 117 and one or more input controllers 118 for the other input control devices 116. The one or more input controllers 118 receive electrical signals from, and send electrical signals to, the other input control devices 116. The other input control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slide switches, joysticks, click wheels, and so forth. In still other embodiments, the input controller 118 may be coupled (or not coupled) to any of the following: a keyboard, an infrared port, a USB port, and a pointing device such as a mouse. The one or more buttons may include up/down buttons for volume control of the speaker 112 and/or the microphone 113, and may also include a push button.
The touch sensitive display 115 provides an input interface and an output interface between the mobile terminal and a user. Display controller 117 receives electrical signals from touch-sensitive display 115 and/or transmits electrical signals to touch-sensitive display 115. Touch sensitive display 115 displays visual output to a user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively "graphics"). In some embodiments, some or all of the visual output may correspond to an interface element. Touch sensitive display 115 is sometimes referred to for convenience as a "touch screen," and may also be referred to as a touch sensitive display system, and may also be referred to as a display system having a touch sensitive surface and a display screen.
Touch sensitive display 115 has a touch sensitive surface, display, sensor or group of sensors that receive input from a user based on haptic and/or tactile contact. The touch-sensitive display 115 and the display controller 117 (along with any associated modules and/or sets of instructions in memory 103) detect contact (and any movement or breaking of the contact) on the touch-sensitive display 115, and translate the detected contact into interaction with interface elements (e.g., one or more soft keys, icons, web pages, or images) displayed on the touch-sensitive display 115. In an exemplary embodiment, the point of contact between the touch-sensitive display 115 and the user corresponds to a finger of the user.
The touch-sensitive surface (e.g., a touch panel) may capture touch events performed on or near it by a user of the cell phone 100 (e.g., an operation performed on or near the touch-sensitive surface by the user with a finger, a stylus, or any other suitable object) and transmit the captured touch information to another device, such as the processor 101. A touch event performed by a user near the touch-sensitive surface may be referred to as a hover touch; a hover touch means that the user does not need to directly contact the touchpad in order to select, move, or drag a target (e.g., an icon), but only needs to be near the mobile terminal in order to perform the desired function. In the context of a hover touch application, the terms "touch", "contact", and the like do not imply direct contact with the touch screen, but rather contact near or in proximity to it. A touch-sensitive surface capable of hover touch can be implemented using capacitive, infrared, or ultrasonic sensing, among others. The touch-sensitive surface may comprise two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, and sends the coordinates to the processor 101; the touch controller can also receive and execute instructions sent by the processor 101. In addition, the touch-sensitive surface may be implemented using various technologies, such as resistive, capacitive, infrared, and surface acoustic wave technologies. The display (also referred to as a display screen) may be used to display information entered by or provided to the user, as well as the various menus of the handset 100. The display may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The touch-sensitive surface may be overlaid on the display; when a touch event is detected on or near the touch-sensitive surface, it is communicated to the processor 101 to determine the type of the touch event, and the processor 101 then provides a corresponding visual output on the display according to the type of the touch event. Although in FIG. 1 the touch-sensitive surface and the display are shown as two separate components to implement the input and output functions of the handset 100, in some embodiments the touch-sensitive surface and the display may be integrated to implement the input and output functions of the handset 100. It is understood that the touch screen 115 is formed by stacking multiple layers of materials; only the touch-sensitive surface (layer) and the display screen (layer) are shown in the embodiment of the present application, and the other layers are not described here.
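As an application-level analogue of the pipeline described above (touch detection, conversion to touch-point coordinates, delivery for processing), a custom view can read the reported coordinates in onTouchEvent; this sketch uses only standard framework calls and is not specific to the handset 100.

```kotlin
import android.content.Context
import android.util.Log
import android.view.MotionEvent
import android.view.View

// A view that simply logs the touch-point coordinates delivered by the touch
// controller through the framework, mirroring the detect -> convert -> process flow.
class TouchLoggingView(context: Context) : View(context) {

    override fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN,
            MotionEvent.ACTION_MOVE,
            MotionEvent.ACTION_UP -> {
                Log.d("TouchLoggingView", "touch at (${event.x}, ${event.y})")
            }
        }
        return true  // consume the event so the full gesture is delivered
    }
}
```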
In addition, in some other embodiments of the present application, the touch-sensitive surface may cover the display and be larger than the display, so that the display is completely covered under the touch-sensitive surface, or the touch-sensitive surface may be arranged on the front of the mobile phone 100 as a full panel, that is, any touch by the user on the front of the mobile phone 100 can be sensed, providing a full-front touch experience. In other embodiments, the touch-sensitive surface is arranged on the front of the mobile phone 100 as a full panel and the display may likewise be arranged on the front of the mobile phone 100 as a full panel, so that a bezel-less structure can be achieved on the front of the phone.
The touch sensitive display 115 may use LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies may be used in other embodiments. Touch sensitive display 115 and display controller 117 may detect contact and any movement or breaking thereof using any of a variety of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch sensitive display 115. In an exemplary embodiment, projected mutual capacitance sensing technology is used.
Touch sensitive display 115 may have a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of about 160 dpi. The user may make contact with the touch-sensitive display 115 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which may be less accurate than stylus-based input due to the larger contact area of the finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command to perform the action desired by the user.
In some embodiments, the touch screen may include a sensor group having pressure sensing.
In still other embodiments, in addition to a touch screen, the cell phone 100 may include a touch pad (not shown) for activating or deactivating certain functions. In some embodiments, the trackpad is a touch-sensitive area of the mobile terminal that, unlike a touchscreen, does not display visual output. The trackpad may be a touch-sensitive surface separate from the touch-sensitive display 115, or an extension of the touch-sensitive surface formed by the touch screen.
The handset 100 may also include one or more light sensors 119. FIG. 1 shows an optical sensor 119 coupled to an optical sensor controller 120 in the I/O subsystem 104. The light sensor 119 may include a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The light sensor 119 receives light projected through one or more lenses from the external environment and converts the light into data representing an image. In conjunction with the camera module 213, the light sensor 119 may capture still images or video. In some embodiments, one or more light sensors are located on the front and/or back of the handset 100, opposite the touch sensitive display 115 on the front of the device, so that the touch sensitive display 115 can be used as a viewfinder for still and/or video image capture. In some embodiments, another one or more light sensors are located in the front of the device so that the user can obtain images of the user for the video conference while viewing other video conference participants on the touch screen display, and light sensor 119 may be a camera.
In various embodiments of the present application, the mobile phone 100 may further have a fingerprint recognition function. For example, the fingerprint recognizer may be disposed on the back of the handset 100 (e.g., below the rear-facing camera), or on the front of the handset 100 (e.g., below the touch screen 115). The fingerprint recognition function can also be realized by configuring a fingerprint recognizer in the touch screen 115, that is, the fingerprint recognizer may be integrated with the touch screen 115 to realize the fingerprint recognition function of the mobile phone 100; in this case, the fingerprint recognizer may be disposed in the touch screen 115 as a part of it or in some other manner. Additionally, the fingerprint recognizer may be implemented as a full-panel fingerprint recognizer, so that the touch screen 115 may be regarded as a panel on which a fingerprint can be acquired at any location. The fingerprint recognizer may send the captured fingerprint to the processor 101 for processing (e.g., fingerprint verification). The main component of the fingerprint recognizer in the embodiments of the present application is a fingerprint sensor, which may employ any type of sensing technology, including but not limited to optical, capacitive, piezoelectric, or ultrasonic sensing technologies.
In addition, for a specific technical solution of integrating a fingerprint acquisition device in a touch screen in the embodiments of the present application, reference may be made to the patent application published by the United States Patent and Trademark Office with application number US 2015/0036065 A1, entitled "fingerprint sensor in a mobile terminal", the entire contents of which are incorporated by reference in the various embodiments of the present application.
The bluetooth device 105 is configured to establish a wireless link with another mobile terminal (e.g., a smart watch, a tablet computer, etc.) through a bluetooth communication protocol, and perform data interaction.
The handset 100 may also include at least one sensor 106, such as the light sensor 119 described above, motion sensors, and other sensors. Specifically, the light sensor 119 may include an ambient light sensor that adjusts the brightness of the display panel of the touch-sensitive display 115 according to the brightness of ambient light, and a proximity sensor that turns off the power of the display panel when the mobile phone 100 is moved to the ear. As one of the motion sensors, an accelerometer can detect the magnitude of acceleration in each direction (generally, three axes) and the magnitude and direction of gravity when stationary, and can be used in applications that recognize the attitude of the mobile terminal (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and in vibration-recognition functions (such as a pedometer and tap detection). Other sensors that can be configured on the mobile phone 100, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, are not described further here.
The Wi-Fi device 107 provides the mobile phone 100 with Wi-Fi network access. Through the Wi-Fi device 107, the mobile phone 100 can help the user send and receive e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access.
The positioning device 108 provides the mobile phone 100 with geographic location information indicating the phone's current geographic location. It is understood that the positioning device 108 may specifically be a receiver of a positioning communication system such as the Global Positioning System (GPS), the BeiDou Navigation Satellite System, or Russia's GLONASS. After receiving the geographic location information sent by the positioning communication system, the positioning device 108 sends the information to the processor 101 for further processing or to the memory 103 for storage.
Audio circuitry 109, speaker 112, and microphone 113 can provide an audio interface between the user and the handset 100. The audio circuit 109 may convert received audio data into an electrical signal and transmit it to the speaker 112, which converts it into a sound signal for output; the speaker 112 may include a power-amplified loudspeaker, an earphone, and the like. Conversely, the microphone 113 converts a collected sound signal into an electrical signal, which the audio circuit 109 receives and converts into audio data; the audio data is then output to the radio frequency circuit 102 to be sent to, for example, another cellular phone, or to the memory 103 for further processing.
The handset 100 may also include a power system 111 (including power failure detection circuitry, a power converter or inverter, a power status indicator, a battery, and a power management chip) that provides power to the various components. The battery may be logically connected to the processor 101 through a power management chip to manage charging, discharging, and power consumption functions through the power system 111.
The handset 100 may also include a peripheral interface 110 for providing an interface to external input/output devices (e.g., mouse, keyboard, external display, external memory, etc.). Peripheral interface 110 may be used to couple input and output peripherals of the device to processor 101 and memory 103. The processor 101 runs or executes various applications and/or sets of instructions stored in the memory 103 to perform various functions of the handset 100 and to process data. In some embodiments, the processor 101 may include an image signal processor and a dual or multi-core processor.
Although not shown, the handset 100 may also include a camera (front and/or rear), a Subscriber Identity Module (SIM) card slot, a flash, a micro-projector, a Near Field Communication (NFC) device, and so forth.
In some embodiments, as shown in fig. 2, the software stored in memory 103 may include an operating system 201, a communication module (or set of instructions) 202, a contact/movement module (or set of instructions) 203, a graphics module (or set of instructions) 204, a text input module (or set of instructions) 205, a location module (or set of instructions) 206, a Wi-Fi module (or set of instructions) 207, and an application program (or set of instructions) 208. Further, in some embodiments, memory 103 may also store device/global internal state 209, as shown in fig. 1 and 2. Device/global internal state 209 may include at least one of the following states: an active application state indicating which applications (if any) are currently active; display state indicating what applications, views, or other information occupy various areas of touch screen 115; sensor status including information obtained from various sensors of the mobile terminal and the peripheral interface 110; and information about the geographic position and attitude of the mobile terminal.
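As an illustration of the device/global internal state 209 described above, the following Kotlin sketch gives one possible shape for such a state object; the field names are assumptions made for this example only and are not part of the disclosure.

```kotlin
// Illustrative sketch only: one possible shape for the device/global internal state 209,
// with field names invented for this example.

data class DeviceGlobalState(
    val activeApplications: List<String>,              // active application state
    val displayState: Map<String, String>,             // which view/app occupies each screen area
    val sensorStatus: Map<String, Float>,              // latest readings from sensors / peripheral interface
    val location: Pair<Double, Double>?,                // geographic position (lat, lon), if known
    val attitudeDegrees: Triple<Float, Float, Float>?   // device attitude (pitch, roll, yaw)
)
```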
The operating system 201 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, ANDROID, or an embedded operating system such as VxWorks) includes various software and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between the various hardware and software components. Further, in some embodiments, the memory 103 may also store a gallery 210.
The communication module 202 may communicate with other mobile terminals through the peripheral interface 110 and may also include various software for processing data received by the radio frequency circuit 102 and/or the peripheral interface 110. Peripheral interfaces (e.g., Universal Serial Bus (USB), FireWire, etc.) are adapted to couple directly to other mobile terminals or indirectly through a network (e.g., the Internet, a wireless LAN, etc.). In some embodiments, peripheral interface 110 may also be a multi-pin (e.g., 30-pin) connector that is the same as, similar to, and/or compatible with the 30-pin connector used on iPod (trademark of Apple Inc.) devices.
The contact/movement module 203 may detect contact with the touch screen 115 and other touch-sensitive devices (e.g., a trackpad or a physical click wheel). The contact/movement module 203 may include various software components for performing operations related to contact detection, such as determining whether contact has occurred (e.g., detecting a finger-down event), determining whether there is movement of the contact and tracking the movement across the touch panel (e.g., detecting one or more finger-dragging events), and determining whether the contact has terminated (e.g., detecting a finger-up event or a break in contact). The contact/movement module 203 receives contact data from the touch panel. Determining movement of the point of contact, which is represented by a series of contact data, may include determining a speed (magnitude), a velocity (magnitude and direction), and/or an acceleration (change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one-finger contacts) or to multiple simultaneous contacts (e.g., "multi-touch"/multi-finger contacts).
The contact/movement module 203 may also detect gesture inputs by the user. Different types of gestures on the touch-sensitive surface have different contact patterns. Thus, the type of gesture may be detected by detecting a particular contact pattern. For example, detecting a single-finger tap gesture includes detecting a finger-down event, and then detecting a finger-up (lift-off) event at the same location (or substantially the same location) as the finger-down event (e.g., at an icon location). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event, then detecting one or more finger-dragging events, and then subsequently detecting a finger-up (lift-off) event.
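A minimal Kotlin sketch of this pattern-based gesture detection follows (tap = finger-down then finger-up at roughly the same position; swipe = finger-down, one or more drags, then finger-up); the event types and slop threshold are illustrative assumptions, not part of the patent's disclosure.

```kotlin
// Sketch of classifying a gesture from its contact pattern, as described above.

sealed class ContactEvent {
    data class FingerDown(val x: Float, val y: Float) : ContactEvent()
    data class FingerDrag(val x: Float, val y: Float) : ContactEvent()
    data class FingerUp(val x: Float, val y: Float) : ContactEvent()
}

enum class Gesture { TAP, SWIPE, UNKNOWN }

fun classify(events: List<ContactEvent>, slopPx: Float = 10f): Gesture {
    val down = events.firstOrNull() as? ContactEvent.FingerDown ?: return Gesture.UNKNOWN
    val up = events.lastOrNull() as? ContactEvent.FingerUp ?: return Gesture.UNKNOWN
    // Tap: the finger lifts at (almost) the same location it went down, with no drag events.
    val moved = kotlin.math.hypot(up.x - down.x, up.y - down.y) > slopPx
    val dragged = events.any { it is ContactEvent.FingerDrag }
    return if (!moved && !dragged) Gesture.TAP else Gesture.SWIPE
}
```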
The gestures described above may be of various types. For example, a gesture may be a Flick gesture, in which a single finger taps the touch-sensitive surface of the touch-sensitive display 115, slides quickly, and then quickly leaves the surface, e.g., to scroll the screen up and down or flip through pictures left and right; a Slide gesture, in which a single finger taps the touch-sensitive surface and then moves while remaining in contact, e.g., slide-to-unlock; a Swipe gesture, in which multiple fingers contact the touch-sensitive surface and then move while remaining in contact, e.g., a three-finger swipe to return to the main interface; a Tap gesture, in which a single finger taps the touch-sensitive surface and immediately leaves it; a Double Tap gesture, i.e., two Tap gestures within a very short time; a Touch & Hold gesture, in which a finger taps the touch-sensitive surface and remains stationary; a Drag gesture, in which a finger taps the touch-sensitive surface and moves slowly without leaving it (typically toward a well-defined target location, such as dragging a file to the trash to delete it); a Pinch gesture, in which two fingers (typically the thumb and index finger) pinch together on the touch-sensitive surface; or an Unpinch (spread) gesture, in which two fingers (typically the thumb and index finger) spread apart on the touch-sensitive surface. It is understood that gestures other than those listed above may also be used; this embodiment does not limit the types of gestures.
Graphics module 204 includes various known software components for rendering and displaying graphics on the touch-sensitive display 115 or another display, including components for changing the intensity of displayed graphics. As used herein, the term "graphic" includes any object that may be displayed to a user, including without limitation text, web pages, icons (such as user interface objects including soft keys), digital images, videos, animations, and the like.
In some embodiments, the graphics module 204 stores data representing graphics to be used. Each graphic may be assigned a corresponding code. The graphics module 204 receives one or more codes specifying graphics to be displayed from an application program or the like, and also receives coordinate data and other graphics attribute data together if necessary, and then generates screen image data to output to the display controller 117.
Text input module 205, which may be a component of graphics module 204, provides a soft keyboard for entering text in various applications 208 (e.g., contacts 208-1, browser 208-6, and any other application that requires text input).
The positioning module 206 is used to determine the geographic location of the mobile terminal and provide this information for use in various applications 208 (e.g., to the phone module 208-2 for location-based dialing, to the camera module 208-3 as picture/video metadata, and to other applications that provide location-based services, such as the weather desktop applet 211-1 and the map/navigation desktop applet 211-2).
The Wi-Fi module 207 is used to run various instructions required by the Wi-Fi device 107.
Application 208 may include the following modules (or sets of instructions), or a subset or superset thereof:
● contact module 208-1 (also sometimes referred to as a contact list or contact) for managing stored contacts;
● telephone module 208-2;
● camera module 208-3 for still and/or video images, for receiving user instructions for digital imaging (taking pictures) with the light sensor 119;
● image management module 208-4, for editing, deleting, moving, renaming, etc. the gallery 210 stored in the memory 103;
● exercise support module 208-5;
● browser module 208-6;
● desktop applet modules 211, which may include one or more of the following: a weather desktop applet 211-1, a map/navigation desktop applet 211-2, other desktop applets obtained by the user, and a user-defined desktop applet 211-3;
● multimedia player module (i.e., video and music player module) 208-7, which may be comprised of a video player module and a music player module;
● word processing module 208-8;
● video conference module 208-9;
● email module 208-10;
● instant message module 208-11;
● notification module 208-12;
● map module 208-13;
● calendar module 208-14; and/or
● application store module 208-15.
Examples of other applications 208 that may be stored in memory 103 include other word processing applications, other image editing applications, drawing applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch-sensitive display 115, contact module 203, graphics module 204, and text input module 205, contacts module 208-1 may be used to manage an address book or contact list (e.g., stored in memory 103 in an application internal state of contacts module 208-1), including: adding a name to the address book; deleting names from the address book; associating a telephone number, email address, home address, or other information with a name; associating an image with a name; sorting and classifying names; providing a telephone number or email address to initiate and/or facilitate communication via telephone 208-2 or email; and so on.
In conjunction with radio frequency circuitry 102, audio circuitry 109, speaker 112, microphone 113, touch-sensitive display 115, display controller 117, move/contact module 203, graphics module 204, and text input module 205, telephony module 208-2 may be used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in contacts module 208-1, modify an already entered telephone number, dial a corresponding telephone number, place a call, and disconnect or hang up when the call is completed. As described above, wireless communication may use any of a number of communication standards, protocols, and technologies.
In conjunction with radio frequency circuitry 102, audio circuitry 109, speaker 112, microphone 113, touch-sensitive display 115, display controller 117, light sensor 119, light sensor controller 120, mobile contact module 203, graphics module 204, text input module 205, contacts module 208-1, and telephony module 208-2, video conference module 208-9 includes executable instructions for initiating, conducting, and ending a video conference between a user and one or more other participants according to user instructions.
In conjunction with radio frequency circuitry 102, touch-sensitive display 115, display controller 117, movement/contact module 203, graphics module 204, and text input module 205, email client module 208-10 includes executable instructions for creating, sending, receiving, and managing emails in response to user instructions. In conjunction with the image management module 208-4, the email client module 208-10 makes it very easy to create and send emails with still images or video images captured by the camera module 208-3.
In conjunction with radio frequency circuitry 102, touch-sensitive display 115, display controller 117, contact module 203, graphics module 204, and text input module 205, instant message module 208-11 may include executable instructions for entering a sequence of characters corresponding to an instant message, modifying previously entered characters, transmitting a corresponding instant message (e.g., using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for a phone-based instant message or using XMPP, SIMPLE, or IMPS for an internet-based instant message), receiving an instant message, and viewing the received instant message. In some embodiments, the transmitted and/or received instant messages may include graphics, photos, audio files, video files, and/or other attachments supported in the MMS and/or Enhanced Messaging Service (EMS). As used herein, "instant message" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
In conjunction with radio frequency circuitry 102, touch-sensitive display 115, display controller 117, movement/contact module 203, graphics module 204, text input module 205, location module 206, map module 208-13, and multimedia player module 208-7, workout support module 208-5 includes executable instructions for creating workouts (e.g., with time, distance, and/or calorie consumption goals); communicating with an exercise sensor (sports device); receiving exercise sensor data; calibrating a sensor for monitoring an exercise; selecting and playing music for exercise; and displaying, storing and transmitting exercise data.
In conjunction with touch-sensitive display 115, display controller 117, light sensor 119, light sensor controller 120, movement/contact module 203, graphics module 204, and image management module 208-4, camera module 208-3 includes executable instructions for capturing still images or video (including video streams) and storing them in memory 103 (e.g., in the gallery 210), modifying characteristics of the still images or video, or deleting the still images or video from memory 103 (e.g., from the gallery 210).
In conjunction with the touch-sensitive display 115, the display controller 117, the movement/contact module 203, the graphics module 204, the text input module 205, and the camera module 208-3, the image management module 208-4 includes executable instructions for arranging, modifying (e.g., editing), or otherwise manipulating, tagging, deleting, presenting (e.g., in a digital slide or album), and storing still images and/or video images, including still images and/or video images stored in the gallery 210.
In conjunction with radio frequency circuitry 102, touch-sensitive display 115, display controller 117, movement/contact module 203, graphics module 204, and text input module 205, browser module 208-6 includes executable instructions for browsing the internet (including searching, linking to, receiving, and displaying web pages or portions thereof, and attachments and other files linked to web pages) according to user instructions.
In conjunction with the radio frequency circuitry 102, the touch-sensitive display 115, the display controller 117, the movement/contact module 203, the graphics module 204, the text input module 205, the email client module 208-10, and the browser module 208-6, the calendar module 208-14 includes executable instructions for creating, displaying, modifying, and storing a calendar and data associated with the calendar (e.g., calendar entries, to-do task lists, etc.) according to user instructions.
In conjunction with radio frequency circuitry 102, touch-sensitive display 115, display controller 117, movement/contact module 203, graphics module 204, text input module 205, and browser module 208-6, the desktop applet modules 211 are mini-applications that may be downloaded and used by the user (e.g., the weather desktop applet 211-1 and the map/navigation desktop applet 211-2) or created by the user (e.g., the user-customized desktop applet 211-3). In some embodiments, a desktop applet includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a desktop applet includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! desktop applets).
In conjunction with touch-sensitive display 115, display controller 117, movement/contact module 203, graphics module 204, audio circuitry 109, speakers 112, radio frequency circuitry 102, and browser module 208-6, multimedia player module 208-7 includes executable instructions that allow a user to download and playback recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, as well as executable instructions for displaying, rendering, or otherwise playing back video (e.g., on touch-sensitive display 115 or on an external display connected via peripheral interface 110). In some embodiments, the cell phone 100 may include the functionality of an MP3 player.
In conjunction with the radio frequency circuitry 102, the touch-sensitive display 115, the display controller 117, the movement/contact module 203, the graphics module 204, the text input module 205, the positioning module 206, and the browser module 208-6, the map module 208-13 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving routes; data for stores and other points of interest at or near a particular geographic location; and other geographic location-based data) according to user instructions.
In conjunction with radio frequency circuitry 102, touch sensitive display 115, display controller 117, movement/contact module 203, graphics module 204, text input module 205, and browser module 208-6, application store module 208-15 may be used to receive and display data related to the application store, such as price, content, etc., according to user instructions.
In conjunction with touch-sensitive display 115, display controller 117, movement/contact module 203, and graphics module 204, notification module 208-12 includes executable instructions to display notifications or alerts (such as incoming messages or incoming calls, calendar event reminders, application events, etc.) on touch-sensitive display 115.
Each of the modules and applications described above corresponds to a set of executable instructions for performing one or more of the functions described above as well as the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments. In some embodiments, memory 103 may store a subset of the modules and data structures described above. In addition, memory 103 may store additional modules and data structures not described above.
Each of the above identified elements of fig. 2 may be stored in one or more of the aforementioned memories 103. Each of the identified modules corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments. In some embodiments, memory 103 may store a subset of the modules and data structures described above. Memory 103 may also store additional modules and data structures not described above.
In some embodiments, the mobile terminal is a device on which the operation of a predefined set of functions is performed exclusively through the touch screen and/or the touch pad. By using a touch screen and/or touch pad as the primary input control device for operation of the mobile terminal, the number of physical input control devices (such as push buttons, dials, etc.) on the mobile terminal may be reduced.
The predefined set of functions that may be performed by the touch screen and/or trackpad include navigation between user interfaces. In some embodiments, the mobile terminal is navigated from any graphical user interface that may be displayed on the mobile terminal to a main menu, or root menu when the trackpad is touched by a user. In such embodiments, the touch pad may be referred to as a "menu button". In some other embodiments, the menu button may be a physical push button or other physical input control device rather than a touchpad.
The following embodiments may be implemented in a mobile terminal (e.g., handset 100) having the hardware described above.
Fig. 3A and 3B illustrate a mobile terminal (taking cell phone 100 as an example) having a touch screen according to some embodiments. The mobile terminal 100 may further include a camera 306, an earpiece 307, an ambient light sensor (not shown), and other physical devices. The camera may include one or more front cameras and one or more rear cameras; a front camera is disposed on the same side as the display screen, and a rear camera is disposed on the back of the display screen. A front camera may be disposed on the face of the phone and arranged close to the display screen. For example, as shown in fig. 3A, the front camera 306 is located at the midpoint near the top of the phone and the earpiece 307 is located above the front camera 306; or, as shown in fig. 3B, the front camera 306 is located toward one side near the top of the mobile terminal 100 and the earpiece 307 is disposed directly above it. The front camera 306 may also be located near the bottom of the mobile terminal 100; the specific placement may depend on the product design. The touch screen 115 is an irregularly shaped (non-rectangular) touch screen: its screen area extends to the areas on both sides of the front camera 306 at the top of the mobile terminal 100, which leaves working space for the camera 306, the earpiece 307, and other devices while greatly increasing the screen-to-body ratio and improving the visual effect. In the following embodiments, the front camera 306 is disposed at the center of the top of the mobile terminal 100.
Touch screen 115 may display one or more graphics within a Graphical User Interface (GUI). In this embodiment, as well as in other embodiments described below, the mobile terminal may detect an operation in which the user selects one or more of the graphics, for example by making a gesture on the graphics with one or more fingers 301 (not drawn to scale in the figures) or with one or more styluses (not shown). In some embodiments, the selection of the one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the gesture may include one or more taps, one or more swipes (left to right, right to left, up, and/or down), and/or rolling of a finger (right to left, left to right, up, and/or down) that has made contact with the mobile terminal 100. In some embodiments, inadvertent contact with a graphic does not select the graphic. For example, when the gesture corresponding to selection is a tap, a swipe gesture that sweeps over an application icon does not select the corresponding application.
The mobile terminal 100 may also include one or more physical buttons, such as a home screen button 303, a menu button 304, a back button 305. As previously described, menu button 304 may be used to navigate to any application 208 in a set of applications that may be running on mobile terminal 100. In still other embodiments, the buttons described above may each be implemented as virtual keys in a Graphical User Interface (GUI) displayed on touch screen 115.
In some embodiments, the mobile terminal 100 may also include a push button for powering the device on and off and locking the device, volume adjustment button(s), a Subscriber Identity Module (SIM) card slot, a headset jack, a docking/charging external port, and the like. The push button may be used to power the device on and off by pressing it and holding it in the pressed state for a predefined time interval; to lock the device by pressing it and releasing it before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlocking process. In an alternative embodiment, device 100 may also accept voice input through microphone 113 for activating or deactivating certain functions.
Attention is now directed to an embodiment of a Graphical User Interface (GUI) that may be implemented on the mobile terminal 100.
As shown in fig. 4, the mobile terminal provided in the embodiment of the present application has a display screen and a camera, where the display screen may be a touch screen, and the touch screen includes a first screen area 308 and a second screen area 309. The first screen area 308 is a rectangular area, and the second screen area 309 is the area of the touch screen extending from the first screen area 308 to the periphery of the front camera. The touch screen 115 therefore differs from the regular rectangular touch screens in the prior art: its display area extends to the areas on both sides of the front camera 306 at the top of the mobile terminal 100, leaving working space for the camera 306, the earpiece 307, and other devices while greatly increasing the screen-to-body ratio of the touch screen 115 and improving the visual effect. In some of the embodiments below, the front camera 306 is disposed at the center of the top of the mobile terminal 100 as shown in the drawings.
In the embodiment of the present application, the first screen area 308 and the second screen area 309 may be controlled by the same display controller 117.
In some embodiments, upon lighting the display screen or detecting the user clicking the home screen button 303, the mobile terminal displays a series of exemplary Graphical User Interfaces (GUIs) on the touch screen, including displaying, in the first screen area 308, a home screen containing application icons such as short messages, email, camera, music, and WeChat, and displaying, in the second screen area 309, device status icons such as the Wi-Fi signal, communication signal, battery level, and time.
The main screen may include a plurality of sub-screens, each of which may contain different content. Taking fig. 4 as an example, the GUI may be a first sub-screen of the main screen of the mobile terminal. The first sub-screen is displayed on the touch screen of the mobile terminal and may include a main screen indicator 403 and icons of various apps. The main screen indicator 403 prompts the user as to which sub-screen is currently displayed. Illustratively, the first sub-screen includes App icons arranged in a plurality of rows and columns; when the mobile terminal detects that a finger (or a stylus or the like) of the user touches a position on the touch screen corresponding to a certain App icon, the mobile terminal can open the graphical user interface of that App in response to the touch event. It is understood that, in some other embodiments, the main screen may further include a Dock bar, which may contain commonly used App icons and the like; this is not described again.
In connection with fig. 4, in some embodiments, the graphical user interface 400 illustratively includes the following elements, or a subset or superset thereof:
displayed on the first screen area 308:
● home page indicator 403;
● home screen button 303;
● menu button 304;
● back button 305;
● icons for applications, such as:
e-mail 404;
a calendar 405;
settings 406 that provide access to settings of the device 100 and its various applications 208;
a camera 407;
telephone 408;
contacts 409;
short message 410;
a gallery 411;
weather 412;
reading 413;
application store 414;
map 415;
browser 416;
video and music players 417; and
instant messaging application (WeChat) 418.
Displayed on the second screen area 309:
● wireless communication signal strength identification 421;
● a time indicator 402;
● battery status indicator 401;
● Wi-Fi signal strength identification 419;
● alarm clock indicator 420;
application icon 422, and the like.
Further, while the following examples are given primarily with reference to finger inputs (e.g., single finger contact, single finger tap gesture, single finger swipe gesture), it should be understood that in some embodiments one or more of these finger inputs are replaced by inputs from another input device (e.g., stylus inputs).
Attention is now directed to embodiments of a user interface ("UI") and associated processes that may be implemented on a mobile terminal having a display and a touch-sensitive surface, such as portable mobile terminal 100. The mobile terminal 100 of the present application aims to provide various interactive functions and improve the user experience by making full use of the discontinuous second screen area 309 of the display screen 115.
Herein, unless specifically stated, the user's gesture is flexible and may be a click, double click, slide, circle, line, single finger touch, or multi-finger touch, among others. One of ordinary skill in the art will appreciate that the selection of a particular gesture is flexible as long as substantially the same result is achieved. In this context, unless specifically stated otherwise, the location or region of the touch-sensitive surface at which the user's gesture is applied is also flexible, and may be a region or vicinity of an application interface element displayed by the display screen, a blank region of the display screen where no application interface element is displayed, a region of a function setting displayed by the display screen, and so forth. It will be appreciated by those skilled in the art that the gestures can be flexibly configured to act on specific locations or regions of the touch-sensitive surface, so long as substantially the same effect is achieved.
In some embodiments, the mobile terminal 100 may determine whether the intention of a gesture is to operate on an application icon or on a non-application-icon area by detecting the pressure, duration, contact area, and location of the finger 301 on the touch-sensitive surface, its distance from one or more application icons, or its distance from a boundary of the touch-sensitive surface, and respond to the gesture accordingly, for example by launching an application or flipping to another sub-screen.
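The following Kotlin sketch illustrates one possible way of making this icon-versus-non-icon decision from the touch's location and contact area; the geometry helpers and tolerance are assumptions for illustration and do not describe the patent's actual algorithm.

```kotlin
// Illustrative sketch: decide whether a touch targets an application icon
// or the surrounding non-icon area, from position and contact radius.

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun centerDistanceTo(x: Float, y: Float): Float =
        kotlin.math.hypot(x - (left + right) / 2, y - (top + bottom) / 2)
}

fun isIconTouch(x: Float, y: Float, contactRadiusPx: Float, iconBounds: List<Rect>): Boolean {
    // Treat the touch as targeting an icon if the nearest icon's center lies
    // within the finger's contact radius plus a small tolerance.
    val nearest = iconBounds.minByOrNull { it.centerDistanceTo(x, y) } ?: return false
    return nearest.centerDistanceTo(x, y) <= contactRadiusPx + 8f
}
```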
FIG. 5 illustrates a flow diagram of a method of displaying a graphical user interface provided in accordance with some embodiments. The method is performed on a mobile terminal 100 having a display as mentioned in the various embodiments below. In some embodiments, the display is a touch screen display and the touch sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in the methods may be combined, and/or the order of some operations may be changed.
As shown in fig. 5, the method includes steps 501 to 503. Firstly, in step 501, the mobile terminal displays an application interface of a first application in the first screen area; then, in step 502, when the second application has a notification, the mobile terminal displays the notification in the second screen area; in step 503, when detecting a first touch operation performed on the notification, the mobile terminal continues to display an application interface of the first application in the first screen area, and displays an application interface of the second application related to the notification on the application interface of the first application.
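For readers who prefer pseudocode, the following Kotlin sketch restates steps 501 to 503 under assumed screen-area and application abstractions; it is illustrative only and is not an implementation of any particular framework or of the claimed method itself.

```kotlin
// Hedged sketch of steps 501-503; ScreenArea and the string-based "interfaces" are
// stand-ins invented for this example.

interface ScreenArea { fun show(content: String) }

class DualAreaDisplayController(
    private val firstScreenArea: ScreenArea,
    private val secondScreenArea: ScreenArea
) {
    private var currentApp: String? = null

    // Step 501: display the first application's interface in the first screen area.
    fun launch(firstApp: String) {
        currentApp = firstApp
        firstScreenArea.show("$firstApp interface")
    }

    // Step 502: when a second application has a notification, show it in the second screen area.
    fun onNotification(secondApp: String, message: String) {
        secondScreenArea.show("$secondApp: $message")
    }

    // Step 503: on a first touch operation on the notification, keep the first application's
    // interface in the first screen area and overlay the second application's interface on it.
    fun onNotificationTouched(secondApp: String) {
        firstScreenArea.show("$currentApp interface with $secondApp overlay")
    }
}
```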
The following detailed description is to be read with reference to certain embodiments and various drawings.
As shown in FIG. 6A, a finger 301 may be detected on the touch-sensitive surface of touch screen 115. In response to the finger 301 clicking the application icon 418 in the first screen area 308, the mobile terminal 100 launches the corresponding application. Thereafter, the graphical user interface of the mobile terminal 100 transitions to FIG. 6B, displaying the application's interface content in the first screen area 308.
In some embodiments, the mobile terminal 100 may explicitly or implicitly prompt the user that the application corresponding to icon 418 can be launched, for example by vibrating.
In some embodiments, the application icon 418 may be unresponsive to the finger 301. The mobile terminal 100 may also explicitly or implicitly prompt the user that application icon 418 is not responding to the finger 301.
In other embodiments, when the touch screen displays the home screen interface shown in fig. 4, the mobile terminal 100, in response to the finger 301 sliding from a non-application-icon area in the first screen area 308, displays in the first screen area 308 a transition from the interface of one sub-screen to the interface of another sub-screen.
In the above embodiments, when the mobile terminal 100 changes the display content of the first screen area 308 in response to the finger 301 acting on the first screen area 308, the content of the second screen area 309 may remain unchanged. For example, after the transition from fig. 6A to fig. 6B, the mobile terminal 100 may still display some device status icons and application icons in the second screen area 309; even when the first screen area 308 is not lit, some device status icons and application icons may still be displayed in the second screen area 309. In this way, the user can obtain device-related information without fully waking the mobile terminal 100, which saves power and improves the user experience.
Of course, the mobile terminal 100 may be set to display or hide the display contents of the second screen area 309 in response to the finger 301.
As shown in fig. 7A, the mobile terminal 100 has a display/hide icon 423 in the second screen area 309. In the display state, when the mobile terminal 100 detects an operation on the display/hide icon 423 by the finger 301 (e.g., a click, double-click, long press, or drag), the mobile terminal 100 may transition from displaying some device status icons and application icons in the second screen area 309, as shown in fig. 7A, to hiding those icons in the second screen area 309 (leaving only the display/hide icon 423), as shown in fig. 7B. In the hidden state, when the mobile terminal 100 detects an operation on the display/hide icon 423 by the finger 301, the mobile terminal 100 may transition from hiding those device status icons and application icons (with only the display/hide icon 423 shown), as in fig. 7B, back to displaying them in the second screen area 309, as in fig. 7A.
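A minimal Kotlin sketch of this toggling behavior follows; the state model and icon names are assumptions used only for illustration.

```kotlin
// Sketch of the display/hide toggle for the second screen area's icons.

class SecondAreaIconToggle(private val render: (List<String>) -> Unit) {
    private val icons = listOf("Wi-Fi", "Battery", "Time", "Alarm", "AppIcon422")
    private var hidden = false

    fun onShowHideIconTapped() {
        hidden = !hidden
        // In the hidden state only the display/hide icon 423 itself remains visible.
        render(if (hidden) listOf("ShowHideIcon423") else icons + "ShowHideIcon423")
    }
}
```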
After hiding the icons, the mobile terminal 100 may turn off the second screen area 309 to save power and reduce the interference of the second screen area 309's content with the content of the first screen area 308, or it may display continuous interface content across the first screen area 308 and the second screen area 309 in a coordinated manner, improving the display effect through whole-screen display.
In some embodiments, the mobile terminal 100 may also provide a pressure-sensing function, either instead of or in addition to the display/hide icon 423. When the mobile terminal 100 detects an input pressing the touch screen or the device bezel and the input pressure reaches a certain threshold, the mobile terminal 100 may display or hide the icons displayed in the second screen area 309, similarly to figs. 7A and 7B.
In some embodiments, after a period of time when no operation on the second screen region 309 is detected, the mobile terminal 100 may reduce the brightness or transparency of the second screen region 309, which may reduce the interference of the display content of the second screen region 309 with the user viewing the first screen region 308.
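The following Kotlin sketch illustrates one way such idle dimming could be driven; the timeout and brightness values are assumptions made for illustration only.

```kotlin
// Sketch of dimming the second screen area after a period with no touch operations.

class SecondAreaDimmer(
    private val idleTimeoutMs: Long = 10_000,
    private val setBrightness: (Float) -> Unit
) {
    private var lastTouchAt: Long = System.currentTimeMillis()

    fun onSecondAreaTouched() {
        lastTouchAt = System.currentTimeMillis()
        setBrightness(1.0f)          // restore full brightness on interaction
    }

    fun onPeriodicTick(nowMs: Long = System.currentTimeMillis()) {
        if (nowMs - lastTouchAt >= idleTimeoutMs) {
            setBrightness(0.3f)      // lower brightness (or raise transparency) when idle
        }
    }
}
```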
As shown in fig. 8A and 8B, in some embodiments, when the mobile terminal 100 detects an operation of the icon 422 of the application program of the second screen region 309 by the finger 301 (e.g., clicking the icon 422 of the application program), the mobile terminal 100 may run the application program and display interface content of the corresponding application program in the first screen region 308.
In some embodiments, the mobile terminal 100 may display icons of designated applications in the second screen area 309, for example commonly used applications or applications set by the user, and the second screen area 309 and the first screen area 308 may cooperate to provide quick entry into the application interfaces of those commonly used applications, simplifying the user's operations. For example, as shown in fig. 9A, when the mobile terminal 100 detects the finger 301 clicking the icon 406 of the Settings application in the first screen area 308, the Settings application interface is displayed in the first screen area 308, as shown in fig. 9B. Then, when the finger 301 is detected clicking the icon 422 of the WeChat application in the second screen area 309, the mobile terminal 100 displays the application interface of the WeChat application of fig. 9C in the first screen area 308 and displays the icon 406a of the Settings application in the second screen area 309. Next, when a touch operation 301a of the finger 301 clicking the icon 406a of the Settings application in the second screen area 309 is detected, the mobile terminal 100 displays the Settings application interface shown in fig. 9B in the first screen area 308 and displays the icon 422 of the WeChat application in the second screen area 309; or, when a touch operation 301b of the finger 301 clicking the return control 425 shown in fig. 9C is detected, the mobile terminal 100 displays the main interface shown in fig. 9A in the first screen area 308 and displays the icon 422 of the WeChat application in the second screen area 309. Compared with the prior art, in which the user must exit the Settings application interface and return to the home screen before entering the WeChat application interface, or must open the background running list before entering the WeChat application interface, the method provided in this embodiment offers quick entry into commonly used applications and a quick return to the previously accessed application's interface, simplifying the user's operations and improving the user experience.
The following further describes steps 501 to 503 of the method with reference to the drawings and the above embodiments.
In step 501, the mobile terminal 100 displays an application interface of the first application in the first screen area 308, for example, as shown in fig. 10A, the mobile terminal 100 displays an interface of the instant messaging application during the video call in the first screen area 308.
In some embodiments, the mobile terminal 100 may further display at least one icon 422 of a second application in the second screen area 309; which second applications are displayed, and how many, may be determined and adjusted according to the settings.
Next, in step 502, when there is a notification of the second application, the mobile terminal 100 displays the content of the notification in the second screen area 309; and continuously displaying the application interface of the first application in the first screen area.
In some embodiments, when the mobile terminal 100 detects a notification of the second application, the mobile terminal 100 may prompt the user in the second screen area 309 by blinking the icon corresponding to the second application, or by displaying the number of messages on that icon. For social or communication applications, the mobile terminal 100 may replace the application icon with an image of the message source, i.e., the message sender, making it easier for the user to identify the sender.
The notification may be an instant message received by the mobile terminal 100 through the radio frequency circuitry 102, or may be content, such as a timed reminder notification, triggered by the second application determining that a particular condition (e.g., a particular time, a particular location, a particular temperature, etc.) is met while it is running in the background. The specific content of the notification may include, but is not limited to, text, image, animation, and the like, and the prompt information may also be information of the sender of the notification, such as an image of the sender (e.g., an avatar, an icon, and the like), identity information of the sender (e.g., a name, a number, and the like), sending time of the sender, and the like.
Referring to fig. 10B, the mobile terminal 100 displays a notification 424 next to an icon (e.g., WeChat) 422 of the first application in the second screen area 309, the notification 424 including a sender's avatar and the text "where" of the notification.
In some embodiments, when the mobile terminal 100 detects the notification of the second application, the mobile terminal may automatically adjust the positions of the other application icons in the second screen area 309 according to the length of the notification's content, may display only the notification's content across the entire second screen area 309, may scroll the content when it is long, and may make the icon corresponding to the second application flash or change color or display the number of messages beside it, so that the user perceives the notification. In some embodiments, the mobile terminal 100 may keep the notification displayed in the second screen area, which allows the user to view the notification's content at any time; when the mobile terminal 100 detects that the user chooses to close the notification (by pressing the screen, sliding, clicking, or double-clicking the icon or message) or triggers the second application, it may dismiss the notification prompt and return to the initial state shown in fig. 10A, so that the message is effectively notified.
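The following Kotlin sketch illustrates one possible layout decision for the second screen area based on the notification's length; the thresholds and field names are invented for this example and are not part of the disclosure.

```kotlin
// Rough sketch: decide how the second screen area presents a notification.

data class SecondAreaLayout(val showOnlyNotification: Boolean, val scroll: Boolean, val badgeCount: Int)

fun layoutForNotification(text: String, unread: Int, areaWidthChars: Int = 24): SecondAreaLayout {
    val tooLongForInline = text.length > areaWidthChars / 2
    val needsScrolling = text.length > areaWidthChars
    return SecondAreaLayout(
        showOnlyNotification = tooLongForInline,  // push other icons aside / hide them
        scroll = needsScrolling,                  // scroll the text within the area
        badgeCount = unread                       // flash the icon or show the message count
    )
}
```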
In some embodiments, when the mobile terminal 100 displays the notification 424, the icons of some other applications may be hidden due to the limited size of the second screen area 309. For example, as shown in fig. 10B, the mobile terminal 100 displays the notification 424 next to the icon 422 of the second application, the icons to the right of the notification 424 shift to the right as a whole, and icons "shifted out" of the second screen area 309 are no longer displayed.
In some embodiments, with reference to fig. 11A and 11B, the mobile terminal 100 may display other icons outside the second screen area 309 in a moving manner according to the sliding operation of the finger 301 on the second screen area 309, so that the user may display the other hidden icons when necessary.
In other embodiments, the mobile terminal 100 may fix the display size of the notification 424 and display the message content of the notification in the display size area of the notification, such as a scrolling display from right to left, or a scrolling display from top to bottom, etc.
Then, in step 503, in response to detecting the first operation performed by the finger 301 on the notification displayed in the second screen area 309, the mobile terminal 100 continues to display the application interface of the first application in the first screen area 308, and displays the application interface 430 related to the second application on all or a part of the application interface of the first application.
The application interface 430 may be an application interface related to the notification, or an application interface for invoking other functions of the second application.
As shown in fig. 12A, when the mobile terminal 100 detects an operation of the finger 301 dragging the notification 424 from the second screen area 309 to the first screen area 308, the interface of the running instant messaging application and the ongoing video call continue to be displayed, and the application interface 430 is displayed on top of the instant messaging application's interface; the application interface 430 may include an application interface related to the specific content of the notification, as shown in fig. 12B.
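The following Kotlin sketch illustrates how such a drag from the second screen area into the first screen area might be recognized and used to trigger the overlay of application interface 430; the area boundary and callback are illustrative assumptions, not a real framework API.

```kotlin
// Sketch of recognizing a notification drag from the second screen area into the first.

data class DragEvent(val startY: Float, val endY: Float)

class NotificationDragHandler(
    private val secondAreaBottomPx: Float,
    private val overlaySecondAppInterface: () -> Unit
) {
    fun onDragFinished(event: DragEvent) {
        val startedInSecondArea = event.startY <= secondAreaBottomPx
        val endedInFirstArea = event.endY > secondAreaBottomPx
        if (startedInSecondArea && endedInFirstArea) {
            // Keep the first application's interface running and show interface 430 on top of it.
            overlaySecondAppInterface()
        }
    }
}
```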
In some embodiments, in conjunction with figs. 15A and 15B, the application interface 430 may also include application interfaces of other related functions of the second application. For example, when the mobile terminal 100 detects the user clicking or double-clicking the icon 422 of the second application, it may display a contact list in the first screen area 308; when a sliding operation in the contact list is detected, the displayed content may scroll; and when the user's selection input is detected, a contact may be determined and the display interface 431 of that contact is entered.
In some embodiments, the method may further include step 504: when the mobile terminal 100 detects an operation performed by the finger 301 on the interactive interface of the touch screen (e.g., tapping "reply"), the application interface 430 displayed by the mobile terminal 100 transitions from fig. 12B to the input interface shown in fig. 12C; the mobile terminal 100 acquires a second operation of the finger 301 in the input area of the application interface 430 and obtains a reply message, and when it detects the finger 301 tapping "send", the second application sends the reply message to the corresponding recipient (i.e., the aforementioned sender).
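A simplified Kotlin sketch of this reply flow (step 504) follows; the messaging callback is a stand-in introduced for illustration and does not represent any real messaging API.

```kotlin
// Sketch of step 504: capture the reply typed into interface 430 and send it on "send".

class ReplyFlow(private val sendMessage: (recipient: String, text: String) -> Unit) {
    private val draft = StringBuilder()

    fun onReplyTapped() { draft.clear() }            // switch interface 430 to the input state
    fun onTextEntered(text: String) { draft.append(text) }
    fun onSendTapped(sender: String) {
        sendMessage(sender, draft.toString())        // the second application sends the reply
        draft.clear()                                // then the input interface can be dismissed
    }
}
```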
During this transition from fig. 12B to the input interface shown in fig. 12C, the mobile terminal 100 continues to display the interface of the running instant messaging application and the ongoing video call, with the application interface 430 shown on top of the instant messaging application's interface.
Throughout this period, the mobile terminal 100 still runs the first application and displays its application interface in real time in the first screen area 308. For example, in fig. 12A-12C, the mobile terminal 100 continues the video call in real time in the first screen area 308, that is, it keeps displaying the moving images of both parties of the call and transferring their voices. The application interface 430 may be displayed in a semi-transparent or opaque manner; displaying it semi-transparently allows the user to continue viewing the interface content of the first application.
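A minimal sketch of the semi-transparent overlay, assuming the application interface 430 is an ordinary view added over an Android activity's content view, is given below; replyPanel and showSemiTransparentOverlay are illustrative names only.

```kotlin
import android.app.Activity
import android.view.Gravity
import android.view.View
import android.widget.FrameLayout

// Sketch only: replyPanel stands for the already inflated application interface 430
// of the second application; the first application's interface fills the activity's
// content view and stays visible behind the translucent panel.
fun showSemiTransparentOverlay(activity: Activity, replyPanel: View) {
    val root = activity.findViewById<FrameLayout>(android.R.id.content)
    replyPanel.alpha = 0.6f                                    // semi-transparent, as described above
    root.addView(
        replyPanel,
        FrameLayout.LayoutParams(
            FrameLayout.LayoutParams.MATCH_PARENT,
            FrameLayout.LayoutParams.WRAP_CONTENT,
            Gravity.BOTTOM                                     // keep the upper part of the call/game visible
        )
    )
}
```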
In some embodiments, when the mobile terminal 100 displays the application interface of the second application related to the notification and detects a third touch operation performed on an area of the first screen area 308 outside that application interface, the mobile terminal 100 executes the instruction corresponding to the third touch operation in the second application and continues to display, in the first screen area, the application interface of the second application related to the notification.
As shown in fig. 12C, the input interface 430 of the second application may include a virtual keyboard, which may be configured as a full keyboard or a nine-grid keyboard. As shown in fig. 14B, the input interface 430 of the second application may also be split into a left part and a right part so as to leave the middle of the first application's interface content uncovered; for example, during a game the central view area is important, and keeping it clear is less likely to interfere with the game.
In some embodiments, the application interface of the second application may further integrate voice input into the input method: after the mobile terminal 100 converts the user's voice input into text, the second application is operated to send the reply message, which further reduces the interference of the keyboard with the displayed content of the first application.
In other embodiments, when the mobile terminal 100 detects the operation of the finger 301 dragging the notification 402 from the second screen region 309 to the first screen region 308, the content displayed by the mobile terminal 100 may transition from fig. 13A to the application interface 430 shown in fig. 13B; that is, the notification content displayed by the mobile terminal in the first screen region 308 switches to showing the content being input, and the input interface is displayed on the first screen region 308. In this embodiment, the method processes the notification of the second application through the cooperation of the first screen region 308 and the second screen region 309, further simplifies the interaction process and the interaction interface of the second application displayed in the first screen region 308, allows the user to handle the notification of the second application quickly, shortens the time during which the application interface of the first application is affected, and lets the user return as soon as possible to, for example, watching a movie, playing a game, or the video call, improving the user experience.
In some embodiments, after the mobile terminal 100 finishes processing the notification, it continues to display the application interface of the first application in the first screen area and no longer displays the application interface of the second application; for example, after the mobile terminal 100 detects the finger 301 tapping "send", it exits the application interface of the second application, and the display is as shown in fig. 13A.
In some embodiments, when detecting a third touch operation performed by the finger 301 on an area of the first screen area 308 outside the application interface of the second application, the mobile terminal 100 may execute the instruction corresponding to the third touch operation; for example, when the first application is a game application and an enemy approaches in the game while the notification of the second application is being processed, the user can immediately operate the interface of the first application, and the mobile terminal continues to display, in the first screen area, the application interface of the second application related to the notification.
In some embodiments, the mobile terminal 100 may continue to display the application interface of the first application in the first screen area 308 and exit the application interface of the second application according to an operation of the user. The exiting operation may be, but is not limited to: the mobile terminal 100 detecting the finger 301 tapping the icon corresponding to the second application in the second screen area 309; the mobile terminal 100 detecting the finger 301 long-pressing the application interface of the second application; or the mobile terminal detecting the finger 301 double-tapping an area that is within the application interface of the first application but outside the application interface of the second application.
Therefore, when the mobile terminal 100 displays the interface content of the first application, such as a video call, video playback (a movie, television, animation, etc.), or a game, in the first screen area 308 and a notification of the second application arrives, the mobile terminal 100 may display the information related to the notification in the second screen area 309. On the one hand, compared with popping up a prompt interface in the first screen area 308, this does not occupy the first screen area 308 and does not affect its display content. On the other hand, the information related to the notification can be displayed persistently in the second screen area 309 rather than being quickly withdrawn like a pop-up interface, which avoids the notification disappearing before the user has had time to read its content.
When the mobile terminal 100 detects a first operation performed by the user on the notification, the application interface 430 of the second application is displayed over the interface content of the first application while that interface content continues to be displayed in the first screen area 308, and the application interface 430 may process the notification in response to a second operation performed by the user on it, including viewing the content of the notification, processing it, or replying to it. Accordingly, the mobile terminal 100 does not need to jump, in the first screen area 308, from the interface content of the first application to that of the second application; the user can continue viewing the interface content of the first application, while the application interface 430 provides a way to quickly process the notification of the second application.
Therefore, in some specific use scenarios the user experience is greatly improved. For example: while watching a movie, the user can process the notification without pausing the movie, switching to another application, or exiting the player application in the first screen area; during a game, the user can process the notification without leaving the game application, avoiding being judged by the game server as "disconnected" or having "fled"; and during a video call, the user can process the notification without closing the video call interface, so the call with the other party is not affected.
The mobile terminal 100 may detect the user clicking a message or an icon, or pressing the screen or the border, and enter a message input mode. In this mode, part or all of the first screen area is used as an input area; the display of the original application, such as a game, a movie, or reading content, is not blocked, because the message content, input keyboard, handwriting input track, input method, and the like are superimposed on it in a semi-transparent or non-transparent manner, and the input message is obtained and sent.
In some embodiments, when the mobile terminal 100 detects, using an accelerometer, a gravity sensor, or a gyroscope, that the mobile terminal 100 has been rotated, the icons in the second screen area 309 may also rotate, for example as shown in fig. 14A, to present a display orientation suited to the user's viewing angle; a suitable display orientation may likewise be applied to the application interface 430 displayed in the first screen area 308.
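The icon rotation could, for example, be driven by the platform's orientation listener, as in the following sketch; iconContainer and keepIconsUpright are assumed names introduced only for illustration.

```kotlin
import android.content.Context
import android.view.OrientationEventListener
import android.view.View

// Sketch only: iconContainer is assumed to hold the icons shown in the second
// screen area 309; the listener snaps the device orientation reported by the
// motion sensors to the nearest 90-degree step and counter-rotates the icons.
fun keepIconsUpright(context: Context, iconContainer: View): OrientationEventListener {
    val listener = object : OrientationEventListener(context) {
        override fun onOrientationChanged(orientation: Int) {
            if (orientation == OrientationEventListener.ORIENTATION_UNKNOWN) return
            val snapped = ((orientation + 45) / 90 % 4) * 90   // 0, 90, 180 or 270 degrees
            iconContainer.rotation = -snapped.toFloat()        // rotate the icons against the device rotation
        }
    }
    listener.enable()                                          // starts delivering sensor-based updates
    return listener
}
```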
According to another aspect of the present application, a method for displaying a graphical user interface is provided, which includes steps 601 to 602.
In step 601, the mobile terminal 100 starts the front camera. As shown in fig. 16A, the mobile terminal 100 detects an operation of the finger 301 clicking the camera icon 407 and starts the camera function, then detects an operation of the finger 301 clicking the control 425 for switching between the front and rear cameras, and starts the front camera for shooting.
Then, in step 602, the mobile terminal 100 displays the image acquired by the front camera in the first screen region 308 and puts the second screen region 309 into a fill-light mode. The fill-light mode may include raising the display brightness of all or part of the second screen region above a threshold to illuminate the photographed object; for example, as shown in fig. 16B, the display brightness of the second screen region 309, that is, its backlight brightness, is raised to a certain threshold or even to the maximum.
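A minimal sketch of this fill-light mode, assuming the second screen region 309 is backed by an ordinary Android view and the brightness is controlled through the window attributes, is shown below; secondAreaView and enterFillLightMode are illustrative names.

```kotlin
import android.app.Activity
import android.graphics.Color
import android.view.View
import android.view.WindowManager

// Sketch only: secondAreaView is assumed to be the view covering the second screen
// area 309 around the front camera. The window brightness is overridden to its
// maximum and the area is drawn in a plain bright color, so that it behaves like
// the fill light of step 602 (the monochrome mode simply swaps fillColor).
fun enterFillLightMode(activity: Activity, secondAreaView: View, fillColor: Int = Color.WHITE) {
    val params = activity.window.attributes
    params.screenBrightness = WindowManager.LayoutParams.BRIGHTNESS_OVERRIDE_FULL
    activity.window.attributes = params                        // applies the brightness override
    secondAreaView.setBackgroundColor(fillColor)               // light the discontinuous screen area
}
```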
In some embodiments, the mobile terminal 100 may further switch the second screen region 309 to a monochrome mode, for example white, yellow, or blue, to supplement light for the photographed object; this also helps improve the imaging color of the photographed object, improves the imaging effect, and improves the user's photographing experience.
Therefore, in this method, the mobile terminal uses the second screen region 309 as a fill-light bar during front-camera shooting to illuminate the photographed object (such as a face), so that an image with better brightness can be obtained when the ambient light is dim; for example, self-portraits and video calls can be carried out in a dark environment. When the front camera is started for shooting, the mobile terminal 100 lights up this discontinuous screen region and uses it as a fill light in applications such as photographing, makeup, and face recognition, improving the quality of the captured image in low-light environments. The backlight brightness of the discontinuous screen region can be enhanced to obtain a better fill-light effect. Combined with the foregoing embodiments, the display screen thus serves more varied functions, meets the user's needs in more scenarios, and improves the user experience.
In some embodiments, in step 602, when the second screen region 309 enters the fill-light mode, the mobile terminal 100 may supplement light from one or more partial regions of the second screen region 309, as shown in fig. 17A, 17B, and 17C. When the front-camera shooting process enters a depth/3D mode and the finger 301 is detected clicking to take a photo, the front camera takes three photos in succession, using the lit regions 426a, 426b, and 426c as the fill light for the respective shots; the processor 101 of the mobile terminal 100 then synthesizes the three photos and, using the changes in light and shadow together with the dual front cameras, obtains high-precision 3D depth information. Filling light from different angles produces a higher-precision image and further helps improve the imaging effect.
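The sequential lighting of the sub-regions could be organised as in the following sketch; litRegions, takePhoto, and captureWithMovingFillLight are hypothetical names, and the camera capture and photo synthesis are assumed to be handled elsewhere.

```kotlin
import android.graphics.Color
import android.view.View

// Sketch only: litRegions stands for the three sub-views of the second screen area
// corresponding to the lit regions 426a, 426b and 426c, and takePhoto is a
// hypothetical callback that triggers one front-camera capture and invokes its
// continuation when the capture has finished.
fun captureWithMovingFillLight(
    litRegions: List<View>,
    takePhoto: (onDone: () -> Unit) -> Unit,
    onAllCaptured: () -> Unit
) {
    fun shoot(index: Int) {
        if (index == litRegions.size) {
            onAllCaptured()                                    // all three shots taken, ready to synthesize
            return
        }
        litRegions.forEachIndexed { i, region ->               // light exactly one sub-region per shot
            region.setBackgroundColor(if (i == index) Color.WHITE else Color.BLACK)
        }
        takePhoto { shoot(index + 1) }                         // next shot once this capture completes
    }
    shoot(0)
}
```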
When the mobile terminal implements an iris recognition function, it starts the front camera and the iris-scanning camera, displays the eyes of the photographed user in the first screen area 308, and adds infrared light in addition to the visible light displayed in the second screen area 309, providing an infrared fill-light effect for iris recognition or face recognition and improving the iris recognition function.
Therefore, the mobile terminal uses the second screen area to achieve a soft fill-light effect, bringing a better shooting result without affecting the displayed application content. The present application provides a scheme that uses the discontinuous screen area as a fill light: compared with the prior art, the fill-light effect can be achieved in a design with a large screen-to-body ratio without occupying extra space or adding extra holes to the device panel.
In some embodiments, as shown in fig. 18, the mobile terminal 100 may further display a prompt pattern 435 pointing to the position of the front camera 306 in the second screen area 309, so as to improve the shooting effect.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or data center that integrates one or more usable media. The usable medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
In the embodiment of the present application, the device may be divided into functional modules according to the above method examples; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. The division of the modules in the embodiment of the present application is schematic and is only a logical division of functions; there may be other division manners in actual implementation.
It will be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules as needed, that is, the internal structure of the mobile device is divided into different functional modules to perform all or part of the above described functions. For the specific working processes of the system, the mobile device and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
In summary, the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the above embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the above embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (40)

  1. A method of displaying a graphical user interface on a mobile terminal, wherein the mobile terminal comprises a touch screen and a camera, the touch screen comprises a first screen area and a second screen area, the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to around the camera, the method comprising:
    the mobile terminal displays an application interface of a first application in the first screen area;
    when the second application has a notification, the mobile terminal displays the notification in the second screen area;
    when detecting a first touch operation performed on the notification, the mobile terminal continues to display the application interface of the first application in the first screen area, and displays the application interface of the second application related to the notification on the application interface of the first application.
  2. The method of claim 1, wherein the step of the mobile terminal displaying the application interface of the second application related to the notification comprises:
    the mobile terminal displays the application interface of the second application on a partial area of the application interface of the first application; and/or
    the mobile terminal displays the application interface of the second application on the application interface of the first application in a transparent or semi-transparent mode.
  3. The method according to claim 1 or 2, wherein, when the mobile terminal displays an application interface of the second application related to the notification, the method further comprises:
    and when detecting a second touch operation executed on the application interface of the second application, the mobile terminal processes the notification.
  4. The method of claim 3, wherein after the mobile terminal has processed the notification, the method further comprises:
    and the mobile terminal continuously displays the application interface of the first application in the first screen area and does not display the application interface of the second application.
  5. The method according to any one of claims 1 to 4, wherein, when the mobile terminal displays an application interface of the second application related to the notification, the method further comprises:
    when a third touch operation executed on the area outside the application interface of the second application in the first screen area is detected, the mobile terminal executes an instruction corresponding to the third touch operation in the second application, and continues to display the application interface related to the notification in the second application in the first screen area.
  6. The method according to any one of claims 1 to 4, wherein the method further comprises, when the mobile terminal displays an application interface of the second application related to the notification:
    when a fourth touch operation is detected, the mobile terminal continues to display the application interface of the first application in the first screen area and does not display the application interface of the second application, wherein the step of detecting the fourth touch operation by the mobile terminal comprises any one of the following steps:
    the mobile terminal detects touch operation of the notification corresponding to the second application in the second screen area;
    the mobile terminal detects touch operation on an application interface of the second application;
    the mobile terminal detects a touch operation performed on an area in the application interface of the first application and outside the application interface of the second application.
  7. The method of any of claims 1-6, wherein the method further comprises:
    and the mobile terminal displays at least one icon of the second application in the second screen area.
  8. The method of claim 7, wherein the step of the mobile terminal prompting the notification in the second screen area comprises at least any one of:
    the mobile terminal displays the content of the notification in the second screen area;
    the mobile terminal displays the sender information of the notification in the second screen area, wherein the sender information comprises any one item or any several items of sender images, sender identity information and sender sending time;
    and the mobile terminal changes the display mode of the icon of the second application corresponding to the notification in the second screen area.
  9. The method of claim 7 or 8, wherein the method further comprises:
    the mobile terminal displays a control in the second screen area;
    when a third touch operation performed on the control is detected, the mobile terminal hides the at least one icon of the second application and/or the notification of the second application.
  10. The method of any of claims 1-9, wherein the step of the mobile terminal displaying the notification in the second screen area comprises:
    and the mobile terminal displays the content of the notification in a scrolling display mode in the second screen area.
  11. A method of displaying a graphical user interface on a mobile terminal, wherein the mobile terminal comprises a touch screen and a camera, the touch screen comprises a first screen area and a second screen area, the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to around the camera, the method comprising:
    the mobile terminal displays an application interface of a first application in the first screen area, and displays an icon of a second application in the second screen area;
    when a first touch operation performed on the icon of the second application is detected, the mobile terminal displays an application interface of the second application in the first screen area, and displays the icon of the first application in the second screen area.
  12. The method of claim 11, wherein after the mobile terminal displays an application interface of the second application in the first screen area, the method further comprises:
    and when the mobile terminal does not display the application interface of the second application in the second screen area, displaying the icon of the second application in the second screen area.
  13. A method for displaying on a mobile terminal, wherein the mobile terminal comprises a touch screen and a camera, the touch screen comprises a first screen area and a second screen area, the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera, the method comprises:
    the mobile terminal starts the camera to shoot;
    and the mobile terminal displays the image acquired by the camera on the first screen area, and improves the display brightness of all or part of the second screen area to exceed a threshold value so as to supplement light for the shot object.
  14. The method of claim 13, wherein the method further comprises:
    and the mobile terminal displays a preset color in all or at least one local area of the second screen area so as to supplement light for the shot object.
  15. The method of claim 13 or 14, wherein the method further comprises:
    and the mobile terminal also displays a prompt pattern pointing to the position of the camera in a second screen area.
  16. A mobile terminal, wherein the mobile terminal comprises:
    a camera;
    the touch screen comprises a first screen area and a second screen area, wherein the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera;
    one or more processors;
    a memory;
    and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the mobile terminal, cause the mobile terminal to perform the steps of:
    displaying an application interface of a first application in the first screen area;
    when the second application has a notification, displaying the notification in the second screen area;
    when the first touch operation executed on the notification is detected, the application interface of the first application is continuously displayed in the first screen area, and the application interface of the second application related to the notification is displayed on the application interface of the first application.
  17. The mobile terminal of claim 16, wherein in the step of displaying the application interface of the second application related to the notification, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of:
    displaying an application interface of the second application on a partial area of an application interface of the first application; and/or
    displaying the application interface of the second application on the application interface of the first application in a transparent or semi-transparent mode.
  18. The mobile terminal of claim 16 or 17, wherein the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of, while displaying the application interface of the second application relating to the notification:
    and processing the notification when a second touch operation executed on the application interface of the second application is detected.
  19. The mobile terminal of claim 18, wherein the instructions, when executed by the mobile terminal after processing the notification, cause the mobile terminal to further perform the steps of:
    and continuously displaying the application interface of the first application in the first screen area, and not displaying the application interface of the second application.
  20. The mobile terminal of any of claims 16 to 19, wherein when the mobile terminal displays an application interface of the second application that is related to the notification, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of:
    and when a third touch operation executed on the area, outside the application interface, of the second application is detected, executing an instruction, corresponding to the third touch operation, of the second application, and continuously displaying an application interface, related to the notification, of the second application in the first screen area.
  21. The mobile terminal of any of claims 16 to 19, wherein the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of, while displaying the application interface of the second application relating to the notification:
    when a fourth touch operation is detected, continuing to display the application interface of the first application in the first screen area and not displaying the application interface of the second application, wherein,
    a step of detecting a fourth touch operation, wherein the instruction, when executed by the mobile terminal, causes the mobile terminal to execute any one of the following steps:
    detecting a touch operation on a notification corresponding to the second application in the second screen area;
    detecting a touch operation on an application interface of the second application;
    detecting a touch operation performed on an area in the application interface of the first application and outside the application interface of the second application.
  22. The mobile terminal of any of claims 16 to 21, wherein the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of:
    and displaying at least one icon of the second application in the second screen area.
  23. The mobile terminal of claim 22, wherein in the step of prompting the notification in the second screen area, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform at least any of the following:
    displaying the content of the notification in the second screen area;
    displaying sender information of the notification in the second screen area, wherein the sender information comprises any one item or any several items of sender images, sender identity information and sender sending time;
    and changing the display mode of the icon of the second application corresponding to the notification in the second screen area.
  24. The mobile terminal of claim 22 or 23, wherein the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of:
    displaying a control in the second screen area;
    when a third touch operation performed on the control is detected, hiding the at least one icon of the second application and/or the notification of the second application.
  25. The mobile terminal of any of claims 16 to 24, wherein in displaying the notification in the second screen region, the instructions, when executed by the mobile terminal, cause the mobile terminal to perform the steps of:
    and displaying the content of the notification in a scrolling mode in the second screen area.
  26. A mobile terminal, wherein the mobile terminal comprises:
    a camera;
    the touch screen comprises a first screen area and a second screen area, wherein the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera;
    one or more processors;
    a memory;
    and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the mobile terminal, cause the mobile terminal to perform the steps of:
    displaying an application interface of a first application in the first screen area, and displaying an icon of a second application in the second screen area;
    when a first touch operation performed on the icon of the second application is detected, displaying an application interface of the second application in the first screen area, and displaying the icon of the first application in the second screen area.
  27. The mobile terminal of claim 26, wherein after the mobile terminal displays the application interface of the second application in the first screen region, the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of:
    and when the mobile terminal does not display the application interface of the second application in the second screen area, displaying the icon of the second application in the second screen area.
  28. A mobile terminal, wherein the mobile terminal comprises:
    a camera;
    the touch screen comprises a first screen area and a second screen area, wherein the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera;
    one or more processors;
    a memory;
    and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the mobile terminal, cause the mobile terminal to perform the steps of:
    starting the camera to shoot;
    and displaying the image acquired by the camera on the first screen area, and improving the display brightness of all or part of the second screen area to exceed a threshold value so as to supplement light for the shot object.
  29. The mobile terminal of claim 28, wherein the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of:
    and displaying a preset color in all or at least one local area of the second screen area to supplement light for the shot object.
  30. The mobile terminal of claim 28 or 29, wherein the instructions, when executed by the mobile terminal, cause the mobile terminal to further perform the steps of:
    and displaying a prompt pattern pointing to the position of the camera in the second screen area.
  31. A Graphical User Interface (GUI) stored in an electronic device, the electronic device comprising a touch screen, a memory, one or more processors, the touch screen comprising a first screen area and a second screen area, the first screen area being a rectangular area, the second screen area being an area of the touch screen extending from the first screen area to around the camera, the one or more processors being configured to execute one or more computer programs stored in the memory, the GUI comprising:
    a first GUI displayed in the first screen area, the first GUI comprising an application interface of a first application;
    a second GUI displayed in the second screen area in response to detecting a notification to the second application, the second GUI including the notification;
    in response to detecting a first touch operation performed on the notification, continuing to display the first GUI in the first screen area, and displaying a third GUI on the first GUI, the third GUI including an application interface of the second application related to the notification.
  32. The graphical user interface of claim 31, wherein the first GUI further comprises: at least one icon of the second application.
  33. A computer program product, wherein the computer program product, when run on a mobile terminal, causes the mobile terminal to perform the method of any one of claims 1-15.
  34. A computer-readable storage medium comprising instructions, wherein the instructions, when executed on a mobile terminal, cause the mobile terminal to perform the method of any of claims 1-15.
  35. A mobile terminal having a touch screen, the mobile terminal comprising: a display unit, a detection unit and a control unit, wherein,
    the display unit is used for displaying an application interface of a first application in a first screen area of the touch screen;
    the detection unit is used for detecting whether the second application has a notice or not and detecting whether the touch operation on the touch screen exists or not;
    the control unit is used for enabling the display unit to display the notification in a second screen area of the touch screen when the detection unit detects that the second application has the notification, and continuing to display the application interface of the first application in the first screen area and displaying the application interface of the second application related to the notification on the application interface of the first application when the detection unit detects that the first touch operation is executed on the notification.
  36. The mobile terminal of claim 35, wherein the mobile terminal further comprises a camera, the touch screen comprises a first screen area and a second screen area, the first screen area is a rectangular area, and the second screen area is an area of the touch screen extending from the first screen area to the periphery of the camera.
  37. The mobile terminal of claim 35 or 36, wherein the display unit is further configured to display at least one icon of the second application in the second screen area.
  38. The mobile terminal of claim 37, wherein the display unit is configured to display a control in the second screen area; the control unit is further used for enabling the display unit to hide the at least one icon of the second application and/or the notification of the second application when the detection unit detects a third touch operation performed on the control.
  39. A mobile terminal having a touch screen, the mobile terminal comprising: a display unit, a detection unit and a control unit, wherein,
    the display unit is used for displaying an application interface of a first application in a first screen area of the touch screen and displaying an icon of a second application in a second screen area of the touch screen;
    the detection unit is used for detecting whether touch operation is performed on the touch screen;
    the control unit is used for enabling the display unit to display the application interface of the second application in the first screen area and display the icon of the first application in the second screen area when the detection unit detects the first touch operation executed on the icon of the second application.
  40. A mobile terminal having a touch screen, the mobile terminal comprising: a photographing unit, a display unit, and a control unit, wherein,
    the shooting unit is used for shooting images;
    the display unit is used for displaying the image acquired by the camera on a first screen area of the touch screen;
    the control unit is used for controlling the display unit to improve the display brightness of all or part of the second screen area of the touch screen to exceed a threshold value so as to supplement light for the shot object.
CN201780091258.9A 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal Active CN110663016B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211291655.XA CN116055610B (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/091283 WO2019000437A1 (en) 2017-06-30 2017-06-30 Method of displaying graphic user interface and mobile terminal

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202211291655.XA Division CN116055610B (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal

Publications (2)

Publication Number Publication Date
CN110663016A true CN110663016A (en) 2020-01-07
CN110663016B CN110663016B (en) 2024-01-16

Family

ID=64742786

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201780091258.9A Active CN110663016B (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal
CN202211291655.XA Active CN116055610B (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202211291655.XA Active CN116055610B (en) 2017-06-30 2017-06-30 Method for displaying graphical user interface and mobile terminal

Country Status (2)

Country Link
CN (2) CN110663016B (en)
WO (1) WO2019000437A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112291412A (en) * 2020-10-29 2021-01-29 维沃移动通信(杭州)有限公司 Application program control method and device and electronic equipment

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109886000B (en) * 2019-02-01 2024-03-01 维沃移动通信有限公司 Image encryption method and mobile terminal
CN110196667B (en) * 2019-05-30 2021-02-09 维沃移动通信有限公司 Notification message processing method and terminal
CN114816209A (en) * 2019-06-25 2022-07-29 华为技术有限公司 Full screen display method and device of mobile terminal
CN112583957A (en) * 2019-09-30 2021-03-30 华为技术有限公司 Display method of electronic device, electronic device and computer-readable storage medium
CN112714236A (en) * 2019-10-24 2021-04-27 中兴通讯股份有限公司 Terminal, shooting method, storage medium and electronic device
CN112162807A (en) * 2020-09-24 2021-01-01 维沃移动通信有限公司 Function execution method and device
CN112380014A (en) * 2020-11-17 2021-02-19 莫雪华 System resource allocation system and method based on big data
CN115695599A (en) * 2021-07-28 2023-02-03 华为技术有限公司 Method for prompting camera state and electronic equipment
CN116088716B (en) * 2022-06-13 2023-12-08 荣耀终端有限公司 Window management method and terminal equipment
CN115150550A (en) * 2022-06-20 2022-10-04 湖北星纪时代科技有限公司 Photographing processing method and device for terminal, electronic equipment and storage medium

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120204191A1 (en) * 2011-02-07 2012-08-09 Megan Shia System and method for providing notifications on a mobile computing device
CN103500079A (en) * 2013-09-17 2014-01-08 小米科技有限责任公司 Notification message display method and device and electronic equipment
CN104239027A (en) * 2013-06-07 2014-12-24 北京三星通信技术研究有限公司 Method and device for calling programs and terminal device
CN104267950A (en) * 2014-09-25 2015-01-07 北京金山安全软件有限公司 Setting method and device of terminal application program and mobile terminal
CN105162959A (en) * 2015-08-04 2015-12-16 广东欧珀移动通信有限公司 Message notification processing method and device
CN105335056A (en) * 2015-12-10 2016-02-17 魅族科技(中国)有限公司 Application control method and device and terminal
CN105955573A (en) * 2016-04-27 2016-09-21 上海斐讯数据通信技术有限公司 Mobile terminal application switching method and system
CN106231187A (en) * 2016-07-29 2016-12-14 维沃移动通信有限公司 A kind of method shooting image and mobile terminal
CN106302095A (en) * 2015-06-04 2017-01-04 深圳市腾讯计算机系统有限公司 A kind of message display control method, device and terminal
CN106547446A (en) * 2016-10-31 2017-03-29 努比亚技术有限公司 Using switching device and method
CN106657485A (en) * 2017-03-07 2017-05-10 广东欧珀移动通信有限公司 Mobile terminal
US20170149959A1 (en) * 2013-05-15 2017-05-25 Lg Electronics Inc. Mobile terminal and controlling method thereof
CN106843654A (en) * 2017-01-24 2017-06-13 维沃移动通信有限公司 The method and mobile terminal of a kind of terminal multi-job operation

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9619143B2 (en) * 2008-01-06 2017-04-11 Apple Inc. Device, method, and graphical user interface for viewing application launch icons
KR20120126161A (en) * 2011-05-11 2012-11-21 삼성전자주식회사 Mobile terminal and method for controlling screen using the same
CN104750381A (en) * 2013-12-31 2015-07-01 中兴通讯股份有限公司 Method, device and terminal for operating application items quickly
US9887949B2 (en) * 2014-05-31 2018-02-06 Apple Inc. Displaying interactive notifications on touch sensitive devices
CN105511719A (en) * 2015-11-27 2016-04-20 努比亚技术有限公司 Notification information display method and device


Also Published As

Publication number Publication date
WO2019000437A1 (en) 2019-01-03
CN116055610B (en) 2023-12-08
CN116055610A (en) 2023-05-02
CN110663016B (en) 2024-01-16

Similar Documents

Publication Publication Date Title
CN116055610B (en) Method for displaying graphical user interface and mobile terminal
US11537268B2 (en) Electronic device comprising multiple displays and method for operating same
US10209877B2 (en) Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor
US10867059B2 (en) Device, method, and graphical user interface for accessing an application in a locked device
CN108701001B (en) Method for displaying graphical user interface and electronic equipment
US9952681B2 (en) Method and device for switching tasks using fingerprint information
CN105955607B (en) Content sharing method and device
CN108776568B (en) Webpage display method, device, terminal and storage medium
KR101749235B1 (en) Device, method, and graphical user interface for managing concurrently open software applications
JP5658765B2 (en) Apparatus and method having multiple application display modes, including a mode with display resolution of another apparatus
US9652116B2 (en) Mobile terminal and method of controlling the same
US20130326415A1 (en) Mobile terminal and control method thereof
US20110179372A1 (en) Automatic Keyboard Layout Determination
CN112214138B (en) Method for displaying graphical user interface based on gestures and electronic equipment
KR20120123744A (en) Api to replace a keyboard with custom controls
US20130104032A1 (en) Mobile terminal and method of controlling the same
CN113419660A (en) Video resource processing method and device, electronic equipment and storage medium
WO2023072233A1 (en) Page switching method, page switching apparatus, electronic device and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant