WO2018000382A1 - Graphical user interface, method, and terminal for viewing applications (一种查看应用程序的图形用户界面、方法及终端) - Google Patents


Info

Publication number
WO2018000382A1
WO2018000382A1 (application PCT/CN2016/088015)
Authority
WO
WIPO (PCT)
Prior art keywords
input
application
reachable
display area
location
Prior art date
Application number
PCT/CN2016/088015
Other languages
English (en)
French (fr)
Inventor
Liang Yu (余亮)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to PCT/CN2016/088015 (WO2018000382A1)
Priority to US16/313,796 (US11314388B2)
Priority to CN201680086699.5A (CN109313531A)
Publication of WO2018000382A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0483 Interaction with page-structured environments, e.g. book metaphor
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G06F9/46 Multiprogramming arrangements
    • G06F9/48 Program initiating; Program switching, e.g. by interrupt

Definitions

  • the present invention relates to the field of human-computer interaction technologies, and in particular, to a graphical user interface, method, and terminal for viewing an application.
  • On a terminal device such as a mobile phone, multiple applications may run at the same time. Because the screen of the terminal device is limited, the opened applications are usually displayed in a layered manner: most commonly, the screen shows only the top-level application, while the other running applications are occluded by the top-most application and are not visible to the user.
  • To view another application running in the system, the user needs to trigger the display of a list of running applications by a specific method, such as touching the home button; thumbnails of the applications running in the system are arranged sequentially in this list. The user then has to find the application he wants to view in the list and select it, so that this application is brought to the top layer and displayed on the screen.
  • The embodiments of the invention provide a graphical user interface, a method, and a terminal for viewing an application, which simplify the user operation of viewing an occluded application among cascaded applications.
  • According to a first aspect, a method for viewing an application includes: receiving a first input; obtaining an input location corresponding to the first input and non-position information corresponding to the first input at the input location; determining, according to the input location and the non-position information, the applications that the first input reaches at the input location; and finally displaying the applications that the first input reaches at the input location.
  • the first input is used to view a plurality of running applications displayed in a cascade.
  • the plurality of running applications are cascaded and displayed in the first display area.
  • the first input may be a touch operation detected by the touch screen.
  • the non-position information corresponding to the first input may be the touch pressure detected by a pressure sensor disposed under the touch screen, or the touch area collected by the touch screen, or the touch duration recorded by a timer, and so on; this is not limited here.
  • the first input may also be a gesture operation detected by a gesture sensor.
  • the non-position information corresponding to the first input may be a gesture depth.
  • the first input may also be other types of input operations, and no limitation is made herein.
  • Implementing the method described in the first aspect simplifies the user operation of viewing an occluded application among cascaded applications, and allows the output content of multiple cascaded applications to be displayed at the same time, so that the cascaded applications are easier for the user to view.
  • Specifically, the applications reachable by the first input at the input location can be determined in two steps: first, determine, based on the input location, which applications are cascaded at that location; then, from the cascaded applications at the input location, determine, based on the non-location information, the applications that the first input reaches there.
  • More specifically, the applications that the first input reaches at the input location may be determined from the non-location information as follows: from the numerical value Q corresponding to the non-location information and the logical distance D between adjacent layers of applications, calculate the number N of reachable applications; the layer-1 through layer-N applications in the cascade at the input location are then the applications that the first input reaches at the input location.
  • the function f can also be a non-linear function. That is to say, the number of applications that are reachable by the first input does not exhibit a linear change relationship with the touch pressure.
  • Q_limit denotes the largest numerical value of the non-position information that the system can recognize, such as the upper limit of the touch pressure.
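  • As a minimal illustrative sketch (not part of the claimed embodiments; the symbols Q, D, and Q_limit follow the description above, but the particular linear choice of f and all numeric values are assumptions), the count N of reachable applications might be computed as follows:

```python
def reachable_count(q, d, q_limit, total_layers):
    """Number N of applications the first input reaches.

    q            -- value of the non-position information (e.g. touch pressure)
    d            -- logical distance D between adjacent layers of applications
    q_limit      -- largest value of the non-position information the system recognizes
    total_layers -- how many applications are cascaded at the input location

    Uses the linear choice N = f(Q/D) = floor(Q/D) + 1, so even a light
    touch reaches the top-layer application; q is clipped at q_limit.
    """
    q = min(q, q_limit)
    n = int(q // d) + 1            # linear f; the text also allows non-linear f
    return min(n, total_layers)    # cannot reach below the bottom layer
```

  • With d = 100 and q_limit = 500 (arbitrary units), a pressure of 150 would reach two layers, and any pressure of 200 or more would reach the third layer of a three-layer cascade.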
  • the following describes how to display an application that the first input is reachable at the input location in the first display area.
  • In one approach, the reachable applications can be displayed nested around the input location as the center. Specifically, the reachable applications are respectively displayed in the N sub-display areas contained in the first display area, where each sub-display area displays the output content of one reachable application, and the sub-display area corresponding to the reachable layer-i application is nested around the periphery of the sub-display area corresponding to the layer-(i+1) application, where i ≤ N and N is the number of applications that the first input reaches at the input location.
  • Creating a display area for each reachable application centered on the input location may cover the following cases: if the sub-display area is a rectangle, the center may be the intersection of the rectangle's two diagonals; if the sub-display area is a circle or an ellipse, the center may be the center of the circle or ellipse; if the sub-display area is a sector, the center may be the vertex of the sector.
  • The strategy for determining the "center" is not limited by the embodiments of the present invention.
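  • As an illustrative sketch of the rectangular case (the base size and the shrink factor are assumptions, not values from the description), the nested sub-display areas centered on the input location could be laid out as follows:

```python
def nested_sub_areas(cx, cy, n, base_w=300, base_h=200, shrink=0.6):
    """Hypothetical layout of sub-display areas for n reachable layers.

    Layer 1 keeps the whole first display area; each deeper layer i
    (2 <= i <= n) gets a rectangle centred on the input location
    (cx, cy), smaller than the one for layer i-1, so the layer-(i-1)
    area is nested around the periphery of the layer-i area.
    Returns (layer, x, y, width, height) tuples, top-left corner first.
    """
    areas = []
    w, h = base_w, base_h
    for layer in range(2, n + 1):
        areas.append((layer, cx - w / 2, cy - h / 2, w, h))
        w *= shrink
        h *= shrink
    return areas
```

  • With n = 3 and the default sizes, this yields a 300x200 window for the layer-2 application and a 180x120 window for the layer-3 application, both centred on the press point.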
  • the system refreshes the content of each reachable application correspondingly in the corresponding display area in real time.
  • Alternatively, only the underlying application, that is, the deepest application that the first input reaches at the coordinates, may be displayed in the first display area. In this way, the user can view the complete output content of the reachable underlying application at one time.
  • If the only application that the first input reaches at its coordinates is the application currently displayed on the top layer of the first display area, such as Facebook, then Facebook is both the top-level application and the reachable underlying application, and only Facebook may be displayed in the first display area.
  • If the applications that the first input reaches at its coordinates are assumed to include, from top to bottom, Facebook and Google Maps, then Google Maps is the reachable underlying application, and Google Maps is displayed in the first display area.
  • The first input is changeable in real time, in both the input position and the non-position information, as detailed below:
  • If the input position changes, the applications that the first input reaches at the new input location can be determined and then displayed in the first display area, nested around the new input location.
  • If the value of the non-position information increases, a newly reachable application may be determined at the input location, and a new sub-display area is created inside the display area of the currently reachable underlying application to display the newly added application. If the value decreases, the application that is no longer reachable is determined at the input location, and its display is cancelled: the display area corresponding to the no-longer-reachable deep application is removed from within the display area of the application one layer above it.
  • the input location and the non-location information corresponding to the first input may also change simultaneously.
  • The first input may also be released. Within a specified delay time after the release, such as 2 seconds, the applications that were reachable remain displayed in the first display area.
  • During the delay time, the user may apply a further operation to the first display area; this embodiment of the invention refers to that operation as a second input.
  • The display state, in the first display area, of the applications that the first input reached at the input location may then be adjusted according to the operation type of the second input; for example, the display of a certain reachable application may be cancelled.
  • Optionally, a corresponding application icon may be set in the display area of each reachable application: for example, the Google Maps icon is set in the upper right corner of the display area corresponding to Google Maps, and the album icon is set in the upper right corner of the album's sub-display area.
  • the size of the corresponding display area of each reachable application may be a fixed value.
  • Alternatively, the size of the display area corresponding to each reachable application may be related to the value of the non-position information corresponding to the first input, such as the touch pressure; for example, the greater the touch pressure, the larger the display area.
  • a graphical user interface is provided on a terminal device, the terminal device having a touch screen, a memory, and one or more processors for executing one or more programs stored in the memory,
  • the graphical user interface includes: a first display area of the touch screen, and a plurality of running applications stacked in the first display area; wherein:
  • the reachable applications are respectively displayed in N sub-display areas contained in the first display area, where each sub-display area displays the output content of one reachable application, and the sub-display area corresponding to the reachable layer-i application is nested around the periphery of the sub-display area corresponding to the layer-(i+1) application, where i ≤ N and N is the number of applications that the first input reaches at the input location.
  • only the underlying application that the first input is reachable at the input location is displayed in the first display area.
  • the first input may be changed in real time or may be released.
  • the graphical user interface further includes: displaying the first in the first display area in response to detecting the change of the input position corresponding to the first input Enter the application that is reachable at the new input location.
  • the graphical user interface further comprises: in response to a detected increase in the value of the non-position information corresponding to the first input at the input location, nesting the newly reachable application inside the display area corresponding to the currently reachable underlying application.
  • the graphical user interface further comprises: in response to a detected decrease in the value of the non-position information corresponding to the first input at the input location, cancelling the display of the no-longer-reachable application within the display area corresponding to the application one layer above it.
  • the graphical user interface further comprises: in response to the detected release of the first input, keeping the reachable applications displayed in the first display area during a specified delay time.
  • the graphical user interface further includes: in response to a second input detected in the first display area within the specified delay time, adjusting the display state of the reachable applications in the first display area.
  • a terminal including: a touch screen, a processor, wherein:
  • the touch screen is configured to display a plurality of running applications in a cascade, and detect a first input for the application;
  • the processor is configured to acquire an input location corresponding to the first input, and non-position information corresponding to the first input at the input location;
  • the processor is configured to determine, according to the input location and the non-position information, an application that is reachable by the first input at the input location, and instruct the touch screen to display the reachable application;
  • the touch screen is configured to display an application that the first input is reachable at the input location.
  • the non-positional information is touch pressure.
  • the terminal further includes: a pressure sensor disposed under the touch screen.
  • the pressure sensor is configured to detect a touch pressure of the first input at the input position.
  • the processor acquires a touch pressure of the first input at the input position by using the pressure sensor.
  • the non-location information is a touch duration.
  • the terminal further includes: a timer.
  • the timer is configured to detect a touch duration of the first input at the input location.
  • the processor acquires a touch duration of the first input at the input position by using the timer.
  • the non-location information is a touch area.
  • the touch screen is configured to detect a touch area of the first input at the input position, and the processor acquires the touch area through the touch screen.
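  • As an illustrative sketch (the data structure and field names are assumptions; the claims only require that pressure, duration, or touch area can serve as the non-position information), the terminal's choice among the three sensor sources might look like:

```python
from dataclasses import dataclass

@dataclass
class FirstInput:
    """One detected first input; field names are illustrative only."""
    x: float
    y: float
    pressure: float = 0.0   # from the pressure sensor under the touch screen
    duration: float = 0.0   # from the timer
    area: float = 0.0       # from the touch screen itself

def non_position_value(inp, kind):
    """Return the non-position information the terminal is configured
    to use; `kind` is a hypothetical configuration switch selecting
    among the three sources the claims allow."""
    return {"pressure": inp.pressure,
            "duration": inp.duration,
            "area": inp.area}[kind]
```

  • A terminal equipped only with a timer, for example, would be configured with kind = "duration" and would never consult the pressure field.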
  • a terminal comprising a functional unit for performing the method of the above first aspect.
  • A fifth aspect provides a readable non-volatile storage medium storing computer instructions that are executed by a terminal device having a touch screen to implement the method described in the first aspect above.
  • FIGS. 1A-1C are schematic diagrams of a stacked application program according to an embodiment of the present invention.
  • 2-9 are schematic diagrams of user interfaces provided by an embodiment of the present invention and implementations of operations for viewing a stacked application in the user interface;
  • FIG. 10 is a schematic structural diagram of a terminal according to an embodiment of the present disclosure.
  • FIG. 11 is a schematic flowchart of a terminal processing user operation of the terminal of FIG. 10 according to an embodiment of the present invention.
  • FIG. 12 is a schematic flowchart of a method for viewing an application according to an embodiment of the present invention.
  • FIG. 13 is a functional block diagram of a terminal according to an embodiment of the present invention.
  • Stacked applications: to facilitate the understanding of the embodiments of the present invention, the application scenario involved in the embodiments is first described, namely stacked (cascading) applications.
  • FIGS. 1A-1C are schematic diagrams of a cascading application program according to an embodiment of the present invention.
  • The running applications may include: a social application, such as Facebook; an image management application, such as an album; a map application, such as Google Maps; and a browser, for example Safari or Google Chrome.
  • the screen 120 of the terminal 100 can only display the content of one application completely, such as Facebook.
  • The embodiment of the present invention refers to this application as the top-level application, that is, the first-tier application.
  • the remaining running applications are occluded by the top-level application, cascading below the top-level application, making it inconvenient for users to view.
  • terminal 100 maintains a data structure, such as a stack, to store a stacked structure of running applications with each other.
  • the embodiment of the present invention refers to such an application that presents a layered structure with each other as a cascading application.
  • Each application has a direct upper-layer application (except the top-level application) and a direct lower-layer application (except the underlying application), and each application is occluded by its direct upper-layer application.
  • the cascading application may exhibit two types of cascading forms: the first type, the upper layer application completely occludes the lower layer application, as shown in FIG. 1A; the second type, the upper layer application partially occludes the lower layer application, as shown in FIG. 1B.
  • the situation illustrated in FIG. 1B is more common in embodiments in which terminal 100 has a larger screen 120, such as terminal 100 being a tablet.
  • the Z axis represents a coordinate axis perpendicular to the screen, and the applications running in the terminal 100 simultaneously have a stacked structure along the Z axis in the stacking order.
  • Google Maps and the photo album are located on the second and third layers respectively, and are occluded by the top-level application Facebook, which makes them inconvenient for users to view.
  • the stacking order between stacked applications can be related to the timing at which the application is activated. For example, the last application that is activated is usually at the top level.
  • the activation of the application may be triggered by an action of opening the application, or actively operating the application, or by an internal event of the application, such as a report completion event. It should be noted that other strategies may be used to determine the stacking order between the cascading applications, which is not limited by the embodiment of the present invention.
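  • The stacking-order maintenance described above can be sketched as follows (a minimal, assumption-laden model: the stack is a plain top-to-bottom list, and activation simply moves an application to the top):

```python
def activate(stack, app):
    """Maintain the cascading order of running applications.

    `stack` is the top-to-bottom list the terminal keeps (index 0 is
    the top-level application).  Activating an application, whether by
    opening it, operating on it, or via an internal event, moves it to
    the top, matching the rule that the last-activated application is
    usually the top-level one.
    """
    if app in stack:
        stack.remove(app)
    stack.insert(0, app)
    return stack
```

  • Starting from the cascade of FIGS. 1A-1C, ["Facebook", "Google Maps", "Album"], activating the album brings it to the top and pushes Facebook down to the second layer.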
  • FIGS. 1A-1C are merely examples of the cascading application program according to the embodiment of the present invention, and should not be construed as limiting. In practical applications, the number of stacked applications may be more or less than the three shown in Figures 1A-1C, and the stacked application and stacking order are not limited by Figures 1A-1C.
  • UI for viewing a cascading application
  • User interface and user operation
  • FIG. 2 is a schematic diagram of user operations for viewing a cascading application according to an embodiment of the present invention.
  • the user operation for viewing the cascading application is referred to as a first input.
  • the first input may be a touch operation.
  • the first input typically acts at a location in the user interface where no response action is defined.
  • the position where the response action is defined refers to a touch position capable of triggering a predefined function by a touch operation, such as a touch position of a shortcut of an application, a touch position of a control such as a button.
  • When the system detects the first input, it can obtain two pieces of information: coordinate information and non-coordinate information.
  • The coordinates are the input position corresponding to the first input, and specifically indicate where in the cascaded applications the user wants to view output content; the non-coordinate information is the non-position information corresponding to the first input at the input position, and may be information such as the touch pressure, the touch duration, or the touch area.
  • Based on the detected first input, the system can display, in the user interface, the output content of one or more of the cascaded applications in the vicinity of the coordinates.
  • Together, the coordinates and the non-coordinate information determine which of the stacked applications can be displayed in the user interface: the applications that the first input reaches at the coordinates.
  • the first input may be similar to a "probe”, and the touch pressure represents a depth of contact of the "probe” vertically downward.
  • an application whose first input is reachable at the coordinates can be similarly viewed as an application that the "probe” can reach down at the coordinates. It should be understood that the greater the touch pressure, the greater the downward touch depth of the "probe", and the more applications the first input can reach at the coordinates.
  • the first layer application that the first input is reachable at the coordinates is Facebook, and if the touch pressure corresponding to the first input is sufficiently large, the first input is in the Applications that are reachable at coordinates may further include Google Maps, or, further, may include Google Maps and photo albums.
  • In other words, the applications that the first input reaches at the coordinates determine the applications displayed in the user interface.
  • the following describes how to determine the number of applications that the first input is reachable at the coordinates.
  • the logical distance between adjacent two-tier applications is assumed to be D.
  • the logical distance is used to measure the number of applications that the first input is reachable at the coordinates.
  • the function f(Q/D) may be used to represent the number N of applications that the first input is reachable at the coordinates.
  • Q represents a numerical value of the non-coordinate information corresponding to the first input, such as a touch pressure.
  • the function f can be a linear function.
  • the function f may also be a linear function of other forms, which is not limited herein.
  • the function f can also be a non-linear function. That is to say, the number of applications that are reachable by the first input does not exhibit a linear change relationship with the touch pressure.
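  • As one possible non-linear choice of f (purely illustrative; the description does not fix a particular non-linear form), a logarithmic f makes each additional layer require progressively more pressure:

```python
import math

def reachable_count_nonlinear(q, d, total_layers):
    """One possible non-linear f: N = floor(log2(Q/D + 1)) + 1.

    Each additional layer then needs roughly double the pressure of the
    previous one, so the number of reachable applications no longer
    changes linearly with the touch pressure.
    """
    n = int(math.log2(q / d + 1)) + 1
    return max(1, min(n, total_layers))
```

  • With d = 100, pressures of 0, 100, and 300 reach one, two, and three layers respectively, while a hypothetical fourth layer would require a pressure of 700.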
  • the first input may change in real time.
  • the system may sample the coordinate and non-coordinate information corresponding to the first input at intervals, such as 10 microseconds. If the sampling data of the sampling time T and the sampling data of the sampling time T+1 are different, and the difference exceeds a sufficiently significant threshold, the first input is considered to have changed; otherwise, the first input is considered to have not changed.
  • the significant threshold may be determined according to the system configuration of the terminal device, etc., which is not limited by the embodiment of the present invention.
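  • The sampling-and-threshold comparison described above can be sketched as follows (the sample layout is an assumption; each sample here bundles the coordinates and the non-coordinate value):

```python
def input_changed(sample_t, sample_t1, threshold):
    """Compare two consecutive samples of the first input.

    Each sample is an (x, y, q) tuple: the coordinates plus the
    non-coordinate value q.  The input counts as changed only when some
    component differs by more than the significance threshold, so small
    sensor jitter is ignored; the threshold itself is device-dependent.
    """
    return any(abs(a - b) > threshold
               for a, b in zip(sample_t, sample_t1))
```

  • Sampled at each interval, a drift in pressure from 5 to 5.1 under a threshold of 0.5 is ignored, whereas a move from (20, 20) to (30, 30) registers as a change of viewing position.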
  • the real-time change of the first input may be divided into the following two situations or a combination of the following two situations:
  • the coordinates of the first input change, and the change can be used to change the viewing position.
  • the coordinates of the first input of the sampling time T are (20, 20)
  • the coordinates of the first input of the sampling time T+1 are (30, 30).
  • That is, the user switches from viewing the output content of the application at coordinates (20, 20) to viewing the output content of the application at coordinates (30, 30).
  • the non-coordinate information of the first input changes, and the change can be used to change the number of applications that the user can view. For example, as shown in FIG. 2, if the touch pressure of the first input increases, the number of applications that the user can view at the coordinates (X, Y) increases.
  • If no first input is detected at the sampling instant T+1, the first input is considered to have been released at that instant.
  • If the first input is a touch operation, the release of the first input may mean that the touch object, for example a finger, leaves the touch screen of the terminal.
  • the first input may also be a gesture operation.
  • the non-coordinate information corresponding to the first input may be a gesture depth.
  • In that case, the release of the first input may mean that the gesturing object, such as a finger or an arm, leaves the sensing range of the terminal's gesture sensor.
  • the first input may also be other types of input operations, and no limitation is made herein.
  • 3A-3D and 4A-4B illustrate some embodiments in which a user implements the first input in a user interface.
  • In these embodiments, the screen 120 is a touch screen, the first input is exemplified by a touch operation, and the non-coordinate information of the first input is exemplified by the touch pressure.
  • 3A-3D are user interface embodiments for implementing the first input for a first stacked form of cascading application.
  • the user interface can include a first display area 10 of the screen 120, and an application stacked in the first display area 10.
  • the applications stacked in the first display area 10 may include Facebook, Google Maps, and photo albums from top to bottom as shown in FIGS. 1A-1C.
  • the first display area 10 can occupy the entire screen 120 and can also occupy part of the screen 120, such as a split screen application.
  • Suppose the system detects a first input whose coordinates are (X1, Y1) and whose corresponding touch pressure is Q1. The system can determine, according to the coordinates (X1, Y1) and the touch pressure Q1, that the applications the first input reaches at (X1, Y1) are: Facebook and Google Maps.
  • Facebook is the first-layer application that the first input reaches at the coordinates.
  • A reachable deep application is defined relative to the reachable first-tier application, and refers to an underlying application, occluded by the reachable first-tier application, that the first input also reaches.
  • a new display area is created centering on the coordinates (X1, Y1) of the first input: a display area 20 for display
  • the first input is a second layer application that is reachable at coordinates (X1, Y1), namely Google Maps.
  • the first display area 10 is divided into two sub-display areas: a first sub-display area and a second sub-display area, with the first sub-display area nested on the periphery of the second sub-display area; the first sub-display area is used to display the reachable first-layer application, that is, Facebook, which correspondingly outputs its content in the first sub-display area.
  • the second sub-display area is used to display the reachable second-tier application, that is, the Google map, and correspondingly output the content in the second sub-display area.
  • the second sub display area is the newly created display area 20.
  • the coordinates of the first input are unchanged, and are still (X1, Y1).
  • the touch pressure corresponding to the first input increases to Q2, and the applications that the first input is reachable at the coordinates (X1, Y1) increase to: Facebook, Google Maps, and the photo album.
  • a new display area is further created in the display area 20 centering on the coordinates (X1, Y1): a display area 30 for displaying the third-layer application that the first input is reachable at the coordinates (X1, Y1), namely the album.
  • the first display area 10 is further divided into three sub-display areas: a first sub-display area, a second sub-display area, and a third sub-display area.
  • the second sub display area is nested on the periphery of the third sub display area
  • the first sub-display area is nested on the periphery of the second sub-display area, wherein the first sub-display area is used to display the reachable first-layer application, that is, Facebook, which correspondingly outputs its content in the first sub-display area.
  • the second sub-display area is used to display the reachable second-tier application, that is, the Google map, and correspondingly output the content in the second sub-display area.
  • the third sub display area is for displaying a reachable third layer application, that is, an album, and correspondingly outputting the content in the third sub display area.
  • the third sub display area is the newly created display area 30.
  • if the first input further reaches a lower-layer application at the coordinates (X1, Y1), a new display area is created in the display area 30 corresponding to the reachable third-layer application (that is, the album), and the reachable fourth-tier application is displayed in the newly created display area.
  • all of the reachable deep applications are nested and displayed in the first display area 10 in a stacked order. That is to say, the first display area 10 is divided into more sub-display areas for displaying the reachable applications.
  • the embodiment of the present invention may refer to the method for displaying the reachable application described in the above process as a nested display method.
  • the nested display method can be summarized as follows: the reachable applications are respectively displayed in the N sub-display areas included in the first display area, where one sub-display area displays the content that one reachable application correspondingly outputs in that sub-display area, and the sub-display area corresponding to the reachable i-th layer application is nested on the periphery of the sub-display area corresponding to the (i+1)-th layer application; here i < N, i is a positive integer, and N is the number of applications that the first input is reachable at the input location.
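  • as an illustrative sketch only (not part of the claimed embodiment), the nested layout described above can be computed as a list of rectangles, one per reachable layer, centered on the input coordinates; the per-layer shrink factor is an assumption for illustration:

```python
def nested_areas(touch_x, touch_y, outer_w, outer_h, n_layers, shrink=0.6):
    """Return one rectangle (left, top, right, bottom) per reachable layer.

    Layer 1 (the reachable first-layer application) keeps the whole first
    display area; each deeper layer gets a smaller rectangle centered on the
    touch point, so the sub-display area of layer i is nested on the
    periphery of the sub-display area of layer i+1. The shrink factor is an
    illustrative assumption, not taken from the embodiment.
    """
    areas = [(0, 0, outer_w, outer_h)]  # layer 1: the whole first display area
    w, h = float(outer_w), float(outer_h)
    for _ in range(1, n_layers):
        w, h = w * shrink, h * shrink
        # center the smaller rectangle on the touch coordinates, clamped so
        # it stays inside the first display area
        left = min(max(touch_x - w / 2, 0), outer_w - w)
        top = min(max(touch_y - h / 2, 0), outer_h - h)
        areas.append((left, top, left + w, top + h))
    return areas
```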
  • the content of the reachable application correspondingly outputted in a display area refers to: the reachable application outputs the content in the coordinate range of the display area.
  • the coordinate range covered by the second sub-display area in FIG. 3B is [(20, 20), (120, 100)], where (20, 20) represents the coordinates of the upper left corner of the second sub-display area and (120, 100) represents the coordinates of its lower right corner.
  • the content correspondingly output by the Google map in the second sub-display area refers to the output content of the Google map in the rectangular area defined by the coordinate range [(20, 20), (120, 100)].
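  • for illustration only (not part of the embodiment), deciding which application's content is visible at a given pixel amounts to finding the innermost sub-display area that contains it; a minimal sketch using the coordinate-range convention above, with a 200 x 160 first display area as an assumed size:

```python
def visible_layer(px, py, areas):
    """areas[i] holds (left, top, right, bottom) of the sub-display area of
    layer i+1, with each later rectangle nested inside the previous one.
    The innermost rectangle containing the pixel decides whose output shows.
    """
    layer = None
    for i, (l, t, r, b) in enumerate(areas, start=1):
        if l <= px <= r and t <= py <= b:
            layer = i  # a deeper nested area overrides the outer one
    return layer

# with the coordinate range from the text: the second sub-display area
# covers [(20, 20), (120, 100)] inside an assumed 200 x 160 first display area
areas = [(0, 0, 200, 160), (20, 20, 120, 100)]
```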
  • the examples are merely illustrative of the embodiments of the invention and should not be construed as limiting.
  • the system refreshes, in real time, the content that each reachable deep application correspondingly outputs in its display area. For example, as shown in FIG. 3C, if the photo album is playing a video, the content displayed in the third sub-display area is synchronized with the video playback.
  • FIGS. 4A-4B are user interface embodiments for implementing the first input for a cascading application in a second stacked form.
  • in the second stacked form, the upper-layer application partially occludes the lower-layer application, as shown in FIG. 4A.
  • the user interface can include a first display area 10 of the screen 120, and an application stacked in the first display area 10.
  • the applications stacked in the first display area 10 may include Facebook, Google Maps, and photo albums from top to bottom as shown in FIGS. 1A-1C.
  • the first display area 10 can occupy the entire screen 120 and can also occupy part of the screen 120, such as a split screen application.
  • the system detects the first input.
  • the coordinate of the first input is (X2, Y2), and the touch pressure corresponding to the first input is Q1.
  • the system can determine, according to the coordinates (X2, Y2) and the touch pressure Q1, that the application that the first input is reachable at the coordinates (X2, Y2) is: Google map and album.
  • the Google map is the first layer application that the first input is reachable at the coordinates (X2, Y2).
  • the user interface changes as follows: in the first display area 10, the deep application that the first input is reachable at the coordinates (X2, Y2) is displayed, i.e., the album. Specifically, in the first display area 10, a new display area is created centering on the coordinates (X2, Y2): the display area 20, and the album is displayed in the display area 20.
  • the lower layer application can be further displayed in the display area 20.
  • for the specific process of nesting the display of the reachable application, refer to the content in the embodiment of FIGS. 3A-3D, which is not described here again.
  • FIGS. 5A-5B are embodiments in which the first input changes in real time.
  • the user interface can include a first display area 10 of the screen 120, and an application stacked in the first display area 10.
  • the applications stacked in the first display area 10 may include Facebook, Google Maps, and photo albums from top to bottom as shown in FIGS. 1A-1C.
  • the first display area 10 can occupy the entire screen 120 and can also occupy part of the screen 120, such as a split screen application.
  • the system detects a change in the coordinates of the first input: from (X1, Y1) to (X2, Y2).
  • the deep application that the first input is reachable at coordinates (X2, Y2) is re-determined and the deep application to which the first input is reachable at coordinates (X2, Y2) is displayed.
  • the system detects that the touch pressure of the first input changes.
  • the application displayed in the first display area 10 can be adjusted in real time in response to changes in the touch pressure.
  • when the reachable applications increase, a new sub-display area is created in the display area of the current lowest-level application to display the newly added application.
  • the newly added deep application is a photo album.
  • a new sub-display area for displaying the album is created in the display area (second sub-display area) corresponding to Google Maps, the upper-layer application of the album.
  • when the reachable deep applications decrease, the display area corresponding to the reduced reachable deep application (the third sub-display area) is deleted from the display area of its upper-layer application (Google Maps).
  • the reduced reachable deep application is an album.
  • the display area (third sub-display area) corresponding to the album is deleted from the display area (second sub-display area) corresponding to Google Maps, the upper-layer application of the album.
  • the system may detect that the coordinates of the first input and the touch pressure change simultaneously. In response to simultaneous changes in coordinates and touch pressure, the location of the reachable application is displayed in real time and the application displayed in the first display area 10 is adjusted in real time.
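  • the real-time adjustment described above can be sketched as a per-sample recomputation; this is an illustrative sketch, with `stack_at` and the logical distance `d` as assumed inputs, and the count N = Q/D following the counting rule given later in this description:

```python
def update_display(stack_at, coords, pressure, d, shown):
    """Recompute the reachable applications for the current sample of the
    first input and diff against what is shown. stack_at(coords) yields the
    cascading applications at the input location (top to bottom); d is the
    logical distance between adjacent layers. Returns the reachable list,
    the applications to newly display, and those whose display to cancel.
    """
    apps = stack_at(coords)
    n = min(int(pressure // d), len(apps))  # reachable count, capped by stack depth
    reachable = apps[:n]
    to_add = [a for a in reachable if a not in shown]
    to_remove = [a for a in shown if a not in reachable]
    return reachable, to_add, to_remove
```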
  • FIGS. 6A-6C are operational embodiments in which a user releases the first input and adjusts the display state of the reachable application in the first display area.
  • the user interface can include a first display area 10 of the screen 120, and an application stacked in the first display area 10.
  • the applications stacked in the first display area 10 may include Facebook, Google Maps, and photo albums from top to bottom as shown in FIGS. 1A-1C.
  • the first display area 10 can occupy the entire screen 120 and can also occupy part of the screen 120, such as a split screen application.
  • the system detects that the first input is released.
  • within a specified delay time after the first input is released, for example, 2 seconds, the reachable application remains displayed in the first display area 10; that is, the user interface does not change during the specified delay time.
  • after the specified delay time elapses, the display of the reachable application is cancelled.
  • the system detects a sliding operation for the first display area 10, and the action object of the sliding operation is the display area corresponding to the album.
  • the user interface changes as follows: as shown in the right drawing of FIG. 6B, the display area corresponding to the album (the third sub-display area) is deleted from the display area (second sub-display area) corresponding to Google Maps, the upper-layer application of the album. It can be seen that the operation effect corresponding to the sliding operation is to cancel the display of one application among the reachable applications.
  • the system detects a selection operation for the first display area 10, such as a click operation, and the action object of the selection operation is the display area corresponding to the album.
  • the user interface changes as follows: as shown in the right side of FIG. 6C, the album is adjusted to be the top-level application in the first display area 10, and the original top-level application Facebook is adjusted to another layer, for example, the layer where the album was. It can be seen that the operation effect corresponding to the selection operation is to adjust one application among the reachable applications to be displayed on top.
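  • the two operation effects above (a slide cancels an application's display; a selection brings it to the top) can be sketched as list manipulations; the swap rule for the old top-level application follows the Facebook/album example and is otherwise an assumption:

```python
def apply_second_input(reachable, op, target):
    """Adjust the display state of the reachable applications within the
    specified delay time. 'slide' cancels the display of the target
    application; 'tap' adjusts it to the top layer, with the old top-level
    application taking the target's former layer (the swap is an assumed
    rule matching the album/Facebook example).
    """
    apps = list(reachable)
    if op == "slide":
        apps.remove(target)
    elif op == "tap":
        i = apps.index(target)
        apps[0], apps[i] = apps[i], apps[0]
    return apps
```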
  • in the embodiments above, the operation performed within the specified delay time to adjust the display state of the reachable application in the first display area is referred to as a second input.
  • the operation effect corresponding to the second input and the second input is not limited to the foregoing embodiment, and may be set according to specific requirements in practical applications, and is not limited herein.
  • the shape and size of the sub-display areas in the first display area 10 are not limited to those shown in FIGS. 3A-3D and FIGS. 4A-4B.
  • the sub display area may also be circular.
  • the sub display area may also be a fan shape.
  • a sub-display area marked with a number can be used to display the content that a reachable deep application correspondingly outputs in that sub-display area.
  • the circular sub-display area "1" is used to display the content of the reachable second layer application correspondingly outputted in the circular sub-display area "1".
  • the circular sub-display area "2" is used to display the content of the reachable third-layer application correspondingly outputted in the circular sub-display area "2".
  • the examples are merely illustrative of the embodiments of the invention and should not be construed as limiting. In practical applications, the sub-display area may also be other shapes not shown in the drawings, which are not limited herein.
  • if there are many nesting levels, the width allotted to each circular sub-display area may be small, making it difficult for the user to see the content in the circular sub-display areas.
  • a sub-display area can be divided into multiple parts, each part for displaying an application.
  • the outer nested sub-display area can be divided into four sections, each section for displaying an application.
  • in this way, the sub-display areas do not need many nesting levels, and the width of each layer can clearly display the content of the corresponding application.
  • FIG. 7C shows only one implementation manner of the embodiment of the present invention. In an actual application, a partitioning strategy may be formulated according to specific requirements, and is not limited herein.
  • corresponding application icons may be set for each sub-display area.
  • for example, the application icon of Google Maps is set in the upper right corner of the sub-display area (second sub-display area) corresponding to Google Maps, and the application icon of the album is set in the upper right corner of the sub-display area (third sub-display area) corresponding to the album.
  • the size of the second sub-display area may be a fixed value.
  • the width of the sub-display areas nested within the second sub-display area can be adaptively adjusted according to the number of reachable deep-layer applications. Specifically, the larger the number of reachable deep applications, the smaller the width of each sub-display area; the smaller the number, the larger the width.
  • the size of the second sub-display area may also be related to a numerical value of the non-coordinate information corresponding to the first input, such as a touch pressure. Specifically, the larger the touch pressure, the larger the second sub display area.
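  • the two sizing rules above can be sketched numerically; every constant here (base, gain, limit) is an illustrative assumption rather than a value from the embodiment:

```python
def ring_width(outer_radius, inner_radius, n_deep):
    """Split the band between the second sub-display area's edge and the
    innermost area evenly: the more reachable deep applications, the
    narrower each nested ring becomes."""
    return (outer_radius - inner_radius) / max(n_deep, 1)

def second_area_radius(pressure, base=40.0, gain=2.0, limit=160.0):
    """The larger the touch pressure, the larger the second sub-display
    area, capped at the extent of the first display area (illustrative
    constants, not from the embodiment)."""
    return min(base + gain * pressure, limit)
```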
  • nesting the display of the reachable application centering on the coordinates of the first input may include the following situations: if the sub-display area is a rectangle, the center may be the intersection of the rectangle's two diagonals; if the sub-display area is circular or elliptical, the center may be the center of the circle or ellipse; if the sub-display area is a sector, the center may be the vertex of the sector.
  • the embodiment of the present invention does not limit the strategy for determining the "center" of a sub-display area.
  • the first display area 10 can also be used to display only the underlying application that the first input is reachable at the coordinates. The details are as follows:
  • the application that the first input is reachable at its coordinates includes: Facebook and Google Map from top to bottom.
  • Google Maps is the underlying application that is reachable.
  • the Google map is displayed in the first display area 10. It should be understood that only the Google map is visible to the user in the user interface at this time.
  • the application that the first input is reachable at its coordinates includes: Facebook, Google Maps, and photo album from top to bottom.
  • the album is the underlying application that is reachable.
  • the touch pressure corresponding to the first input can be reduced.
  • the terminal device supports multi-threaded operation and can run multiple applications or services at the same time.
  • Applications supported by the terminal device may include: social applications such as Facebook; image management applications such as photo albums; map applications such as Google Maps; browsers such as Safari, Google Chrome, and the like.
  • These applications can have a common input and output device: a touch screen.
  • the touch screen is used to receive the user's touch operation and display the output content of the application.
  • the common input device of the plurality of applications may also be a gesture input device, such as a gesture sensor.
  • FIG. 10 is a structural block diagram of an implementation manner of the terminal device 100.
  • the terminal 100 can include a baseband chip 110, a memory 115, and one or more computer readable storage media, a radio frequency (RF) module 116, and a peripheral system 117. These components can communicate over one or more communication buses 114.
  • the peripheral system 117 is mainly used to implement interaction between the terminal 100 and the user/external environment, and mainly includes the input and output devices of the terminal 100.
  • the peripheral system 117 can include a touch screen controller 118, a camera controller 119, an audio controller 120, and a sensor management module 121.
  • Each controller may be coupled to a respective peripheral device, such as a touch screen 123, a camera 124, an audio circuit 125, and a sensor 126.
  • the gesture sensor in sensor 126 can be used to receive gesture control operations input by the user.
  • the pressure sensor in the sensor 126 can be disposed under the touch screen 123 and can be used to collect the touch pressure applied to the touch screen 123 when the user inputs the touch operation through the touch screen 123. It should be noted that the peripheral system 117 may also include other I/O peripherals.
  • the baseband chip 110 can be integrated to include one or more processors 111, a clock module 112, and a power management module 113.
  • the clock module 112 integrated in the baseband chip 110 is primarily used to generate the clocks required for data transfer and timing control for the processor 111.
  • the power management module 113 integrated in the baseband chip 110 is mainly used to provide a stable, high-precision voltage for the processor 111, the radio frequency module 116, and the peripheral system 117.
  • a radio frequency (RF) module 116 is used to receive and transmit radio frequency signals, primarily integrating the receiver and transmitter of the terminal 100.
  • a radio frequency (RF) module 116 communicates with the communication network and other communication devices via radio frequency signals.
  • the radio frequency (RF) module 116 may include, but is not limited to: an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chip, a SIM card, and Storage media, etc.
  • a radio frequency (RF) module 116 can be implemented on a separate chip.
  • Memory 115 is coupled to processor 111 for storing various software programs and/or sets of instructions.
  • memory 115 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • the memory 115 can store an operating system, for example an embedded operating system such as ANDROID, IOS, WINDOWS, or LINUX.
  • the memory 115 can also store a network communication program that can be used to communicate with one or more additional devices, one or more terminal devices, one or more network devices.
  • the memory 115 can also store a user interface program, which can vividly display the content of an application through a graphical operation interface and receive user control operations on the application through input controls such as menus, dialog boxes, and keys.
  • the memory 115 can also store one or more programs. As shown in FIG. 10, these programs may include: social applications such as Facebook; image management applications such as photo albums; map applications such as Google Maps; browsers such as Safari, Google Chrome, and the like.
  • FIG. 11 illustrates two main user operation processing stages involved in the embodiment of the present invention from the internal processing flow of the terminal 100.
  • the first stage (1-5) mainly explains how the terminal 100 processes the user operation for viewing the cascading application, that is, the first input
  • the second stage (6-10) mainly explains how the terminal 100 processes the user operation for adjusting the display state, that is, the second input.
  • the touch screen 123 detects the touch operation and notifies the processor 111.
  • the processor 111 determines that this touch operation is a user operation for viewing the cascading application, that is, the first input.
  • the touch screen 123 sends the touch point information of the touch operation in the above step 1 to the processor 111.
  • the touch point information includes location information and non-location information, where the location information is a coordinate and the non-location information is non-coordinate information.
  • the non-coordinate information may be the touch pressure collected by the pressure sensor under the touch screen 123, the touch area collected by the touch screen 123, or the touch duration collected by the timer.
  • the processor 111 can obtain corresponding non-position information through a pressure sensor, a touch screen, a timer, and the like, respectively.
  • the processor 111 combines the application currently displayed in the touch screen 123 and the touch point information obtained in the above step 2, including coordinate and non-coordinate information, to analyze which applications in the cascading application need to be displayed.
  • the applications that need to be displayed are the applications that the first input in step 1 is reachable at the touch point, that is, at the coordinates.
  • the processor 111 may instruct the touch screen 123 to nest the display of the reachable application centering on the coordinates.
  • how to nest the display of the reachable application please refer to the content in the foregoing embodiment, and details are not described herein again.
  • the processor 111 sends a nested-display instruction to the output device, such as the touch screen 123, and triggers the touch screen 123 to display the content that each reachable application outputs in its corresponding display area.
  • because the touch point information of the first input, such as its coordinate and non-coordinate information, can change in real time, the processor 111 may repeatedly perform the foregoing steps 2-4 to obtain the real-time touch point information of the first input.
  • the touch screen 123 detects that the contact of the touch operation in step 1 leaves the screen and notifies the processor 111. According to the foregoing embodiment, if the touch screen 123 can still sample the contact information at sampling time T but cannot sample it at sampling time T+1, it is determined that the contact leaves the touch screen 123 at sampling time T+1.
  • the contact leaving the screen means that the first input is released.
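  • the sampling-based release test described above can be sketched as follows; representing the samples as per-tick contact records with None marking an absent contact is an assumed representation for illustration:

```python
def released_at(samples):
    """Detect the release of the first input from periodic samples: if the
    contact can still be sampled at time T but not at time T+1, the release
    is taken to happen at T+1. Returns that index, or None if no release."""
    for t in range(1, len(samples)):
        if samples[t - 1] is not None and samples[t] is None:
            return t
    return None
```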
  • the processor 111 sends an instruction to the touch screen 123 to maintain the display state of the cascading application for a specified delay time, instructing the touch screen 123 to keep the display state of the cascading application within the specified delay time after the contact leaves the screen.
  • if the contact slowly leaves the screen, the magnitude of the non-coordinate information of the first input, such as the touch pressure, has already been reduced to zero at sampling time T. That is to say, at sampling time T, the first input has no reachable application at the coordinates, and the deep applications originally displayed on the touch screen 123 have already disappeared.
  • the touch screen 123 is restored to the display state before receiving the first input.
  • if the contact quickly leaves the screen, the value of the non-coordinate information of the first input, such as the touch pressure, has not been reduced to 0 at sampling time T.
  • the touch screen 123 maintains, during the specified delay time after the contact leaves the screen, the display state at the moment the contact left the screen.
  • within the specified delay time after the contact leaves the screen, the touch screen 123 detects a new touch operation, such as a click or a slide, and notifies the processor 111.
  • the processor 111 determines that this new touch operation is the second input.
  • the processor 111 determines which application the target of the new touch operation is, and the touch effect corresponding to the new touch operation.
  • the processor 111 sends an instruction to the touch screen 123 to execute the touch effect, instructing the touch screen 123 to perform the touch effect.
  • the processor 111 sends an instruction to cancel the nested display cascading application to the touch screen 123.
  • that is, the processor 111 notifies the touch screen 123 to cancel the nested display of the content that each reachable application outputs in its corresponding display area.
  • the input device in FIG. 11 may also be a gesture sensor; in that case, the detected user operation corresponds to a gesture operation, and the contact corresponds to the action point of the gesture operation.
  • the input device in FIG. 11 may also be other types of input devices, which are not limited herein.
  • it should be understood that the terminal 100 is only an example provided by an embodiment of the present invention; the terminal 100 may have more or fewer components than illustrated, may combine two or more components, or may implement the components in different configurations.
  • FIG. 12 is a schematic flowchart of a method for viewing an application according to an embodiment of the present invention.
  • the first input is a touch operation applied to a touch screen of the terminal, and the non-coordinate information of the first input is a touch pressure.
  • the method includes:
  • the touch screen receives the first input. Specifically, a plurality of running applications are displayed in a cascaded manner in the first display area of the touch screen.
  • the first display area may occupy the entire touch screen or may occupy part of the touch screen, such as in a split screen application.
  • the first input is for viewing an application displayed in a layered manner in the first display area.
  • the processor acquires an input position corresponding to the first input through the touch screen, and acquires a touch pressure of the first input at the input position by a pressure sensor disposed under the touch screen.
  • the processor may analyze the application that the first input is reachable at the input location according to the input position of the first input on the touch screen and the touch pressure acquired by the pressure sensor.
  • the processor instructs the touch screen to display the application that the first input is reachable at the coordinates.
  • an application that is reachable by the first input at the coordinates may be displayed in the first display area of the touch screen.
  • the processor may first determine, according to the input location, which applications are cascading at the input location, and then, from the cascading applications at the input location and according to the non-location information, such as the touch pressure Q detected by the pressure sensor, determine the applications that the first input is reachable at the input location.
  • in the first stacked form, the cascading applications at any coordinates are the same, that is, they include all the applications stacked in the first display area 10.
  • an application layered in the first display area 10 may occupy only part of the first display area 10, so the cascading applications at different positions in the user interface may be different.
  • the cascading applications at coordinates (X2, Y2) are: Google Maps and Albums, and do not include the top-level application Facebook.
  • the processor may determine, according to the non-position information, for example, the touch pressure Q detected by the pressure sensor, the applications that the first input is reachable at the input location as follows:
  • Step 1: calculate the number N of reachable applications according to the touch pressure Q acquired by the pressure sensor and the logical distance D between adjacent layers of applications.
  • Step 2: determine the layer-1 to layer-N applications among the applications cascading at the input location as the applications that the first input is reachable at the input location.
  • the numerical value of the non-position information that can be recognized by the pressure sensor under the touch screen may have an upper limit, denoted Q Limit. Correspondingly, N, the number of applications that the first input is reachable at the coordinates described in the foregoing embodiment, has an upper limit of Q Limit /D.
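  • steps 1 and 2, together with the Q Limit cap, can be sketched as follows; this is an illustrative sketch, and the parameter names are assumptions, not terms from the embodiment:

```python
def reachable_apps(stack_at_location, pressure_q, d, q_limit):
    """Step 1: N = Q // D, where Q is the touch pressure (bounded by the
    sensor ceiling Q_Limit, so N is bounded by Q_Limit / D) and D is the
    logical distance between adjacent layers. Step 2: the layer-1 to
    layer-N applications cascading at the input location are the
    applications the first input is reachable at."""
    q = min(pressure_q, q_limit)
    n = int(q // d)
    return stack_at_location[:n]
```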
  • the processor may display the reachable application by using the following implementation manner.
  • the processor may instruct the touch screen to nest the display of the reachable application centered on the input location in the first display area of the touch screen.
  • the nested display method can be summarized as follows: the reachable applications are respectively displayed in the N sub-display areas included in the first display area, where one sub-display area displays the content that one reachable application correspondingly outputs in that sub-display area, and the sub-display area corresponding to the reachable i-th layer application is nested on the periphery of the sub-display area corresponding to the (i+1)-th layer application.
  • where i < N, i is a positive integer, and N is the number of applications that the first input is reachable at the input location.
  • creating the display area corresponding to each reachable application centering on the input location may include the following situations: if the sub-display area is a rectangle, the center may be the intersection of the rectangle's two diagonals; if the sub-display area is circular or elliptical, the center may be the center of the circle or ellipse; if the sub-display area is a sector, the center may be the vertex of the sector.
  • the embodiment of the present invention does not limit the strategy for determining the "center".
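  • the "center" cases above can be sketched directly; the parameter shapes used here are assumed representations for illustration:

```python
def area_center(shape, params):
    """Determine the "center" used to position a sub-display area: for a
    rectangle, the intersection of its two diagonals; for a circle or
    ellipse, its geometric center; for a sector (fan), its vertex."""
    if shape == "rect":
        l, t, r, b = params  # (left, top, right, bottom)
        return ((l + r) / 2, (t + b) / 2)  # where the diagonals cross
    if shape in ("circle", "ellipse"):
        return params["center"]
    if shape == "sector":
        return params["vertex"]  # the fan is anchored at its apex
    raise ValueError(shape)
```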
  • the processor triggers the touch screen to refresh the content of each reachable application correspondingly in the corresponding display area.
  • the processor may also instruct the touch screen to display only the underlying application that the first input is reachable at the coordinates in the first display area.
  • the processor may further monitor a real-time change of the first input through the touch screen and the pressure sensor.
  • the method for viewing an application provided by the embodiment of the present invention further includes: S109, the processor monitors, through the touch screen, whether the first input changes; if a change occurs, the processor may repeatedly execute S103-S107 to adjust the display of the cascading application in time according to the real-time state of the first input.
  • the processor may determine an application that the first input is reachable at a new input location, and then the processor may indicate The touch screen, in the first display area, nests and displays an application that is accessible by the first input at the new input location, centering on a new input location.
  • the processor may determine the newly reachable application of the first input at the input location, and then instruct the touch screen to create a new sub-display area, within the display area of the currently reachable bottom-layer application, for displaying the newly reachable application.
  • the processor may determine the no-longer-reachable application of the first input at the input location, and then instruct the touch screen to cancel, within the display area of that application's upper-layer application, the display area corresponding to the no-longer-reachable deeper application.
  • the real-time change of the first input may include a change of the input location, or a change of the non-location information, or both.
  • the first input may be released.
  • according to the operation type of the second input, the processor may instruct the touch screen to adjust the display state, in the first display area of the touch screen, of the applications reachable by the first input at the input location.
  • the non-location information corresponding to the first input may also be the touch area collected by the processor through the touch screen, or the touch time recorded by the processor through a timer, etc., which is not limited here.
  • the first input may also be a gesture operation detected by the gesture sensor.
  • the non-position information corresponding to the first input may be a gesture depth acquired by the processor by the gesture sensor.
  • the release of the first input may refer to the user's hand, such as a finger or an arm, leaving the sensing range of the terminal's gesture sensor.
  • the first input may also be other types of input operations. There are no restrictions here.
  • FIG. 13 is a functional block diagram of a terminal according to an embodiment of the present invention.
  • the functional blocks of the terminal may implement the inventive arrangements by hardware, software or a combination of hardware and software.
  • the functional blocks depicted in Figure 13 can be combined or separated into several sub-blocks to implement the inventive arrangements. Accordingly, the above description of the invention may support any possible combination or separation or further definition of the functional modules described below.
  • the terminal 200 may include an input unit 201, a processing unit 203, and an output unit 205, where:
  • the input unit 201 is configured to receive the first input.
  • the processing unit 203 is configured to acquire an input location corresponding to the first input, and non-location information corresponding to the first input at the input location;
  • the processing unit 203 is further configured to analyze, according to the input location and the non-location information, an application that is reachable by the first input at the input location;
  • the output unit 205 is configured to display an application that is accessible by the first input at the input location.
  • the output unit 205 can be a touch display, such as the touch screen 123 in FIG. 10.
  • the first input is for viewing an application stacked in a first display area of the output unit 205.
  • the input unit 201 may be the touch screen 123 in FIG. 10, or may be the gesture sensor in FIG. 10, and may be other input devices.
  • the first input may be a touch operation detected by the touch screen 123, a gesture input detected by the gesture sensor, or other types of user operations.
  • for how the processing unit 203 determines the applications reachable by the first input at the input location, refer to the foregoing embodiments.
  • the output unit 205 can nest and display the reachable applications. Specifically, the output unit 205 is configured to display the reachable applications respectively in the N sub-display areas included in the first display area, where one sub-display area displays the content that one reachable application correspondingly outputs in that sub-display area, and the sub-display area corresponding to the reachable i-th layer application is nested around the periphery of the sub-display area corresponding to the reachable (i+1)-th layer application.
  • i is a positive integer, i < N.
  • N is the number of applications reachable by the first input at the input location.
  • the output unit 205 can be configured to display, in the first display area, only the bottom-layer application reachable by the first input at the coordinates.
  • the processing unit 203 may determine the applications reachable by the first input at the new input location, and the output unit 205 may then nest and display, in the first display area, the applications reachable by the first input at the new input location, centered on the new input location.
  • the processing unit 203 may determine the newly reachable application of the first input at the input location, and the output unit 205 can create a new sub-display area in the display area of the currently reachable bottom-layer application for displaying the newly reachable application.
  • the processing unit 203 may determine the no-longer-reachable application of the first input at the input location, and the output unit 205 may cancel, in the display area of that application's upper-layer application, the display area corresponding to the no-longer-reachable deeper application.
  • the input unit 201 can also be used to monitor whether the first input is released. If released, the output unit 205 keeps the reachable application still displayed in the first display area for a specified delay time, for example 2 seconds.
  • the input unit 201 may also detect the second input during the specified delay time.
  • the output unit 205 may adjust a display state of an application that is reachable by the first input at the input location in the first display area according to an operation type of the second input.
  • embodiments of the present invention can be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or a combination of software and hardware. Moreover, the present invention can take the form of a computer program product embodied on one or more computer usable storage media including computer usable program code, including but not limited to disk storage and optical storage.
  • the computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • these computer program instructions can also be loaded onto a computer or other programmable data processing device, such that a series of operational steps is performed on the computer or other programmable device to produce computer-implemented processing, so that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A graphical user interface, method, and terminal for viewing applications. The method includes: receiving a first input, the first input being used to view multiple running applications displayed in a stack; acquiring an input location corresponding to the first input, and non-location information corresponding to the first input at the input location; determining, according to the input location and the non-location information, the applications reachable by the first input at the input location; and displaying the applications reachable by the first input at the input location. The above solution simplifies the operations a user needs to view an occluded application among stacked applications.

Description

A graphical user interface, method, and terminal for viewing applications. Technical Field
The present invention relates to the technical field of human-computer interaction, and in particular to a graphical user interface, method, and terminal for viewing applications.
Background
A user usually opens multiple applications on a terminal device such as a mobile phone. Because the screen of the terminal device is limited, these opened applications are usually displayed in a stack. Most commonly, the screen can display only the topmost application, while the other running applications are occluded by the topmost application and are not visible to the user.
In the prior art, to view other applications running in the system, the user needs to trigger, in a specific way such as touching the Home button, the display of a list of running applications, in which thumbnails of the running applications are arranged in order. The user then needs to find and select the desired application in this list so that it is brought to the top layer and displayed on the screen.
However, this prior-art method of viewing applications requires cumbersome user operations and is inconvenient for viewing an occluded application among stacked applications.
Summary of the Invention
Embodiments of the present invention provide a graphical user interface, method, and terminal for viewing applications, which can simplify the operations a user needs to view an occluded application among stacked applications.
According to a first aspect, a method for viewing applications is provided, including: receiving a first input; acquiring an input location corresponding to the first input and non-location information corresponding to the first input at the input location; then determining, according to the input location and the non-location information, the applications reachable by the first input at the input location; and finally displaying the applications reachable by the first input at the input location.
Specifically, the first input is used to view multiple running applications displayed in a stack. The multiple running applications are stacked and displayed in a first display area.
In a specific implementation, the first input may be a touch operation detected by a touch screen. Correspondingly, the non-location information corresponding to the first input may be the touch pressure detected by a pressure sensor disposed under the touch screen, the touch area collected by the touch screen, or the touch duration recorded by a timer, which is not limited here. In some embodiments, the first input may also be a gesture operation detected by a gesture sensor; correspondingly, the non-location information corresponding to the first input may be a gesture depth. In practice, the first input may also be another type of input operation, which is not limited here.
Implementing the method described in the first aspect simplifies the operations a user needs to view an occluded application among stacked applications, and allows the output content of multiple stacked applications to be displayed at the same time, making it convenient for the user to view the stacked applications.
With reference to the first aspect, in some embodiments, the applications reachable by the first input at the input location may be determined as follows: first, determine, according to the input location, which applications are stacked at the input location; then, from the applications stacked at the input location, determine, according to the non-location information, the applications reachable by the first input at the input location.
Specifically, the applications reachable by the first input at the input location may be determined from the non-location information as follows: calculate the number N of reachable applications according to the value Q of the non-location information and the logical distance D between two adjacent layers of applications; then determine the 1st to Nth layers of the applications stacked at the input location as the applications reachable by the first input at the input location.
In some embodiments, the number N = f(Q/D) may be calculated from the value Q and the logical distance D as follows: the function f may be linear, for example N = f(Q/D) = θ*(Q/D) (θ > 0). In the simple case θ = 1, each time the touch pressure Q increases by D, the applications reachable by the first input extend one layer further down. In some possible embodiments, f may also be nonlinear; that is, the number of applications reachable by the first input need not vary linearly with the touch pressure.
In some embodiments, to ensure that every stacked application has a chance to be reached by the first input, the logical distance D between two adjacent layers of applications may be dynamic: D = QLimit/M. That is, the logical distance D may be determined from the current number M of stacked applications, where QLimit denotes the upper limit of the value of the non-location information, such as touch pressure, that the system can recognize.
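The mapping above (N = f(Q/D) with a dynamic logical distance D = QLimit/M) can be sketched as follows. This is a minimal illustration assuming the linear form with θ = 1, a top layer that is always reachable at zero pressure, and clamping of N to the number of stacked applications; these choices are assumptions, not prescribed by the embodiment.

```python
def reachable_count(q, q_limit, m, theta=1.0):
    """Number of stacked applications reachable by an input of value q.

    q       -- value of the non-location information (e.g. touch pressure)
    q_limit -- upper bound of q that the system can recognize (QLimit)
    m       -- current number of stacked applications (M)
    theta   -- slope of the assumed linear mapping f
    """
    if m <= 0:
        return 0
    d = q_limit / m              # dynamic logical distance between layers
    n = int(theta * q / d) + 1   # top layer is reachable even at q == 0
    return min(n, m)             # cannot reach more layers than exist
```

With QLimit = 100 and M = 3 stacked applications, D ≈ 33.3, so zero pressure reaches only the top layer, a pressure just above one D reaches two layers, and the full pressure range reaches all three.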
The following describes how to display, in the first display area, the applications reachable by the first input at the input location.
With reference to the first aspect, in some embodiments, the reachable applications may be nested and displayed centered on the input location. Specifically: the reachable applications are respectively displayed in N sub-display areas included in the first display area, where one sub-display area displays the content that one reachable application correspondingly outputs in that sub-display area, and the sub-display area corresponding to the reachable i-th layer application is nested around the periphery of the sub-display area corresponding to the reachable (i+1)-th layer application.
Here, i < N, i is a positive integer, and N is the number of applications reachable by the first input at the input location.
It should be noted that creating a display area for each reachable application centered on the input location may include the following situations: if the sub-display area is a rectangle, the center may be the intersection of the rectangle's two diagonals; if the sub-display area is a circle or an ellipse, the center may be the center of the circle or ellipse; if the sub-display area is a sector, the center may be the vertex of the sector. The strategy for determining the "center" is not limited in the embodiments of the present invention.
In the embodiments of the present invention, the system refreshes in real time the content that each reachable application correspondingly outputs in its own display area.
With reference to the first aspect, in some embodiments, only the bottom-layer application reachable by the first input at the coordinates may be displayed in the first display area. In this way, the user can view at once the complete content that the reachable bottom-layer application outputs in the first display area.
For example, when the touch pressure corresponding to the first input is 0, the only application reachable by the first input at its coordinates is the application currently displayed on the top layer of the first display area, for example Facebook. In this case Facebook is both the reachable top-layer application and the reachable bottom-layer application. In response to the first input with touch pressure 0, only Facebook may be displayed in the first display area.
For example, when the touch pressure corresponding to the first input increases to Q1, suppose the applications reachable by the first input at its coordinates include, from top to bottom, Facebook and Google Maps. In this case Google Maps is the reachable bottom-layer application. In response to the first input with touch pressure Q1, only Google Maps is displayed in the first display area.
In the embodiments of the present invention, the first input may change in real time, including a change of the input location and a change of the non-location information, specifically as follows:
With reference to the first aspect, in some embodiments, if a change of the input location of the first input is detected, the applications reachable by the first input at the new input location may be determined, and then, in the first display area, the applications reachable by the first input at the new input location are nested and displayed centered on the new input location.
With reference to the first aspect, in some embodiments, if it is detected that the value of the non-location information of the first input, such as the touch pressure, increases, the newly reachable application of the first input at the input location may be determined, and a new sub-display area is then created in the display area of the currently reachable bottom-layer application to display the newly reachable application.
With reference to the first aspect, in some embodiments, if it is detected that the value of the non-location information of the first input, such as the touch pressure, decreases, the no-longer-reachable application of the first input at the input location may be determined, and the display area corresponding to that no-longer-reachable deeper application is then canceled in the display area of its upper-layer application.
In some possible implementations, the input location and the non-location information corresponding to the first input may also change at the same time.
In the embodiments of the present invention, the first input may be released.
With reference to the first aspect, in some embodiments, if it is detected that the first input is released, the reachable applications remain displayed in the first display area for a specified delay time, for example 2 seconds. This allows the user, within the specified delay time, to perform an operation that adjusts the display state of the reachable applications in the first display area, for example canceling the display of a particular application. The embodiments of the present invention call such an operation a second input.
Specifically, if the second input is detected within the specified delay time, the display state, in the first display area, of the applications reachable by the first input at the input location may be adjusted according to the operation type of the second input, for example canceling the display of one reachable application.
Specifically, when the specified delay time expires, the display of the reachable applications in the first display area is canceled.
For the nested display of the reachable applications, in some possible implementations, to clearly distinguish the reachable applications, a corresponding application icon may be set for the display area of each reachable application. For example, the Google Maps application icon may be set at the upper-right corner of the display area corresponding to Google Maps, and the album's application icon at the upper-right corner of the album's sub-display area.
For the nested display of the reachable applications, in some possible implementations, the size of the display area of each reachable application may be a fixed value. In some possible implementations, the size may also be related to the value of the non-coordinate information corresponding to the first input, such as the touch pressure; for example, the greater the touch pressure, the larger the display area.
According to a second aspect, a graphical user interface on a terminal device is provided, the terminal device having a touch screen, a memory, and one or more processors configured to execute one or more programs stored in the memory, the graphical user interface including: a first display area of the touch screen, and multiple running applications stacked and displayed in the first display area; where:
in response to the touch screen detecting a first input directed at the first display area, the applications reachable by the first input at the input location corresponding to the first input are displayed; where the applications reachable by the first input at that input location are determined by the input location and the non-location information corresponding to the first input at the input location.
The manner of displaying the reachable applications is described below:
With reference to the second aspect, in some embodiments, the reachable applications are respectively displayed in N sub-display areas included in the first display area, where one sub-display area displays the content that one reachable application correspondingly outputs in that sub-display area, and the sub-display area corresponding to the reachable i-th layer application is nested around the periphery of the sub-display area corresponding to the reachable (i+1)-th layer application.
Here, i < N, i is a positive integer, and N is the number of applications reachable by the first input at the input location.
With reference to the second aspect, in some embodiments, only the bottom-layer application reachable by the first input at the input location is displayed in the first display area.
In the embodiments of the present invention, the first input may change in real time and may also be released.
With reference to the second aspect, in some embodiments, the graphical user interface further includes: in response to a detected change of the input location corresponding to the first input, displaying, in the first display area, the applications reachable by the first input at the new input location.
With reference to the second aspect, in some embodiments, the graphical user interface further includes: in response to a detected increase of the value of the non-location information corresponding to the first input at the input location, nesting and displaying the newly reachable application in the display area corresponding to the reachable bottom-layer application.
With reference to the second aspect, in some embodiments, the graphical user interface further includes: in response to a detected decrease of the value of the non-location information corresponding to the first input at the input location, canceling the display of the no-longer-reachable application in the display area corresponding to its upper-layer application.
With reference to the second aspect, in some embodiments, the graphical user interface further includes: in response to detecting that the first input is released, keeping the reachable applications displayed in the first display area within a specified delay time.
Specifically, the graphical user interface further includes: within the specified delay time, in response to a detected second input directed at the first display area, adjusting the display state of the reachable applications in the first display area.
According to a third aspect, a terminal is provided, including a touch screen and a processor, where:
the touch screen is configured to display multiple running applications in a stack and to detect a first input directed at the applications;
the processor is configured to acquire an input location corresponding to the first input and non-location information corresponding to the first input at the input location;
the processor is configured to determine, according to the input location and the non-location information, the applications reachable by the first input at the input location, and to instruct the touch screen to display the reachable applications;
the touch screen is configured to display the applications reachable by the first input at the input location.
With reference to the third aspect, in some embodiments, the non-location information is touch pressure. The terminal further includes a pressure sensor disposed under the touch screen. The pressure sensor is configured to detect the touch pressure of the first input at the input location, and the processor acquires that touch pressure through the pressure sensor.
With reference to the third aspect, in some embodiments, the non-location information is touch duration. The terminal further includes a timer. The timer is configured to detect the touch duration of the first input at the input location, and the processor acquires that touch duration through the timer.
With reference to the third aspect, in some embodiments, the non-location information is touch area. The touch screen is configured to detect the touch area of the first input at the input location, and the processor acquires the touch area through the touch screen.
It should be noted that, for the functional implementation of the processor and the touch screen, reference may also be made to the method described in the first aspect.
According to a fourth aspect, a terminal is provided, including functional units for performing the method of the first aspect.
According to a fifth aspect, a readable non-volatile storage medium storing computer instructions is provided, the computer instructions being executed by a terminal device having a touch screen to implement the method described in the first aspect.
By implementing the method embodiments of the present invention, the input location and non-location information corresponding to the user operation for viewing stacked applications (that is, the first input) are acquired, the applications reachable by the first input at the input location are determined according to the input location and the non-location information, and finally the reachable applications are displayed in the user interface. The above solution simplifies the operations a user needs to view an occluded application among stacked applications.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below.
FIG. 1A-1C are schematic diagrams of stacked applications according to embodiments of the present invention;
FIG. 2-9 are schematic diagrams of user interfaces provided by embodiments of the present invention and of some embodiments of operations, performed in those user interfaces, for viewing stacked applications;
FIG. 10 is a schematic structural diagram of a terminal according to an embodiment of the present invention;
FIG. 11 is a schematic flowchart of how the terminal of FIG. 10 processes user operations according to an embodiment of the present invention;
FIG. 12 is a schematic flowchart of a method for viewing applications according to an embodiment of the present invention;
FIG. 13 is a functional block diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The terms used in the embodiments of the present invention are only for explaining specific embodiments and are not intended to limit the present invention.
To facilitate understanding of the embodiments of the present invention, the application scenario involved is first described: stacked applications (stacking applications).
FIG. 1A-1C are schematic diagrams of stacked applications according to embodiments of the present invention. As shown in FIG. 1A-1C, multiple applications may run on the terminal 100 at the same time, for example: a social application such as Facebook; an image management application such as an album; a map application such as Google Maps; and browsers such as Safari and Google Chrome. However, at any one moment, the screen 120 of the terminal 100 can fully display the content of only one application, for example Facebook; the embodiments of the present invention call this application the top-layer application, that is, the first-layer application, or first-layer App for short. The other running applications, such as Google Maps and the album, are occluded by the top-layer application and stacked beneath it, which is inconvenient for the user to view. Usually, the terminal 100 maintains a data structure, such as a stack, to store the stacking structure of the running applications with respect to one another. The embodiments of the present invention call applications presenting such a mutually stacked structure stacked applications. Among stacked applications, each application has a direct upper-layer application (except the top-layer application) and a direct lower-layer application (except the bottom-layer application), and each application is occluded by its direct upper-layer application.
The stacking forms presented by stacked applications may include two kinds: first, the upper-layer application completely occludes the lower-layer application, as shown in FIG. 1A; second, the upper-layer application partially occludes the lower-layer application, as shown in FIG. 1B. The situation of FIG. 1B is more common in embodiments where the terminal 100 has a larger screen 120, for example when the terminal 100 is a tablet.
As shown in FIG. 1C, the Z axis is the coordinate axis perpendicular to the screen, and the applications running simultaneously on the terminal 100 are stacked along the Z axis in their stacking order, with Google Maps and the album on the second and third layers respectively, occluded by the top-layer application Facebook and inconvenient for the user to view. Usually, the stacking order of stacked applications may be related to the timing at which the applications were activated; for example, the most recently activated application is usually on the top layer. Here, the activation of an application may be triggered by actions such as opening the application or actively operating it, or by an internal event of the application, such as a download-complete notification event. It should be noted that other strategies may also be used to decide the stacking order of stacked applications, which is not limited in the embodiments of the present invention.
It should be noted that FIG. 1A-1C are merely one example of the stacked applications involved in the embodiments of the present invention and should not be construed as limiting. In practice, the number of stacked applications may be more or fewer than the three shown in FIG. 1A-1C, and neither the stacked applications nor the stacking order is limited by FIG. 1A-1C.
The following introduces the user interface (UI: User Interface) and the user operation embodiments for viewing stacked applications involved in the embodiments of the present invention.
FIG. 2 is a schematic diagram of a user operation for viewing stacked applications according to an embodiment of the present invention. In the embodiments of the present invention, the user operation for viewing stacked applications is called the first input.
As shown in FIG. 2, the first input may be a touch operation. To avoid accidental touches, the first input usually acts on a position in the user interface where no response action is defined. Here, a position with a defined response action is a touch position that can trigger a predefined function when touched, such as the touch position of an application shortcut, or of a control such as a button.
In the user interface, when the system detects the first input, the system can acquire two kinds of information: coordinates and non-coordinate information. The coordinates are the input location corresponding to the first input and specifically indicate roughly at which position the user wants to view the output content of the stacked applications; the non-coordinate information is the non-location information corresponding to the first input at the input location, and may specifically be information such as touch pressure, touch duration, or touch area. In response to the first input, the system may display, in the user interface, the content that one or more of the stacked applications correspondingly output near the coordinates. The coordinates and the non-coordinate information determine which of the stacked applications can be displayed in the user interface.
To simplify the description, the embodiments of the present invention introduce a concept: the applications attainable (reachable) by the first input at the coordinates. Here, the first input can be likened to a "probe", with the touch pressure representing how deep the probe reaches vertically downward. Figuratively, the applications reachable by the first input at the coordinates can be viewed as the applications the "probe" can touch going downward at the coordinates. It should be understood that the greater the touch pressure, the deeper the "probe" reaches, and the more applications are reachable by the first input at the coordinates.
As shown in FIG. 2, the first-layer application reachable by the first input at the coordinates is Facebook; if the touch pressure corresponding to the first input is large enough, the applications reachable by the first input at the coordinates may further include Google Maps, or, further still, Google Maps and the album.
In the embodiments of the present invention, the applications reachable by the first input at the coordinates are the applications the system decides to display in the user interface. For how the reachable applications are displayed in the user interface, see the subsequent content. How to determine the number of applications reachable by the first input at the coordinates is described first.
As shown in FIG. 2, suppose the logical distance between two adjacent layers of applications is D. Here, the logical distance is used to measure the number of applications reachable by the first input at the coordinates. In the embodiments of the present invention, the function f(Q/D) may be used to represent the number N of applications reachable by the first input at the coordinates, where Q is the value of the non-coordinate information corresponding to the first input, such as the touch pressure.
In a specific implementation, the function f may be linear, for example N = f(Q/D) = θ*(Q/D) (θ > 0). In the simple case θ = 1, each time the touch pressure Q increases by D, the applications reachable by the first input extend one layer further down. This example is merely one implementation of the embodiments of the present invention; in practice, f may also be another form of linear function, which is not limited here. In some possible embodiments, f may also be nonlinear; that is, the number of applications reachable by the first input need not vary linearly with the touch pressure.
In the embodiments of the present invention, the first input may change in real time. In a specific implementation, the system may sample the coordinates and non-coordinate information corresponding to the first input at intervals, for example every 10 microseconds. If the sample at sampling time T and the sample at sampling time T+1 differ, and the difference exceeds a sufficiently significant threshold, the first input is considered to have changed; otherwise, it is considered unchanged. In practice, the significant threshold (for coordinates or non-coordinate information) may be determined according to the system configuration of the terminal device, which is not limited in the embodiments of the present invention.
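The sampling comparison described above can be sketched as follows; the concrete thresholds, the sample tuple layout `(x, y, q)`, and the returned labels are assumptions for illustration only.

```python
def classify_change(prev, curr, pos_eps=5.0, q_eps=2.0):
    """Compare two consecutive samples of the first input.

    Each sample is (x, y, q) with q the non-coordinate value, or None
    when nothing was sensed at that sampling time.  Returns one of:
    'released', 'moved', 'pressure', 'moved+pressure', 'unchanged'.
    pos_eps and q_eps are the assumed significance thresholds.
    """
    if curr is None:
        return 'released'        # no sample at time T+1: input released
    if prev is None:
        return 'unchanged'       # first sample, nothing to compare against
    (px, py, pq), (cx, cy, cq) = prev, curr
    moved = abs(cx - px) > pos_eps or abs(cy - py) > pos_eps
    pressed = abs(cq - pq) > q_eps
    if moved and pressed:
        return 'moved+pressure'
    if moved:
        return 'moved'
    if pressed:
        return 'pressure'
    return 'unchanged'
```

A sub-threshold jitter such as one pixel of movement is classified as 'unchanged', matching the idea that only sufficiently significant differences count as a change.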
Specifically, the real-time change of the first input can be divided into the following two cases or a combination of the two:
First, the coordinates of the first input change; this change can be used to change the viewing position. For example, the coordinates of the first input at sampling time T are (20, 20) and at sampling time T+1 are (30, 30); the user changes from viewing the output content of the applications at coordinates (20, 20) to viewing it at coordinates (30, 30).
Second, the non-coordinate information of the first input changes; this change can be used to change the number of applications the user can view. For example, as shown in FIG. 2, if the touch pressure of the first input increases, the number of applications the user can view at coordinates (X, Y) increases.
In particular, if the system can obtain sample data at sampling time T but cannot obtain sample data at sampling time T+1, the first input is considered to have been released at sampling time T+1. Here, if the first input is a touch operation, the release of the first input may mean that the touch point, such as a finger, leaves the terminal's touch screen.
It should be noted that, not limited to the touch operation shown in FIG. 2, the first input may also be a gesture operation. Correspondingly, the non-coordinate information corresponding to the first input may be a gesture depth, and the release of the first input may mean that the user's hand, such as a finger or an arm, leaves the sensing range of the terminal's gesture sensor. In practice, the first input may also be another type of input operation, which is not limited here.
FIG. 3A-3D and FIG. 4A-4B show some embodiments of the user performing the first input in the user interface. For ease of description, in the following, the screen 120 is a touch screen, the first input is exemplified by a touch operation, and the non-coordinate information of the first input is exemplified by touch pressure.
FIG. 3A-3D are user-interface embodiments of performing the first input on stacked applications in the first stacking form, that is, the upper-layer application completely occludes the lower-layer application.
As shown in FIG. 3A-3D, the user interface may include a first display area 10 of the screen 120 and the applications stacked in the first display area 10. As shown in FIG. 1A-1C, the applications stacked in the first display area 10 may include, from top to bottom: Facebook, Google Maps, and the album. The first display area 10 may occupy the whole screen 120 or part of it, for example in a split-screen application.
As shown in FIG. 3A, the system detects the first input on the screen 120. The coordinates of the first input are (X1, Y1) and the corresponding touch pressure is Q1. From the coordinates (X1, Y1) and the touch pressure Q1, the system can determine that the applications reachable by the first input at (X1, Y1) are Facebook and Google Maps, where Facebook is the first-layer application reachable by the first input at the coordinates.
The following specifically describes how, in response to the first input, the reachable deeper applications are displayed in the user interface. Here, a reachable deeper application is defined relative to the reachable first-layer application and refers to a lower-layer application occluded by the reachable first-layer application.
As shown in FIG. 3A, in response to the first input, a new display area, display area 20, is created in the first display area 10 centered on the coordinates (X1, Y1) of the first input, for displaying the second-layer application reachable by the first input at (X1, Y1), namely Google Maps.
It should be understood that the user interface then changes as follows: as shown in FIG. 3B, the first display area 10 is divided into 2 sub-display areas, a 1st sub-display area and a 2nd sub-display area, the 1st nested around the periphery of the 2nd, where the 1st sub-display area displays the content that the reachable first-layer application, Facebook, correspondingly outputs in the 1st sub-display area, and the 2nd sub-display area displays the content that the reachable second-layer application, Google Maps, correspondingly outputs in the 2nd sub-display area. Here, the 2nd sub-display area is the newly created display area 20.
As shown in FIG. 3C, the coordinates of the first input have not changed and are still (X1, Y1), but the corresponding touch pressure has increased to Q2, and the applications reachable by the first input at (X1, Y1) increase to: Facebook, Google Maps, and the album. In response to the first input with increased touch pressure, a further new display area, display area 30, is created within display area 20 centered on (X1, Y1), for displaying the third-layer application reachable by the first input at (X1, Y1), namely the album.
It should be understood that the user interface then changes as follows: as shown in FIG. 3D, the first display area 10 is further divided into 3 sub-display areas: a 1st, a 2nd, and a 3rd sub-display area, the 2nd nested around the periphery of the 3rd and the 1st nested around the periphery of the 2nd; the 1st sub-display area displays the content correspondingly output by the reachable first-layer application, Facebook; the 2nd displays that of the reachable second-layer application, Google Maps; and the 3rd displays that of the reachable third-layer application, the album. Here, the 3rd sub-display area is the newly created display area 30.
By analogy, if the first input further reaches lower-layer applications at (X1, Y1), a new display area may be created in display area 30, which corresponds to the reachable third-layer application, the album, and the reachable fourth-layer application is displayed in it; this is repeated until all reachable deeper applications are displayed. In the end, all the reachable deeper applications are nested and displayed in the first display area 10 in their stacking order; that is, the first display area 10 is divided into more sub-display areas, each used to display one of the reachable applications.
The embodiments of the present invention may figuratively call the method of displaying the reachable applications described above the nested display method, which can be summarized as follows: the reachable applications are respectively displayed in N sub-display areas included in the first display area, where one sub-display area displays the content that one reachable application correspondingly outputs in that sub-display area, and the sub-display area corresponding to the reachable i-th layer application is nested around the periphery of the sub-display area corresponding to the reachable (i+1)-th layer application; where i < N, i is a positive integer, and N is the number of applications reachable by the first input at the input location.
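For the rectangular case, the nested sub-display areas summarized above can be sketched as follows. The fixed per-layer margin and the clamping of the innermost size are assumptions for illustration; the embodiment leaves the sizing strategy open.

```python
def nested_regions(first_area, center, n, shrink=40):
    """Compute n rectangular sub-display areas nested around `center`.

    first_area -- (left, top, right, bottom) of the first display area
    center     -- (x, y) input position, the center of every nested rect
    n          -- number of reachable applications
    shrink     -- assumed per-layer margin in pixels
    Region 0 (the whole first display area) shows the layer-1 application;
    region i is nested inside region i-1 and shows the layer-(i+1) app.
    """
    l, t, r, b = first_area
    cx, cy = center
    regions = [first_area]
    # largest half-extents of a centered rectangle still inside first_area
    half_w = min(cx - l, r - cx)
    half_h = min(cy - t, b - cy)
    for i in range(1, n):
        w = max(half_w - i * shrink, shrink)
        h = max(half_h - i * shrink, shrink)
        regions.append((cx - w, cy - h, cx + w, cy + h))
    return regions
```

Each returned rectangle is centered on the input position (the intersection of its diagonals is the center), and each region fully contains the next, matching the nesting rule above.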
In the embodiments of the present invention, the content that a reachable application correspondingly outputs in a display area, for example display area 20, refers to the content that the reachable application outputs within the coordinate range of that display area. For example, suppose the coordinate range covered by the 2nd sub-display area in FIG. 3B is [(20, 20), (120, 100)], where (20, 20) is the coordinate of its upper-left corner and (120, 100) that of its lower-right corner; then the content that Google Maps correspondingly outputs in the 2nd sub-display area refers to the output content of Google Maps within the rectangular region bounded by [(20, 20), (120, 100)]. This example is merely illustrative and should not be construed as limiting.
In the embodiments of the present invention, the system refreshes in real time the content that each reachable deeper application correspondingly outputs in its own display area. For example, as shown in FIG. 3C, if the album is playing a video, the content displayed in the 3rd sub-display area is synchronized with the video playback. This example is merely illustrative and should not be construed as limiting.
FIG. 4A-4B are user-interface embodiments of performing the first input on stacked applications in the second stacking form, that is, the upper-layer application partially occludes the lower-layer application, as shown in FIG. 4A.
As shown in FIG. 4A-4B, the user interface may include a first display area 10 of the screen 120 and the applications stacked in the first display area 10. As shown in FIG. 1A-1C, the applications stacked in the first display area 10 may include, from top to bottom: Facebook, Google Maps, and the album. The first display area 10 may occupy the whole screen 120 or part of it, for example in a split-screen application.
As shown in FIG. 4B, the system detects the first input on the screen 120. The coordinates of the first input are (X2, Y2) and the corresponding touch pressure is Q1. From (X2, Y2) and Q1, the system can determine that the applications reachable by the first input at (X2, Y2) are Google Maps and the album, where Google Maps is the first-layer application reachable by the first input at (X2, Y2).
As shown in FIG. 4B, in response to the first input, the user interface changes as follows: in the first display area 10, the deeper application reachable by the first input at (X2, Y2), namely the album, is displayed. Specifically, a new display area, display area 20, is created in the first display area 10 centered on (X2, Y2), and the album is displayed in display area 20.
It can be understood that, if the first input further reaches lower-layer applications at (X2, Y2), those lower-layer applications may be further nested and displayed in display area 20. For the specific process of nesting and displaying the reachable applications, refer to the embodiment of FIG. 3A-3D; details are not repeated here.
FIG. 5A-5B are embodiments of the first input changing in real time.
As shown in FIG. 5A-5B, the user interface may include a first display area 10 of the screen 120 and the applications stacked in the first display area 10. As shown in FIG. 1A-1C, the applications stacked in the first display area 10 may include, from top to bottom: Facebook, Google Maps, and the album. The first display area 10 may occupy the whole screen 120 or part of it, for example in a split-screen application.
As shown in FIG. 5A, the system detects on the screen 120 that the coordinates of the first input change from (X1, Y1) to (X2, Y2). In response to the coordinate change, the deeper applications reachable by the first input at (X2, Y2) are re-determined and displayed.
As shown in FIG. 5B, the system detects on the screen 120 that the touch pressure of the first input changes. In response to the change of touch pressure, the applications displayed in the first display area 10 can be adjusted in real time.
Specifically, if the touch pressure of the first input increases, the newly reachable deeper application of the first input at the coordinates is determined, and the user interface changes as follows: a new sub-display area is created in the display area of the currently reachable bottom-layer application to display the newly reachable application. For example, as shown in FIG. 5B, the newly reachable deeper application is the album; then a new sub-display area (the 3rd sub-display area) is created in the display area (the 2nd sub-display area) corresponding to the album's upper-layer application, Google Maps, to display the album.
Specifically, if the touch pressure of the first input decreases, the no-longer-reachable deeper application of the first input at the coordinates (the album) is determined, and the user interface changes as follows: the display area (the 3rd sub-display area) corresponding to that no-longer-reachable deeper application is deleted from the display area of its upper-layer application (Google Maps). For example, as shown in FIG. 5B, the no-longer-reachable deeper application is the album; then the album's display area (the 3rd sub-display area) is deleted from the display area (the 2nd sub-display area) corresponding to the album's upper-layer application, Google Maps.
In some possible implementations, the system may detect that the coordinates and the touch pressure of the first input change at the same time. In response to the simultaneous change, the position at which the reachable applications are displayed is updated in real time and the applications displayed in the first display area 10 are adjusted in real time.
FIG. 6A-6C are operation embodiments of the user releasing the first input and adjusting the display state of the reachable applications in the first display area.
As shown in FIG. 6A-6C, the user interface may include a first display area 10 of the screen 120 and the applications stacked in the first display area 10. As shown in FIG. 1A-1C, the applications stacked in the first display area 10 may include, from top to bottom: Facebook, Google Maps, and the album. The first display area 10 may occupy the whole screen 120 or part of it, for example in a split-screen application.
As shown in FIG. 6A, the system detects on the screen 120 that the first input is released. For the definition and explanation of the release of the first input, refer to the foregoing content; details are not repeated here. In response to the release, within a specified delay time after the release, for example 2 seconds, the reachable applications remain displayed in the first display area 10; that is, the user interface does not change during the specified delay time. When the specified delay time expires, the display of the reachable applications is canceled.
As shown in the left drawing of FIG. 6B, within the specified delay time, for example 2 seconds, the system detects on the screen 120 a swipe operation directed at the first display area 10, the target of which is the display area corresponding to the album. In response to the swipe operation, the user interface changes as follows: as shown in the right drawing of FIG. 6B, the album's display area (the 3rd sub-display area) is deleted from the display area (the 2nd sub-display area) corresponding to the album's upper-layer application, Google Maps. It can be seen that the operation effect corresponding to the swipe operation is to cancel the display of one of the reachable applications.
As shown in the left drawing of FIG. 6C, within the specified delay time, for example 2 seconds, the system detects on the screen 120 a selection operation, for example a tap, directed at the first display area 10, the target of which is the display area corresponding to the album. In response to the selection operation, the user interface changes as follows: as shown in the right drawing of FIG. 6C, the album is adjusted to be the top-layer application in the first display area 10, and the original top-layer application, Facebook, is adjusted to an arbitrary layer, for example the layer where the album was. It can be seen that the operation effect corresponding to the selection operation is to bring one of the reachable applications to the top layer for display.
To distinguish it from the first input, the embodiments of the present invention call the operation, performed within the specified delay time, for adjusting the display state of the reachable applications in the first display area the second input. It should be noted that the second input and its corresponding operation effects are not limited to the above embodiments and may be set according to specific requirements in practice, which is not limited here.
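The two second-input effects described for FIG. 6B and FIG. 6C can be sketched as the following dispatch. The operation names 'swipe' and 'tap' and the swap-with-top behavior are assumptions chosen to mirror the examples; the embodiment explicitly allows other operation types and effects.

```python
def handle_second_input(stack, op, target):
    """Adjust the display state of the reachable applications.

    stack  -- reachable applications, top layer first
    op     -- 'swipe' cancels the target layer; 'tap' brings it to the top,
              swapping it with the original top-layer application
    target -- the application the second input acts on
    Returns the new stack without mutating the input.
    """
    s = list(stack)
    if target not in s:
        return s
    if op == 'swipe':                      # cancel displaying that layer
        s.remove(target)
    elif op == 'tap':                      # move it to the top layer
        i = s.index(target)
        s[0], s[i] = s[i], s[0]            # old top takes the target's layer
    return s
```

Swiping the album in a [Facebook, Google Maps, Album] stack leaves [Facebook, Google Maps]; tapping it yields [Album, Google Maps, Facebook], with Facebook moved to the album's former layer.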
Some optional implementations of the UI involved in the embodiments of the present invention are further introduced below.
The shape and size of the sub-display areas in the first display area 10 are not limited by FIG. 3A-3D and FIG. 4A-4B.
As shown in FIG. 7A, the sub-display area may also be circular; as shown in FIG. 7B, it may also be a sector. Here, a sub-display area denoted by a number may be used to display the content that one reachable deeper application correspondingly outputs in that sub-display area. For example, the annular sub-display area "1" displays the content that the reachable second-layer application correspondingly outputs in annular sub-display area "1", and the annular sub-display area "2" displays the content that the reachable third-layer application correspondingly outputs in annular sub-display area "2". This example is merely illustrative and should not be construed as limiting. In practice, the sub-display area may also have other shapes not shown in the drawings, which is not limited here.
It should be understood that, if the embodiment of FIG. 7A is used to display the reachable deeper applications, when the number of reachable deeper applications is large, the sub-display areas cannot expand indefinitely because they are limited by the screen; the width allotted to each annular sub-display area will therefore be very small, making it hard for the user to see its content clearly. To solve this problem, a sub-display area may be divided into multiple parts, each used to display one application.
For example, as shown in FIG. 7C, the outer nested sub-display area may be divided into 4 parts, each used to display one application. In this way, even if there are many reachable deeper applications, the nesting depth of the sub-display areas will not be large, and the width of each layer can still clearly show the content of the corresponding application. FIG. 7C merely shows one implementation of the embodiments of the present invention; in practice, the division strategy may be formulated according to specific requirements, which is not limited here.
In some possible implementations, to clearly distinguish the applications displayed in the sub-display areas, a corresponding application icon may be set for each sub-display area. For example, as shown in FIG. 8, the Google Maps application icon may be set at the upper-right corner of the sub-display area (the 2nd sub-display area) corresponding to Google Maps, and the album's application icon at the upper-right corner of the album's sub-display area (the 3rd sub-display area). This example is merely illustrative and should not be construed as limiting.
In the embodiments of the present invention, the size of the 2nd sub-display area may be a fixed value. The width of the sub-display areas nested within the 2nd sub-display area may be adaptively adjusted according to the number of reachable deeper applications. Specifically, the more reachable deeper applications there are, the smaller the width of each sub-display area; the fewer there are, the larger the width.
In some possible implementations, the size of the 2nd sub-display area may also be related to the value of the non-coordinate information corresponding to the first input, such as the touch pressure. Specifically, the greater the touch pressure, the larger the 2nd sub-display area.
In addition, it should be noted that nesting and displaying the reachable applications centered on the coordinates of the first input may include the following situations: if the sub-display area is a rectangle, the center may be the intersection of the rectangle's two diagonals; if the sub-display area is a circle or an ellipse, the center may be the center of the circle or ellipse; if the sub-display area is a sector, the center may be the vertex of the sector. The strategy for determining the "center" of the sub-display area is not limited in the embodiments of the present invention.
As shown in FIG. 9, in some possible implementations, the first display area 10 may be used to display only the bottom-layer application reachable by the first input at the coordinates, specifically as follows:
When the touch pressure corresponding to the first input is 0, the only application reachable by the first input at its coordinates is Facebook. In this case Facebook is both the reachable top-layer application and the reachable bottom-layer application. In response to the first input with touch pressure 0, only Facebook is displayed in the first display area 10.
When the touch pressure corresponding to the first input increases to Q1, the applications reachable by the first input at its coordinates include, from top to bottom: Facebook and Google Maps. In this case Google Maps is the reachable bottom-layer application. In response to the first input with touch pressure Q1, only Google Maps is displayed in the first display area 10; it should be understood that only Google Maps is then visible to the user in the user interface.
When the touch pressure corresponding to the first input further increases to Q2, the applications reachable by the first input at its coordinates include, from top to bottom: Facebook, Google Maps, and the album. In this case the album is the reachable bottom-layer application. In response to the first input with touch pressure Q2, only the album is displayed in the first display area 10; it should be understood that only the album is then visible to the user in the user interface.
By analogy, if the first input further reaches a lower-layer application at its coordinates, only that reachable lower-layer application may be displayed in the first display area 10. It can be understood that, if the user wants to view a higher-layer application, the touch pressure corresponding to the first input can be decreased.
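The bottom-layer-only mode of FIG. 9 reduces to selecting a single application once the reachable count is known; a minimal sketch, assuming the stack is given top layer first:

```python
def bottom_layer_app(stacked, n):
    """Return the one application shown in bottom-layer-only mode.

    stacked -- applications at the input position, top layer first
    n       -- number of applications reachable by the first input
    Only the deepest reachable layer (layer n) is displayed.
    """
    if not stacked or n <= 0:
        return None
    return stacked[min(n, len(stacked)) - 1]
```

As the pressure rises and N grows from 1 to 3 over [Facebook, Google Maps, Album], the displayed application walks down the stack; decreasing the pressure walks it back up.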
The following introduces one implementation of the terminal device involved in the embodiments of the present invention. The terminal device supports multithreaded operation and can run multiple applications or services at the same time. The applications supported by the terminal device may include: social applications such as Facebook; image management applications such as an album; map applications such as Google Maps; and browsers such as Safari and Google Chrome. These applications may share a common input/output device: a touch screen, which receives the user's touch operations and displays the applications' output content. In some possible embodiments, the common input device of the multiple applications may also be a gesture input apparatus, such as a gesture sensor.
FIG. 10 is a structural block diagram of one implementation of the terminal device 100. As shown in FIG. 10, the terminal 100 may include: a baseband chip 110, a memory 115 including one or more computer-readable storage media, a radio frequency (RF) module 116, and a peripheral system 117. These components may communicate over one or more communication buses 114.
The peripheral system 117 is mainly used to implement the interaction function between the terminal 100 and the user/external environment, and mainly includes the input/output apparatuses of the terminal 100. In a specific implementation, the peripheral system 117 may include: a touch screen controller 118, a camera controller 119, an audio controller 120, and a sensor management module 121, where each controller may be coupled to its corresponding peripheral device, such as the touch screen 123, the camera 124, the audio circuit 125, and the sensors 126. In some embodiments, a gesture sensor among the sensors 126 may be used to receive gesture control operations input by the user. A pressure sensor among the sensors 126 may be disposed under the touch screen 123 and may be used to collect the touch pressure exerted on the touch screen 123 when the user inputs a touch operation through the touch screen 123. It should be noted that the peripheral system 117 may also include other I/O peripherals.
The baseband chip 110 may integrate: one or more processors 111, a clock module 112, and a power management module 113. The clock module 112 integrated in the baseband chip 110 is mainly used to generate the clocks required by the processor 111 for data transmission and timing control. The power management module 113 integrated in the baseband chip 110 is mainly used to provide a stable, highly accurate voltage for the processor 111, the radio frequency module 116, and the peripheral system.
The radio frequency (RF) module 116 is used to receive and send radio frequency signals and mainly integrates the receiver and transmitter of the terminal 100. The RF module 116 communicates with communication networks and other communication devices through radio frequency signals. In a specific implementation, the RF module 116 may include, but is not limited to: an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chip, a SIM card, a storage medium, and so on. In some embodiments, the RF module 116 may be implemented on a separate chip.
The memory 115 is coupled to the processor 111 and is used to store various software programs and/or multiple sets of instructions. In a specific implementation, the memory 115 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. The memory 115 may store an operating system, for example an embedded operating system such as ANDROID, IOS, WINDOWS, or LINUX. The memory 115 may also store a network communication program that can communicate with one or more additional devices, one or more terminal devices, and one or more network devices. The memory 115 may also store a user interface program that can vividly display the content of applications through a graphical operation interface and receive the user's control operations on applications through input controls such as menus, dialog boxes, and buttons.
The memory 115 may also store one or more programs. As shown in FIG. 10, these programs may include: social applications such as Facebook; image management applications such as an album; map applications such as Google Maps; and browsers such as Safari and Google Chrome.
Taking a touch operation as an example, FIG. 11 illustrates, from the internal processing flow of the terminal 100, the two main user-operation processing phases involved in the embodiments of the present invention. The first phase (1-5) mainly describes how the terminal 100 processes the user operation for viewing stacked applications, that is, the first input; the second phase (6-10) mainly describes how the terminal 100 processes the user operation for adjusting the display state of the stacked applications already displayed, that is, the second input. Specifically:
1. Under the condition that multiple applications running on the terminal 100 are stacked and displayed on an input apparatus such as the touch screen 123, the touch screen 123 detects a touch operation and notifies the processor 111. The processor 111 determines that this touch operation is the user operation for viewing stacked applications, that is, the first input.
2. The touch screen 123 sends the touch-point information of the touch operation in step 1 above to the processor 111. The touch-point information includes location information and non-location information, where the location information is coordinates and the non-location information is non-coordinate information. The non-coordinate information may be the touch pressure collected by the pressure sensor under the touch screen 123, the touch area collected by the touch screen 123, or the touch duration collected by a timer, etc. The processor 111 may acquire the corresponding non-location information through the pressure sensor, the touch screen, the timer, and so on, respectively.
3. The processor 111 combines the applications currently stacked and displayed on the touch screen 123 with the touch-point information obtained in step 2 above, including the coordinates and non-coordinate information, to analyze which of the stacked applications need to be displayed. Here, the applications that can be displayed are the applications reachable at the touch point, that is, at the coordinates, by the first input in step 1 above.
Specifically, the processor 111 may instruct the touch screen 123 to nest and display the reachable applications centered on the coordinates. For the implementation of nesting and displaying the reachable applications, refer to the foregoing embodiments; details are not repeated here.
4. The processor 111 sends, to an output apparatus such as the touch screen 123, an instruction to nest and display the stacked applications, triggering the touch screen 123 to display the content that each reachable application outputs in its corresponding display area.
It should be noted that, after detecting the first input, the processor 111 may repeatedly perform steps 2-4 above to acquire the real-time touch-point information of the first input, because the touch-point information of the first input, such as the coordinates and non-coordinate information, may change in real time.
5. The touch screen 123 detects that the touch point of the touch operation in step 1 above leaves the screen and notifies the processor 111. From the foregoing embodiments, if the touch screen 123 can still sample touch-point information at sampling time T but cannot at sampling time T+1, it is determined that the touch point left the touch screen 123 at sampling time T+1. Here, the touch point leaving the screen means that the first input is released.
6. The processor 111 sends, to the touch screen 123, an instruction to maintain the display state of the stacked applications within a specified delay time, instructing the touch screen 123 to maintain the display state of the stacked applications within the specified delay time after the touch point leaves the screen.
Suppose the touch point leaves the screen at sampling time T+1. Regarding leaving the screen, there are the following two situations:
In the first situation, the touch point leaves the screen slowly. During the leaving process, the value of the non-coordinate information of the first input, such as the touch pressure, decreases slowly, and at sampling time T the value has already decreased to 0. That is, at sampling time T, the first input has no reachable application at the coordinates, the deeper applications originally displayed on the touch screen 123 have already disappeared, and the touch screen 123 returns to the display state it had before receiving the first input.
In the second situation, the touch point leaves the screen quickly. During the leaving process, the value of the non-coordinate information of the first input, such as the touch pressure, decreases rapidly; at sampling time T the value has not decreased to 0 and may even remain at its magnitude from before the leaving began. That is, at sampling time T, some applications are still reachable by the first input at the coordinates. For this second situation, within the specified delay time after the touch point leaves the screen, the touch screen 123 maintains the display state it had before the touch point left.
7. Within the specified delay time after the touch point leaves the screen, the touch screen 123 detects a new touch operation, such as a tap or a swipe, and notifies the processor 111. The processor 111 determines that this new touch operation is the second input.
8. The processor 111 determines which application the new touch operation acts on, and the touch effect corresponding to the new touch operation.
9. The processor 111 sends, to the touch screen 123, an instruction to execute the touch effect, instructing the touch screen 123 to execute it. For the specific implementation of steps 8-9, refer to the foregoing embodiments; details are not repeated here.
10. If the touch screen 123 does not detect a new touch operation within the specified delay time after the touch point leaves the screen, then, when the specified delay time expires, the processor 111 sends, to the touch screen 123, an instruction to cancel the nested display of the stacked applications, to notify the touch screen 123 to cancel the nested display of the content that each reachable application outputs in its corresponding display area.
It should be noted that the input apparatus in FIG. 11 may also be a gesture sensor; the user operation it detects is then correspondingly a gesture operation, and the touch point is correspondingly the action point of the gesture operation. In practice, the input apparatus in FIG. 11 may also be another type of input apparatus, which is not limited here.
It should be understood that the terminal 100 is merely one example provided by the embodiments of the present invention, and the terminal 100 may have more or fewer components than shown, may combine two or more components, or may be implemented with a different configuration of components.
Based on the terminal 100 described in the embodiment of FIG. 10, a method for viewing applications provided by an embodiment of the present invention is introduced below.
Referring to FIG. 12, FIG. 12 is a schematic flowchart of a method for viewing applications according to an embodiment of the present invention. In the embodiment of FIG. 12, the first input is a touch operation acting on the touch screen of the terminal, and the non-coordinate information of the first input is touch pressure. As shown in FIG. 12, the method includes:
S101: The touch screen receives the first input. Specifically, multiple running applications are stacked and displayed in the first display area of the touch screen. The first display area may occupy the whole touch screen or part of it, for example in a split-screen application. The first input is used to view the applications stacked and displayed in the first display area.
S103: The processor acquires, through the touch screen, the input location corresponding to the first input, and acquires, through the pressure sensor disposed under the touch screen, the touch pressure of the first input at the input location.
S105: According to the input location of the first input on the touch screen and the touch pressure acquired by the pressure sensor, the processor can analyze the applications reachable by the first input at the input location.
S107: The processor instructs the touch screen to display the applications reachable by the first input at the coordinates.
Specifically, the applications reachable by the first input at the coordinates may be displayed in the first display area of the touch screen.
In the embodiments of the present invention, the processor may first determine, according to the input location, which applications are stacked at the input location, and then, from the applications stacked at the input location, determine, according to the non-location information, such as the touch pressure Q detected by the pressure sensor, the applications reachable by the first input at the input location.
Referring to the foregoing embodiments, for the first stacking form shown in FIG. 3A-3D, since the applications stacked in the first display area 10 all occupy the entire first display area 10, the stacked applications at any coordinates in the user interface are the same, namely all the applications stacked in the first display area 10. However, for the second stacking form shown in FIG. 4A-4B, since an application stacked in the first display area 10 may occupy only part of it, the stacked applications at different positions in the user interface may differ. For example, as shown in FIG. 4B, the stacked applications at coordinates (X2, Y2) are Google Maps and the album, and do not include the top-layer application Facebook.
Specifically, the processor may determine the applications reachable by the first input at the input location from the non-location information, such as the touch pressure Q detected by the pressure sensor, by the following method:
Step 1: Calculate the number N of reachable applications according to the touch pressure Q acquired by the pressure sensor and the logical distance D between two adjacent layers of applications.
Step 2: Determine the 1st to Nth layers of the applications stacked at the input location as the applications reachable by the first input at the input location.
Here, for how the number N is calculated from the touch pressure Q and the logical distance D, and for the definition of the applications reachable by the first input at the input location, refer to the relevant content of the foregoing embodiments; details are not repeated here.
It should be noted that the value of the non-location information, such as touch pressure, that the pressure sensor disposed under the touch screen can recognize may have an upper limit, denoted QLimit. According to the method, described in the foregoing embodiments, of calculating the number N of applications reachable by the first input at the coordinates, N also has an upper limit, denoted QLimit/D. It should be understood that, if the current number M of stacked applications is very large, that is, M is greater than QLimit/D, some of the stacked applications cannot be reached by the first input and thus cannot be displayed in the first display area of the touch screen 123 for the user to see.
To ensure that every stacked application has a chance to be reached by the first input, the logical distance D between two adjacent layers of applications may be dynamic: D = QLimit/M. That is, the logical distance D may be determined according to the current number M of stacked applications.
In the embodiments of the present invention, after determining the applications reachable by the first input at the input location, the processor may display the reachable applications in the following implementations.
In some implementations, the processor may instruct the touch screen to nest and display the reachable applications in the first display area of the touch screen, centered on the input location. The nested display method can be summarized as follows: the reachable applications are respectively displayed in N sub-display areas included in the first display area, where one sub-display area displays the content that one reachable application correspondingly outputs in that sub-display area, and the sub-display area corresponding to the reachable i-th layer application is nested around the periphery of the sub-display area corresponding to the reachable (i+1)-th layer application, where i < N, i is a positive integer, and N is the number of applications reachable by the first input at the input location.
Here, for the nested display effect of the reachable applications, and for the definition and explanation of the content that a reachable application correspondingly outputs in a display area, refer to the foregoing embodiments; details are not repeated here.
It should be noted that creating a display area for each reachable application centered on the input location may include the following situations: if the sub-display area is a rectangle, the center may be the intersection of the rectangle's two diagonals; if the sub-display area is a circle or an ellipse, the center may be the center of the circle or ellipse; if the sub-display area is a sector, the center may be the vertex of the sector. The strategy for determining the "center" is not limited in the embodiments of the present invention.
In the embodiments of the present invention, the processor triggers the touch screen in real time to refresh the content that each reachable application correspondingly outputs in its own display area.
In some implementations, referring to the foregoing embodiment of FIG. 9, the processor may also instruct the touch screen to display, in the first display area, only the bottom-layer application reachable by the first input at the coordinates. For specific examples, refer to the foregoing embodiments; details are not repeated here.
In the embodiments of the present invention, the processor may also monitor the real-time changes of the first input through the touch screen and the pressure sensor.
To monitor such changes, as shown in FIG. 12, the method for viewing applications provided by the embodiments of the present invention further includes: S109: The processor monitors, through the touch screen, whether the first input changes; if it changes, the processor may repeatedly perform S103-S107 to adjust the display of the stacked applications in time according to the real-time state of the first input.
If the touch screen detects that the input location of the first input changes, the processor may determine the applications reachable by the first input at the new input location, and may then instruct the touch screen to nest and display, in the first display area, the applications reachable by the first input at the new input location, centered on the new input location. For how this nested display centered on the new input location is performed, refer to the foregoing embodiments; details are not repeated here.
If the pressure sensor detects that the value of the non-location information of the first input, such as the touch pressure, increases, the processor may determine the newly reachable application of the first input at the input location, and then instruct the touch screen to create a new sub-display area in the display area of the currently reachable bottom-layer application to display the newly reachable application.
If the pressure sensor detects that the value of the non-location information of the first input, such as the touch pressure, decreases, the processor may determine the no-longer-reachable application of the first input at the input location, and then instruct the touch screen to cancel, in the display area of that application's upper-layer application, the display area corresponding to the no-longer-reachable deeper application.
In the embodiments of the present invention, the real-time change of the first input may include a change of the input location, a change of the non-location information, or both at the same time.
In the embodiments of the present invention, the first input may be released; for the definition and explanation of the release of the first input, refer to the foregoing embodiments. As shown in FIG. 12, the method for viewing applications provided by the embodiments of the present invention may further include: S111: The processor monitors, through the touch screen, whether the first input is released. If it is released, then within a specified delay time, for example 2 seconds, the processor may instruct the touch screen to keep the reachable applications displayed in the first display area, as shown in S115. This allows the user, within the specified delay time, to perform an operation for adjusting the display state of the reachable applications in the first display area, such as canceling the display of a particular application; the embodiments of the present invention call this operation the second input. If the specified delay time expires, the processor may instruct the touch screen to cancel the display of the reachable applications in the first display area, as shown in S119.
Specifically, as shown in S117, if the touch screen detects the second input within the specified delay time, the processor may, according to the operation type of the second input, instruct the touch screen to adjust the display state, in the first display area of the touch screen, of the applications reachable by the first input at the input location. For the specific implementation of the second input and its operation effects, refer to the foregoing embodiments; details are not repeated here.
It should be noted that, not limited to FIG. 12, if the first input is a touch operation detected by the touch screen, the non-location information corresponding to the first input may also be the touch area collected by the processor through the touch screen, or the touch time recorded by the processor through a timer, etc., which is not limited here.
It should be noted that, not limited to the touch operation detected by the touch screen described above, the first input may also be a gesture operation detected by a gesture sensor. Correspondingly, the non-location information corresponding to the first input may be the gesture depth acquired by the processor through the gesture sensor, and the release of the first input may mean that the user's hand, such as a finger or an arm, leaves the sensing range of the terminal's gesture sensor. In practice, the first input may also be another type of input operation, which is not limited here.
It should be noted that, for content not mentioned in the embodiment of FIG. 12, refer to the foregoing embodiments of FIG. 1-11; details are not repeated here.
FIG. 13 shows a functional block diagram of a terminal according to an embodiment of the present invention. The functional blocks of the terminal may implement the solutions of the present invention by hardware, software, or a combination of hardware and software. A person skilled in the art should understand that the functional blocks described in FIG. 13 may be combined or separated into several sub-blocks to implement the solutions of the present invention. Accordingly, the content described above may support any possible combination or separation or further definition of the functional modules described below. As shown in FIG. 13, the terminal 200 may include: an input unit 201, a processing unit 203, and an output unit 205, where:
the input unit 201 is configured to receive the first input;
the processing unit 203 is configured to acquire the input location corresponding to the first input and the non-location information corresponding to the first input at the input location;
the processing unit 203 is further configured to analyze, according to the input location and the non-location information, the applications reachable by the first input at the input location;
the output unit 205 is configured to display the applications reachable by the first input at the input location.
Specifically, the output unit 205 may be a touch display, for example the touch screen 123 in FIG. 10. The first input is used to view the applications stacked and displayed in the first display area of the output unit 205.
Specifically, the input unit 201 may be the touch screen 123 in FIG. 10, the gesture sensor in FIG. 10, or another input apparatus. Correspondingly, the first input may be a touch operation detected by the touch screen 123, a gesture input detected by the gesture sensor, or another type of user operation.
Here, for how the processing unit 203 determines the applications reachable by the first input at the input location, refer to the foregoing embodiments; details are not repeated here.
In some embodiments, the output unit 205 may nest and display the reachable applications. Specifically, the output unit 205 may be configured to: respectively display the reachable applications in the N sub-display areas included in the first display area, where one sub-display area displays the content that one reachable application correspondingly outputs in that sub-display area, and the sub-display area corresponding to the reachable i-th layer application is nested around the periphery of the sub-display area corresponding to the reachable (i+1)-th layer application.
Here, i < N, i is a positive integer, and N is the number of applications reachable by the first input at the input location.
In some possible implementations, referring to the foregoing embodiments, the output unit 205 may be configured to display, in the first display area, only the bottom-layer application reachable by the first input at the coordinates.
In some possible implementations, if the input unit 201 detects that the input location of the first input changes, the processing unit 203 may determine the applications reachable by the first input at the new input location, and the output unit 205 may then nest and display, in the first display area, the applications reachable by the first input at the new input location, centered on the new input location.
In some possible implementations, if the input unit 201 detects that the value of the non-location information of the first input, such as the touch pressure, increases, the processing unit 203 may determine the newly reachable application of the first input at the input location, and the output unit 205 may create a new sub-display area in the display area of the currently reachable bottom-layer application to display the newly reachable application.
In some possible implementations, if the input unit 201 detects that the value of the non-location information of the first input, such as the touch pressure, decreases, the processing unit 203 may determine the no-longer-reachable application of the first input at the input location, and the output unit 205 may cancel, in the display area of that application's upper-layer application, the display area corresponding to the no-longer-reachable deeper application.
In some possible implementations, the input unit 201 may also be used to monitor whether the first input is released. If it is released, then within a specified delay time, for example 2 seconds, the output unit 205 keeps the reachable applications displayed in the first display area.
Within the specified delay time, the input unit 201 may also detect the second input. In response to the second input, the output unit 205 may, according to the operation type of the second input, adjust the display state, in the first display area, of the applications reachable by the first input at the input location. For the specific implementation of the second input and its operation effects, refer to the foregoing embodiments; details are not repeated here.
It can be understood that, for the specific implementation of the functional blocks included in the terminal 200 of FIG. 13, refer to the foregoing embodiments; details are not repeated here.
By implementing the method embodiments of the present invention, the input location and non-location information corresponding to the user operation for viewing stacked applications, that is, the first input, are acquired, the applications reachable by the first input at the input location are determined according to the input location and the non-location information, and finally the reachable applications are displayed in the user interface. The above solution simplifies the operations a user needs to view an occluded application among stacked applications, and allows the output content of multiple stacked applications to be displayed at the same time, making it convenient for the user to view the stacked applications.
A person skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media containing computer-usable program code, including but not limited to magnetic disk storage and optical storage.
The present invention is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, may be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps is performed on the computer or other programmable device to produce computer-implemented processing, such that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Obviously, a person skilled in the art may make various modifications and variations to the present invention without departing from its spirit and scope. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalent technologies, the present invention is also intended to include them.

Claims (36)

  1. 一种查看应用程序的方法,其特征在于,包括:
    接收第一输入;所述第一输入用于查看层叠显示的多个正在运行的应用程序;
    获取所述第一输入对应的输入位置,以及所述第一输入在所述输入位置处对应的非位置信息;
    根据所述输入位置和所述非位置信息,确定所述第一输入在所述输入位置处可达的应用程序;
    显示所述第一输入在所述输入位置处可达的应用程序。
  2. 如权利要求1所述的方法,其特征在于,所述根据所述输入位置和所述非位置信息,确定所述第一输入在所述输入位置处可达的应用程序,包括:
    根据所述输入位置确定所述输入位置处的层叠应用程序;
    从所述输入位置处的层叠应用程序中,根据所述非位置信息确定所述第一输入在所述输入位置处可达的应用程序。
  3. 如权利要求2所述的方法,其特征在于,所述从所述输入位置处的层叠应用程序中,根据所述非位置信息确定出所述第一输入在所述输入位置处可达的应用程序,包括:
    根据所述非位置信息对应的数值大小Q以及相邻两层应用程序之间的逻辑距离D,计算出所述可达的应用程序的数量N;所述逻辑距离D用于衡量所述第一输入在所述输入位置处可达的应用程序的数量;
    将所述输入位置处的层叠应用程序中的第1层至第N层应用程序确定为所述第一输入在所述输入位置处可达的应用程序;
    其中,Q,D是正数,N是正整数。
  4. 如权利要求3所述的方法,其特征在于,所述相邻两层应用程序之间的逻辑距离D=QLimit/M,其中,QLimit是所述非位置信息能够被识别的上限值,M是当前层叠显示的应用程序的数量;其中,QLimit,M是正数。
  5. 如权利要求1-4中任一项所述的方法,其特征在于,所述多个正在运行的应用程序层叠显示在第一显示区域中。
  6. 如权利要求5所述的方法,其特征在于,所述显示所述第一输入在所述输入位置处可达的应用程序,包括:
    在所述第一显示区域包括的N个子显示区域中分别显示所述可达的应用程序,其中,一个子显示区域显示一个可达的应用程序相应输出在所述子显示区域中的内容,可达的第i层应用程序对应的子显示区域嵌套在可达第i+1层应用程序对应的子显示区域的外围;
    其中,i<N,i是正整数,N是所述第一输入在所述输入位置处可达的应用程序的数量。
  7. 如权利要求5所述的方法,其特征在于,所述显示所述第一输入在所述输入位置处可达的应用程序,包括:在所述第一显示区域中仅显示所述第一输入在所述输入位置处可达的底层应用程序。
  8. 如权利要求5-6中任一项所述的方法,其特征在于,还包括:当所述非位置信息对应的数值大小增大时,确定所述第一输入在所述输入位置处新增的可达的应用程序,并在可达的底层应用程序对应的显示区域中新建一个子显示区域用来显示所述新增的可达的应用程序。
  9. 如权利要求5-6中任一项所述的方法,其特征在于,还包括:当所述非位置信息对应的数值大小减小时,确定所述第一输入在所述输入位置处减少的可达的应用程序,并在所述减少的可达的应用程序的上一层应用程序对应的显示区域中删除所述减少的可达的应用程序的显示区域。
  10. 如权利要求1-9中任一项所述的方法,其特征在于,还包括:当所述第一输入对应的输入位置发生变化时,确定所述第一输入在新的输入位置处可达的应用程序,并显示所述第一输入在所述新的输入位置处可达的应用程序。
  11. 如权利要求1-10中任一项所述的方法,其特征在于,还包括:当所述第一输入被释放时,在指定的延迟时间内,保持显示所述第一输入在所述输入位置处可达的应用程序。
  12. 如权利要求11所述的方法,其特征在于,还包括:在所述指定的延迟时间内,接收第二输入,并响应所述第二输入,根据所述第二输入的操作类型,调整所述第一输入在所述输入位置处可达的应用程序的显示状态。
  13. 一种终端设备上的图形用户界面,所述终端设备具有触摸屏、存储器和用以执行存储于所述存储器中的一个或一个以上程序的一个或一个以上的处理器,所述图形用户界面包括:所述触摸屏的第一显示区域,以及层叠显示在所述第一显示区域中的多个正在运行的应用程序;其中:
    响应于所述触摸屏检测到针对所述第一显示区域的第一输入,显示所述第一输入在所述第一输入对应的输入位置处可达的应用程序;其中,所述第一输入在所述输入位置处可达的应用程序由所述输入位置和所述第一输入在所述输入位置处对应的非位置信息决定。
  14. 如权利要求13所述的图形用户界面,其特征在于,所述显示所述第一输入在所述第一输入对应的输入位置处可达的应用程序,包括:
    在所述第一显示区域包括的N个子显示区域中分别显示所述可达的应用程序,其中,一个子显示区域显示一个可达的应用程序相应输出在所述子显示区域中的内容,可达的第i层应用程序对应的子显示区域嵌套在可达第i+1层应用程序对应的子显示区域的外围;
    其中,i<N,i是正整数,N是所述第一输入在所述输入位置处可达的应用程序的数量。
  15. 如权利要求13所述的图形用户界面,其特征在于,所述显示所述第一输入在所述第一输入对应的输入位置处可达的应用程序,包括:在所述第一显示区域中仅显示所述第一输入在所述输入位置处可达的底层应用程序。
  16. 如权利要求13-15中任一项所述的图形用户界面,其特征在于,还包括:响应于检测到的所述第一输入对应的输入位置的变化,在所述第一显示区域中,显示所述第一输入在新的输入位置处可达的应用程序。
  17. 如权利要求13-16中任一项所述的图形用户界面,其特征在于,还包括:响应于检测到的所述第一输入在所述输入位置处对应的非位置信息的数值大小的增大,在所述可达的底层应用程序对应的显示区域中嵌套显示新增的可达的应用程序。
  18. 如权利要求13-16中任一项所述的图形用户界面,其特征在于,还包括:响应于检测到的所述第一输入在所述输入位置处对应的非位置信息的数值大小的减小,在减少的可达的应用程序的上一层应用程序对应的显示区域中取消显示所述减少的可达的应用程序。
  19. 如权利要求13-18中任一项所述的图形用户界面,其特征在于,还包括:响应于检测到的所述第一输入被释放,在指定的延迟时间内,保持所述可达的应用程序仍然显示在所述第一显示区域中。
  20. 如权利要求19所述的图形用户界面,其特征在于,还包括:在所述指定的延迟时间内,响应于检测到的针对第一显示区域的第二输入,调整所述可达的应用程序在所述第一显示区域中的显示状态。
  21. 一种存储计算机指令的可读非易失性存储介质,所述计算机指令被具有触摸屏的终端设备执行以实现以下步骤:
    接收第一输入;所述第一输入用于查看层叠显示的多个正在运行的应用程序;
    获取所述第一输入对应的输入位置,以及所述第一输入在所述输入位置处对应的非位置信息;
    根据所述输入位置和所述非位置信息,确定所述第一输入在所述输入位置处可达的应用程序;
    显示所述第一输入在所述输入位置处可达的应用程序。
  22. 一种终端,其特征在于,包括:触摸屏,处理器,其中:
    所述触摸屏用于层叠显示多个正在运行的应用程序,检测针对所述应用程序的第一输入;
    所述处理器用于获取所述第一输入对应的输入位置,以及所述第一输入在所述输入位置处对应的非位置信息;
    所述处理器还用于根据所述输入位置和所述非位置信息确定所述第一输入在所述输入位置处可达的应用程序,并指示所述触摸屏显示所述可达的应用程序;
    所述触摸屏用于显示所述第一输入在所述输入位置处可达的应用程序。
  23. 如权利要求22所述的终端,其特征在于,所述非位置信息为触控压力,所述终端还包括:设置在所述触摸屏下方的压力传感器;所述压力传感器用于检测所述第一输入在所述输入位置处的触控压力;
    所述处理器通过所述压力传感器获取所述第一输入在所述输入位置处的触控压力。
  24. 如权利要求22-23中任一项所述的终端,其特征在于,所述非位置信息为触控时长,所述终端还包括:计时器;所述计时器用于检测所述第一输入在所述输入位置处的触控时长;
    所述处理器具体用于通过所述计时器获取所述第一输入在所述输入位置处的触控时长。
  25. 如权利要求22-24中任一项所述的终端,其特征在于,所述非位置信息为触控面积,所述触摸屏用于检测所述第一输入在所述输入位置处的触控面积,所述处理器通过所述触摸屏获取所述触控面积。
  26. 如权利要求22-25中任一项所述的终端,其特征在于,所述处理器具体用于根据所述输入位置确定出所述输入位置处的层叠应用程序,并从所述输入位置处的层叠应用程序中,根据所述非位置信息确定出所述第一输入在所述输入位置处可达的应用程序。
  27. 如权利要求26所述的终端,其特征在于,所述处理器具体用于:
    根据所述非位置信息对应的数值大小Q以及相邻两层应用程序之间的逻辑距离D,计算出所述可达的应用程序的数量N;所述逻辑距离D用于衡量所述第一输入在所述输入位置处可达的应用程序的数量;
    将所述输入位置处的层叠应用程序中的第1层至第N层应用程序确定为所述第一输入在所述输入位置处可达的应用程序;
    其中,Q,D是正数,N是正整数。
  28. 如权利要求27所述的终端,其特征在于,所述相邻两层应用程序之间的逻辑距离D=QLimit/M,其中,QLimit是所述非位置信息能够被识别的上限值,M是当前层叠显示的应用程序的数量;其中,QLimit,M是正数。
  29. 如权利要求22-28中任一项所述的终端,其特征在于,所述触摸屏具体用于将所述多个正在运行的应用程序层叠显示在所述触摸屏的第一显示区域中。
  30. 如权利要求29所述的终端,其特征在于,所述触摸屏具体用于:在所述第一显示区域包括的N个子显示区域中分别显示所述可达的应用程序,其中,一个子显示区域显示一个可达的应用程序相应输出在所述子显示区域中的内容,可达的第i层应用程序对应的子显示区域嵌套在可达第i+1层应用程序对应的子显示区域的外围;
    其中,i&lt;N,i是正整数,N是所述第一输入在所述输入位置处可达的应用程序的数量。
  31. 如权利要求29所述的终端,其特征在于,所述触摸屏具体用于在所述第一显示区域中仅显示所述第一输入在所述输入位置处可达的底层应用程序。
  32. 如权利要求29-30中任一项所述的终端,其特征在于,所述触摸屏还用于检测到所述非位置信息对应的数值大小的增大;所述处理器还用于响应所述数值大小的增大,并确定所述第一输入在所述输入位置处新增的可达的应用程序;所述触摸屏还用于在当前可达的底层应用程序对应的显示区域中新建一个子显示区域用来显示所述新增的可达的应用程序。
  33. 如权利要求29-30中任一项所述的终端,其特征在于,所述触摸屏还用于检测到所述非位置信息对应的数值大小的减小;所述处理器还用于响应所述数值大小的减小,并确定所述第一输入在所述输入位置处减少的可达的应用程序;所述触摸屏还用于在所述减少的可达的应用程序的上一层应用程序对应的显示区域中删除所述减少的可达的应用程序的显示区域。
  34. 如权利要求22-33中任一项所述的终端,其特征在于,所述触摸屏还用于检测所述第一输入对应的输入位置的变化;所述处理器还用于响应所述输入位置的变化,并确定所述第一输入在新的输入位置处可达的应用程序;所述触摸屏还用于显示所述第一输入在所述新的输入位置处可达的应用程序。
  35. 如权利要求22-34中任一项所述的终端,其特征在于,所述触摸屏还用于检测所述第一输入被释放,并在指定的延迟时间内,在所述触摸屏中保持显示所述第一输入在所述输入位置处可达的应用程序。
  36. 如权利要求35所述的终端,其特征在于,所述触摸屏还用于在所述指定的延迟时间内,检测针对所述可达的应用程序的第二输入,并根据所述第二输入的操作类型,调整所述第一输入在所述输入位置处可达的应用程序的显示状态。
PCT/CN2016/088015 2016-06-30 2016-06-30 一种查看应用程序的图形用户界面、方法及终端 WO2018000382A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2016/088015 WO2018000382A1 (zh) 2016-06-30 2016-06-30 一种查看应用程序的图形用户界面、方法及终端
US16/313,796 US11314388B2 (en) 2016-06-30 2016-06-30 Method for viewing application program, graphical user interface, and terminal
CN201680086699.5A CN109313531A (zh) 2016-06-30 2016-06-30 一种查看应用程序的图形用户界面、方法及终端

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/088015 WO2018000382A1 (zh) 2016-06-30 2016-06-30 一种查看应用程序的图形用户界面、方法及终端

Publications (1)

Publication Number Publication Date
WO2018000382A1 true WO2018000382A1 (zh) 2018-01-04

Family

ID=60785674

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/088015 WO2018000382A1 (zh) 2016-06-30 2016-06-30 一种查看应用程序的图形用户界面、方法及终端

Country Status (3)

Country Link
US (1) US11314388B2 (zh)
CN (1) CN109313531A (zh)
WO (1) WO2018000382A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180121000A1 (en) * 2016-10-27 2018-05-03 Microsoft Technology Licensing, Llc Using pressure to direct user input
CN106730827B (zh) * 2016-12-06 2018-10-19 腾讯科技(深圳)有限公司 一种对象显示的方法以及终端设备
CN110874166B (zh) * 2018-08-29 2022-05-03 腾讯科技(深圳)有限公司 页面切换方法、装置、存储介质及计算机设备
CN112799562B (zh) * 2021-02-07 2023-02-10 韩龚隆 app或客户端系统运作新方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130113715A1 (en) * 2011-11-07 2013-05-09 Immersion Corporation Systems and Methods for Multi-Pressure Interaction on Touch-Sensitive Surfaces
US20140365854A1 (en) * 2013-06-09 2014-12-11 Apple Inc. Stacked Tab View
CN105094618A (zh) * 2015-08-25 2015-11-25 努比亚技术有限公司 管理后台应用程序的方法及装置
CN105159530A (zh) * 2015-08-27 2015-12-16 广东欧珀移动通信有限公司 一种应用的显示对象切换方法及装置
CN105373321A (zh) * 2014-08-13 2016-03-02 中兴通讯股份有限公司 移动终端功能对象的控制方法及装置、移动终端

Family Cites Families (105)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6002397A (en) * 1997-09-30 1999-12-14 International Business Machines Corporation Window hatches in graphical user interface
US6657644B1 (en) * 1999-09-07 2003-12-02 International Business Machines Corporation Layer viewport for enhanced viewing in layered drawings
US6670970B1 (en) * 1999-12-20 2003-12-30 Apple Computer, Inc. Graduated visual and manipulative translucency for windows
WO2001063919A1 (en) * 2000-02-23 2001-08-30 Penta Trading Ltd. Systems and methods for generating and providing previews of electronic files such as web files
US6654036B1 (en) * 2000-06-05 2003-11-25 International Business Machines Corporation Method, article of manufacture and apparatus for controlling relative positioning of objects in a windows environment
TW521205B (en) * 2001-06-05 2003-02-21 Compal Electronics Inc Touch screen capable of controlling amplification with pressure
US7019757B2 (en) * 2002-01-28 2006-03-28 International Business Machines Corporation Changing the alpha levels of an application window to indicate a status of a computing task
JP4115198B2 (ja) * 2002-08-02 2008-07-09 株式会社日立製作所 タッチパネルを備えた表示装置
US7296230B2 (en) * 2002-11-29 2007-11-13 Nippon Telegraph And Telephone Corporation Linked contents browsing support device, linked contents continuous browsing support device, and method and program therefor, and recording medium therewith
US8164573B2 (en) * 2003-11-26 2012-04-24 Immersion Corporation Systems and methods for adaptive interpretation of input from a touch-sensitive input device
EP1776630A2 (en) * 2004-08-02 2007-04-25 Koninklijke Philips Electronics N.V. Pressure-controlled navigating in a touch screen
US7429993B2 (en) * 2004-09-17 2008-09-30 Microsoft Corporation Method and system for presenting functionally-transparent, unobtrusive on-screen windows
US9071870B2 (en) * 2004-12-08 2015-06-30 Nokia Technologies Oy System and method for viewing digital visual content on a device
US7683889B2 (en) * 2004-12-21 2010-03-23 Microsoft Corporation Pressure based selection
US20060271877A1 (en) * 2005-05-25 2006-11-30 Citrix Systems, Inc. A system and methods for selective sharing of an application window
JP2006345209A (ja) * 2005-06-08 2006-12-21 Sony Corp 入力装置、情報処理装置、情報処理方法、及びプログラム
WO2007016704A2 (en) * 2005-08-02 2007-02-08 Ipifini, Inc. Input device having multifunctional keys
JP5008560B2 (ja) * 2005-11-02 2012-08-22 パナソニック株式会社 表示オブジェクト透過装置
KR101269375B1 (ko) * 2006-05-24 2013-05-29 엘지전자 주식회사 터치스크린 장치 및 이의 이미지 표시방법
JP2008033739A (ja) * 2006-07-31 2008-02-14 Sony Corp 力覚フィードバックおよび圧力測定に基づくタッチスクリーンインターラクション方法および装置
US8191003B2 (en) * 2007-02-14 2012-05-29 International Business Machines Corporation Managing transparent windows
US20090031237A1 (en) * 2007-07-26 2009-01-29 Nokia Corporation Displaying and navigating through multiple applications
US9513765B2 (en) * 2007-12-07 2016-12-06 Sony Corporation Three-dimensional sliding object arrangement method and system
US8117557B2 (en) * 2008-01-03 2012-02-14 People Driven Performance, Inc. Multi-mode viewer control for viewing a series of statistical values
US20090237374A1 (en) * 2008-03-20 2009-09-24 Motorola, Inc. Transparent pressure sensor and method for using
US8335996B2 (en) * 2008-04-10 2012-12-18 Perceptive Pixel Inc. Methods of interfacing with multi-input devices and multi-input display systems employing interfacing techniques
US7958447B2 (en) * 2008-05-23 2011-06-07 International Business Machines Corporation Method and system for page navigating user interfaces for electronic devices
KR101495559B1 (ko) * 2008-07-21 2015-02-27 삼성전자주식회사 사용자 명령 입력 방법 및 그 장치
KR101044679B1 (ko) * 2008-10-02 2011-06-29 (주)아이티버스 문자입력방법
US9116569B2 (en) * 2008-11-26 2015-08-25 Blackberry Limited Touch-sensitive display method and apparatus
JP5173870B2 (ja) * 2009-01-28 2013-04-03 京セラ株式会社 入力装置
JP4723656B2 (ja) * 2009-02-03 2011-07-13 京セラ株式会社 入力装置
JP5267229B2 (ja) * 2009-03-09 2013-08-21 ソニー株式会社 情報処理装置、情報処理方法及び情報処理プログラム
US20100275122A1 (en) * 2009-04-27 2010-10-28 Microsoft Corporation Click-through controller for mobile interaction
US20100287493A1 (en) * 2009-05-06 2010-11-11 Cadence Design Systems, Inc. Method and system for viewing and editing an image in a magnified view
WO2011011025A1 (en) * 2009-07-24 2011-01-27 Research In Motion Limited Method and apparatus for a touch-sensitive display
JP5197521B2 (ja) * 2009-07-29 2013-05-15 京セラ株式会社 入力装置
JP5182260B2 (ja) * 2009-09-02 2013-04-17 ソニー株式会社 操作制御装置、操作制御方法およびコンピュータプログラム
JP2011053971A (ja) * 2009-09-02 2011-03-17 Sony Corp 情報処理装置、情報処理方法およびプログラム
KR20110028834A (ko) * 2009-09-14 2011-03-22 삼성전자주식회사 터치스크린을 구비한 휴대 단말기의 터치 압력을 이용한 사용자 인터페이스 제공 방법 및 장치
KR101092722B1 (ko) * 2009-12-02 2011-12-09 현대자동차주식회사 차량의 멀티미디어 시스템 조작용 사용자 인터페이스 장치
KR101087479B1 (ko) * 2010-01-29 2011-11-25 주식회사 팬택 멀티 디스플레이 장치 및 그 제어 방법
US10140003B1 (en) * 2010-03-26 2018-11-27 Open Invention Network Llc Simultaneous zoom in windows on a touch sensitive device
US9383887B1 (en) * 2010-03-26 2016-07-05 Open Invention Network Llc Method and apparatus of providing a customized user interface
US9223529B1 (en) * 2010-03-26 2015-12-29 Open Invention Network, Llc Method and apparatus of processing information in an environment with multiple devices and limited resources
US9182948B1 (en) * 2010-04-08 2015-11-10 Cadence Design Systems, Inc. Method and system for navigating hierarchical levels using graphical previews
KR101699739B1 (ko) * 2010-05-14 2017-01-25 엘지전자 주식회사 휴대 단말기 및 그 동작방법
US8860672B2 (en) * 2010-05-26 2014-10-14 T-Mobile Usa, Inc. User interface with z-axis interaction
WO2012037688A1 (en) * 2010-09-24 2012-03-29 Research In Motion Limited Transitional view on a portable electronic device
US9030419B1 (en) * 2010-09-28 2015-05-12 Amazon Technologies, Inc. Touch and force user interface navigation
US9262002B2 (en) * 2010-11-03 2016-02-16 Qualcomm Incorporated Force sensing touch screen
KR101788049B1 (ko) * 2010-12-15 2017-10-19 엘지전자 주식회사 이동 단말기 및 그 제어방법
CN102541399A (zh) * 2010-12-20 2012-07-04 联想(北京)有限公司 电子设备及其显示切换方法
KR101873787B1 (ko) * 2011-02-10 2018-07-03 삼성전자주식회사 터치스크린 단말기에서 멀티 터치 입력 처리 방법 및 장치
US9142193B2 (en) * 2011-03-17 2015-09-22 Intellitact Llc Linear progression based window management
US8952987B2 (en) * 2011-05-19 2015-02-10 Qualcomm Incorporated User interface elements augmented with force detection
US9013366B2 (en) * 2011-08-04 2015-04-21 Microsoft Technology Licensing, Llc Display environment for a plurality of display devices
JP5576841B2 (ja) * 2011-09-09 2014-08-20 Kddi株式会社 押圧による画像のズームが可能なユーザインタフェース装置、画像ズーム方法及びプログラム
US8997017B2 (en) * 2011-10-21 2015-03-31 International Business Machines Corporation Controlling interactions via overlaid windows
US9207951B2 (en) * 2011-11-17 2015-12-08 Prezi, Inc. Grouping with frames to transform display elements within a zooming user interface
CN104169848B (zh) * 2011-11-18 2017-10-20 森顿斯公司 检测触摸输入力
KR101824007B1 (ko) * 2011-12-05 2018-01-31 엘지전자 주식회사 이동 단말기 및 그의 멀티 태스킹 방법
EP2790095B8 (en) * 2011-12-07 2018-10-24 International Business Machines Corporation Method of displaying electronic document, and apparatus and computer program therefor
US20130154933A1 (en) * 2011-12-20 2013-06-20 Synaptics Incorporated Force touch mouse
US9257098B2 (en) * 2011-12-23 2016-02-09 Nokia Technologies Oy Apparatus and methods for displaying second content in response to user inputs
KR101710547B1 (ko) * 2012-01-10 2017-02-27 엘지전자 주식회사 이동 단말기 및 이동 단말기의 제어 방법
US9594499B2 (en) * 2012-02-21 2017-03-14 Nokia Technologies Oy Method and apparatus for hover-based spatial searches on mobile maps
US20150029149A1 (en) * 2012-03-13 2015-01-29 Telefonaktiebolaget L M Ericsson (Publ) Apparatus and Method for Navigating on a Touch Sensitive Screen Thereof
US8760425B2 (en) * 2012-03-20 2014-06-24 Sony Corporation Method and apparatus for enabling touchpad gestures
WO2013169851A2 (en) * 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for facilitating user interaction with controls in a user interface
WO2013169843A1 (en) * 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for manipulating framed graphical objects
US9898155B2 (en) * 2012-05-11 2018-02-20 Samsung Electronics Co., Ltd. Multiple window providing apparatus and method
US8816989B2 (en) * 2012-05-22 2014-08-26 Lenovo (Singapore) Pte. Ltd. User interface navigation utilizing pressure-sensitive touch
US8843964B2 (en) * 2012-06-27 2014-09-23 Cable Television Laboratories, Inc. Interactive matrix cell transformation user interface
US9594469B2 (en) * 2012-07-25 2017-03-14 Sap Se Dynamic layering user interface
KR101946365B1 (ko) * 2012-08-20 2019-02-11 엘지전자 주식회사 디스플레이 디바이스 및 그 제어 방법
TWI484405B (zh) * 2012-08-23 2015-05-11 Egalax Empia Technology Inc 圖形使用者界面的顯示方法及使用該方法的電子裝置
US20140168093A1 (en) * 2012-12-13 2014-06-19 Nvidia Corporation Method and system of emulating pressure sensitivity on a surface
CN104380231B (zh) * 2012-12-20 2017-10-24 英特尔公司 包括压力传感器的触摸屏
KR102301592B1 (ko) * 2012-12-29 2021-09-10 애플 인크. 사용자 인터페이스 계층을 내비게이션하기 위한 디바이스, 방법 및 그래픽 사용자 인터페이스
US9626008B2 (en) * 2013-03-11 2017-04-18 Barnes & Noble College Booksellers, Llc Stylus-based remote wipe of lost device
US10203815B2 (en) * 2013-03-14 2019-02-12 Apple Inc. Application-based touch sensitivity
KR102102157B1 (ko) * 2013-03-29 2020-04-21 삼성전자주식회사 복수 어플리케이션을 실행하는 디스플레이 장치 및 그 제어 방법
US9389718B1 (en) * 2013-04-04 2016-07-12 Amazon Technologies, Inc. Thumb touch interface
US9110573B2 (en) * 2013-04-26 2015-08-18 Google Inc. Personalized viewports for interactive digital maps
CN104166505B (zh) 2013-05-20 2018-11-06 腾讯科技(深圳)有限公司 一种信息查看方法、装置及移动终端
KR102089447B1 (ko) * 2013-06-04 2020-03-16 삼성전자 주식회사 전자 기기 및 그의 애플리케이션 제어 방법
KR102148725B1 (ko) * 2013-07-31 2020-08-28 삼성전자주식회사 어플리케이션을 표시하는 방법 및 장치
EP3037938A4 (en) * 2013-08-22 2017-05-10 Samsung Electronics Co., Ltd. Application execution method by display device and display device thereof
US20150193096A1 (en) * 2014-01-07 2015-07-09 Samsung Electronics Co., Ltd. Electronic device and method for operating the electronic device
US9594489B2 (en) * 2014-08-12 2017-03-14 Microsoft Technology Licensing, Llc Hover-based interaction with rendered content
KR102289786B1 (ko) * 2014-11-21 2021-08-17 엘지전자 주식회사 이동 단말기 및 그 제어 방법
US10430067B2 (en) * 2014-12-18 2019-10-01 Rovi Guides, Inc. Methods and systems for presenting scrollable displays
US9645732B2 (en) * 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9891811B2 (en) * 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US10346030B2 (en) * 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
KR20170024846A (ko) * 2015-08-26 2017-03-08 엘지전자 주식회사 이동단말기 및 그 제어방법
US9886095B2 (en) * 2015-09-24 2018-02-06 Stmicroelectronics Sa Device and method for recognizing hand gestures using time-of-flight sensing
KR102426695B1 (ko) * 2015-10-20 2022-07-29 삼성전자주식회사 화면 출력 방법 및 이를 지원하는 전자 장치
US9652069B1 (en) * 2015-10-22 2017-05-16 Synaptics Incorporated Press hard and move gesture
CN105607829A (zh) 2015-12-16 2016-05-25 魅族科技(中国)有限公司 一种显示方法及装置
CN105677178B (zh) 2015-12-30 2019-02-05 Oppo广东移动通信有限公司 一种调整控件所属图层的方法及移动终端
CN107132941B (zh) * 2016-02-29 2020-04-21 华为技术有限公司 一种压力触控方法及电子设备
US10346020B2 (en) * 2016-10-20 2019-07-09 Adobe Inc. Relatively changing a parametric value using a pressure sensitive user interface element
EP3385831A1 (en) * 2017-04-04 2018-10-10 Lg Electronics Inc. Mobile terminal


Also Published As

Publication number Publication date
US20190155462A1 (en) 2019-05-23
CN109313531A (zh) 2019-02-05
US11314388B2 (en) 2022-04-26

Similar Documents

Publication Publication Date Title
US10387016B2 (en) Method and terminal for displaying a plurality of pages,method and terminal for displaying a plurality of applications being executed on terminal, and method of executing a plurality of applications
EP3617861A1 (en) Method of displaying graphic user interface and electronic device
EP2825944B1 (en) Touch screen hover input handling
US8654076B2 (en) Touch screen hover input handling
JP6931641B2 (ja) 情報処理装置、情報処理方法及びコンピュータプログラム
US20170131835A1 (en) Touch-Sensitive Bezel Techniques
US9626102B2 (en) Method for controlling screen and electronic device thereof
US20150160849A1 (en) Bezel Gesture Techniques
US9665177B2 (en) User interfaces and associated methods
US20130061122A1 (en) Multi-cell selection using touch input
KR20170076357A (ko) 사용자 단말 장치, 이의 스피커 장치의 음량을 조절하기 위한 모드 전환 방법 및 음향 시스템
TWI547855B (zh) 資訊處理裝置,資訊處理方法及程式
WO2018000382A1 (zh) 一种查看应用程序的图形用户界面、方法及终端
KR20150065141A (ko) 모바일 디바이스 및 모바일 디바이스의 아이콘 표시 방법
EP2703983B1 (en) Method of controlling touch function and an electronic device thereof
AU2014201249B2 (en) Method for controlling display function and an electronic device thereof
EP3528103B1 (en) Screen locking method, terminal and screen locking device
US9870085B2 (en) Pointer control method and electronic device thereof
US20170123623A1 (en) Terminating computing applications using a gesture
WO2019051846A1 (zh) 终端界面的显示方法和装置
EP3659024A1 (en) Programmable multi-touch on-screen keyboard

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16906782

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16906782

Country of ref document: EP

Kind code of ref document: A1