CN110456949A - Devices, methods, and graphical user interfaces for navigating between user interfaces and displaying a taskbar - Google Patents

Devices, methods, and graphical user interfaces for navigating between user interfaces and displaying a taskbar

Info

Publication number
CN110456949A
CN110456949A (application number CN201811166251.1A)
Authority
CN
China
Prior art keywords
display
user interface
application program
contact
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811166251.1A
Other languages
Chinese (zh)
Inventor
B·M·沃金
S·奇迪亚
C·G·卡鲁纳姆尼
M·阿朗索鲁伊斯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from DKPA201870336A external-priority patent/DK180116B1/en
Priority to CN202110465095.4A priority Critical patent/CN113220177A/en
Application filed by Apple Inc filed Critical Apple Inc
Priority to EP19724034.4A priority patent/EP3791248A2/en
Priority to JP2020554462A priority patent/JP7022846B2/en
Priority to KR1020207035129A priority patent/KR102503076B1/en
Priority to AU2019266126A priority patent/AU2019266126B2/en
Priority to KR1020237005896A priority patent/KR102662244B1/en
Priority to PCT/US2019/030385 priority patent/WO2019217196A2/en
Priority to US17/603,879 priority patent/US11797150B2/en
Priority to AU2019100488A priority patent/AU2019100488B4/en
Priority to AU2019101068A priority patent/AU2019101068B4/en
Priority to US16/661,964 priority patent/US11079929B2/en
Publication of CN110456949A publication Critical patent/CN110456949A/en
Priority to AU2021282433A priority patent/AU2021282433B2/en
Priority to JP2022017212A priority patent/JP7337975B2/en
Priority to AU2023202742A priority patent/AU2023202742B2/en
Priority to JP2023135764A priority patent/JP2023166446A/en
Priority to US18/368,531 priority patent/US20240045564A1/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

Entitled "Devices, methods, and graphical user interfaces for navigating between user interfaces and displaying a taskbar." The invention discloses an electronic device with a touch-sensitive display that displays a first user interface different from the home screen. The device detects a first input by a first contact on a first edge of the display. In response, while continuing to detect the first contact on the first edge: when the first input is detected on a first portion of the first edge and the first input meets taskbar-display criteria, the device displays a taskbar with a plurality of application icons at a first position along the first edge; and when the first input is detected on a second portion of the first edge, different from the first portion, and the first input meets the taskbar-display criteria, the device displays the taskbar at a second position along the first edge that is selected to include the second portion of the first edge, the second position being different from the first position.

Description

Devices, methods, and graphical user interfaces for navigating between user interfaces and displaying a taskbar
Technical field
This relates generally to electronic devices with touch-sensitive surfaces, including but not limited to electronic devices with touch-sensitive surfaces that navigate between user interfaces and display a taskbar.
Background technique
The use of touch-sensitive surfaces as input devices for computers and other electronic computing devices has grown significantly in recent years. Exemplary touch-sensitive surfaces include touchpads and touch-screen displays. Such surfaces are widely used to manipulate user interfaces on a display and the objects within them. Exemplary user interface objects include digital images, video, text, icons, control elements (such as buttons), and other graphics.
Exemplary manipulations include adjusting the position and/or size of one or more user interface objects, activating a button represented by a user interface object, opening a file or application represented by a user interface object, associating metadata with one or more user interface objects, navigating between user interfaces, or otherwise manipulating the user interface. Exemplary user interface objects include digital images, video, text, icons, control elements (such as buttons), and other graphics. In some circumstances, a user will need to perform such manipulations on user interface objects in: a file management program (e.g., Finder from Apple Inc. of Cupertino, California); an image management application (e.g., Aperture, iPhoto, or Photos from Apple Inc. of Cupertino, California); a digital content (e.g., video and music) management application (e.g., iTunes from Apple Inc. of Cupertino, California); a drawing application; a presentation application (e.g., Keynote from Apple Inc. of Cupertino, California); a word processing application (e.g., Pages from Apple Inc. of Cupertino, California); or a spreadsheet application (e.g., Numbers from Apple Inc. of Cupertino, California).
But methods for performing these manipulations are cumbersome and inefficient. For example, using a series of mouse-based inputs to close a first user interface, navigate through a multi-page home screen to identify a second user interface, and then select the second user interface for display is tedious and error-prone. In addition, these methods take longer than necessary, thereby wasting energy. This latter consideration is particularly important in battery-operated devices.
Summary of the invention
Accordingly, there is a need for electronic devices with faster, more efficient methods and interfaces for navigating between user interfaces and displaying a taskbar. Such methods and interfaces optionally complement or replace conventional methods for navigating between user interfaces and displaying a taskbar. Such methods and interfaces reduce the number, extent, and/or nature of the inputs from a user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
The above deficiencies and other problems associated with user interfaces for electronic devices with touch-sensitive surfaces are reduced or eliminated by the disclosed devices. In some embodiments, the device is a desktop computer. In some embodiments, the device is portable (e.g., a notebook computer, tablet computer, or handheld device). In some embodiments, the device is a personal electronic device (e.g., a wearable electronic device, such as a watch). In some embodiments, the device has a touchpad. In some embodiments, the device has a touch-sensitive display (also known as a "touch screen" or "touch-screen display"). In some embodiments, the device has a graphical user interface (GUI), one or more processors, memory, and one or more modules, programs, or sets of instructions stored in the memory for performing multiple functions. In some embodiments, the user interacts with the GUI primarily through stylus and/or finger contacts and gestures on the touch-sensitive surface. In some embodiments, these functions optionally include image editing, drawing, presenting, word processing, spreadsheet making, game playing, telephoning, video conferencing, e-mailing, instant messaging, workout support, digital photographing, digital video recording, web browsing, digital music playing, note taking, and/or digital video playing. Executable instructions for performing these functions are, optionally, included in a non-transitory computer-readable storage medium or other computer program product configured for execution by one or more processors.
In accordance with some embodiments, a method is performed at a device with a touch-sensitive display. The method includes displaying a first user interface on the display, where the first user interface is different from a home screen user interface that includes a plurality of application icons corresponding to different applications of a plurality of applications installed on the device. The method further includes, while displaying the first user interface on the display, detecting a first input by a first contact on a first edge of the display. The method further includes, in response to detecting the first input on the edge of the display and while continuing to detect the first contact on the first edge of the display: in accordance with a determination that the first input is detected on a first portion of the first edge of the display and that the first input meets taskbar-display criteria, displaying a taskbar with a plurality of application icons at a first position along the first edge of the display; and, in accordance with a determination that the first input is detected on a second portion of the first edge of the display, different from the first portion, and that the first input meets the taskbar-display criteria, displaying the taskbar at a second position along the first edge of the display that is selected to include the second portion of the first edge, where the second position is different from the first position.
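As a rough illustration of the positioning logic just described (not the claimed implementation), the sketch below picks a taskbar position along the bottom edge based on where the edge contact lands; the type names, thresholds, and clamping rule are assumptions introduced for this example.

```swift
import CoreGraphics

/// Hypothetical model of the taskbar-placement rule: the dock is centered under
/// the portion of the edge where the contact was detected, clamped to the screen.
struct TaskbarPlacement {
    let screenWidth: CGFloat
    let taskbarWidth: CGFloat

    /// Horizontal center for the taskbar, chosen so that it appears at the
    /// portion of the bottom edge where the contact landed.
    func taskbarCenterX(forContactAt x: CGFloat) -> CGFloat {
        let halfWidth = taskbarWidth / 2
        // Clamp so the taskbar never extends past either end of the edge.
        return min(max(x, halfWidth), screenWidth - halfWidth)
    }

    /// Taskbar-display criteria: e.g. a short, slow upward swipe from the edge.
    /// The specific numbers here are placeholders, not values from the patent.
    func meetsTaskbarDisplayCriteria(translationY: CGFloat, velocityY: CGFloat) -> Bool {
        return translationY < -20 && translationY > -150 && abs(velocityY) < 1_000
    }
}

// Usage: a contact near the left end of the bottom edge yields a taskbar centered
// near that contact rather than at a fixed screen-centered position.
let placement = TaskbarPlacement(screenWidth: 1024, taskbarWidth: 400)
let leftCenter = placement.taskbarCenterX(forContactAt: 120)   // 200 (clamped)
let rightCenter = placement.taskbarCenterX(forContactAt: 900)  // 824 (clamped)
```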
In accordance with some embodiments, a method is performed at a device with a touch-sensitive surface and a display. The method includes displaying a first application user interface on a first portion of the display while displaying a second application user interface on a second portion of the display that is different from the first portion. The method further includes, while concurrently displaying the first application user interface on the first portion of the display and the second application user interface on the second portion of the display, detecting a first input by a first contact that includes movement in a first direction. The method further includes, in response to detecting the first input: in accordance with a determination that the first input meets first criteria, where the first criteria require that the first input include movement beyond a first threshold amount in the first direction, replacing the display of the first user interface and the second user interface with a full-screen home screen; in accordance with a determination that the first input meets second criteria, where the second criteria require that the first input include movement of less than the first threshold amount in the first direction, and a determination that the first input started in a first edge region of the display corresponding to the first application user interface, replacing the display of the first application user interface with a first replacement user interface while maintaining the display of the second application user interface on the second portion of the display; and, in accordance with a determination that the first input meets the second criteria and a determination that the first input started in a second edge region corresponding to the second application user interface, replacing the display of the second application user interface with a second replacement user interface while maintaining the display of the first application user interface on the first portion of the display.
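The three-way outcome described above can be pictured with the following hedged sketch; the threshold value, the midpoint test, and all identifiers are invented for illustration and are not taken from the patent.

```swift
import CoreGraphics

/// Illustrative-only types; the patent does not define this API.
enum SplitScreenNavigationResult {
    case showFullScreenHome     // gesture moved past the first threshold
    case replaceLeftApp         // short gesture started over the left app's edge region
    case replaceRightApp        // short gesture started over the right app's edge region
    case none
}

struct SplitScreenGestureClassifier {
    let firstMovementThreshold: CGFloat   // placeholder value supplied below
    let displayMidpointX: CGFloat

    func classify(startX: CGFloat, verticalTravel: CGFloat) -> SplitScreenNavigationResult {
        if verticalTravel > firstMovementThreshold {
            // "First criteria": enough travel replaces BOTH app views with the home screen.
            return .showFullScreenHome
        }
        // "Second criteria": less travel; only the side whose edge region the gesture
        // started in is replaced, and the other application keeps its display.
        if verticalTravel > 0 {
            return startX < displayMidpointX ? .replaceLeftApp : .replaceRightApp
        }
        return .none
    }
}

// Usage: a short swipe that begins on the right half of the bottom edge only
// affects the right application's portion of the split screen.
let classifier = SplitScreenGestureClassifier(firstMovementThreshold: 250, displayMidpointX: 512)
let result = classifier.classify(startX: 700, verticalTravel: 80)   // .replaceRightApp
```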
In accordance with some embodiments, a method is performed at a device with a touch-sensitive surface and a display. The method includes displaying, on the display, a user interface of a first application of a plurality of applications installed on the device. The method further includes detecting a gesture on the touch-sensitive surface, where detecting the gesture includes detecting an initial portion of the gesture while displaying the user interface of the first application on the display, and detecting the gesture includes detecting multiple concurrently detected contacts on the touch-sensitive surface and detecting movement of the multiple contacts. The method further includes, in response to detecting the gesture on the touch-sensitive surface: in accordance with a determination that the gesture includes two concurrently detected contacts, performing an operation in the first application based on the movement of the two concurrently detected contacts during the gesture; in accordance with a determination that the gesture includes more than a predetermined number of concurrently detected contacts, the predetermined number being greater than two, and that the movement of the concurrently detected contacts during the gesture meets first criteria, switching from displaying the user interface of the first application to displaying a user interface of a second application, of the plurality of applications, that is different from the first application; and, in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts and that the movement of the concurrently detected contacts during the gesture meets second criteria different from the first criteria, switching from displaying the user interface of the first application to displaying a user interface that includes respective application icons for opening the plurality of applications installed on the device.
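A minimal sketch of the contact-count branching described above, assuming a placeholder "predetermined number" of four contacts and invented movement criteria; it is meant only to make the three possible outcomes concrete.

```swift
import CoreGraphics

/// Illustrative sketch of the three-way outcome. The contact-count threshold and
/// the movement criteria are assumptions, not values from the patent.
enum MultiContactGestureOutcome {
    case performInAppOperation      // e.g. a two-finger gesture handled by the app itself
    case switchToRecentApplication  // many-contact sideways movement
    case showHomeScreen             // many-contact movement meeting the second criteria
}

struct MultiContactGestureClassifier {
    let systemGestureContactCount = 4   // placeholder for "more than a predetermined number"

    func classify(contactCount: Int,
                  horizontalTravel: CGFloat,
                  pinchScale: CGFloat) -> MultiContactGestureOutcome {
        guard contactCount >= systemGestureContactCount else {
            // Fewer contacts (e.g. the two-contact case): the gesture stays inside the app.
            return .performInAppOperation
        }
        if abs(horizontalTravel) > 100 {    // "first criteria" (placeholder threshold)
            return .switchToRecentApplication
        }
        if pinchScale < 0.7 {               // "second criteria" (placeholder threshold)
            return .showHomeScreen
        }
        return .performInAppOperation
    }
}
```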
In accordance with some embodiments, an electronic device includes: a display; a touch-sensitive surface; optionally one or more sensors for detecting intensities of contacts with the touch-sensitive surface; optionally one or more tactile output generators; one or more processors; and memory storing one or more programs, where the one or more programs are configured to be executed by the one or more processors and include instructions for performing or causing performance of the operations of any of the methods described herein. In accordance with some embodiments, a computer-readable storage medium has instructions stored therein which, when executed by an electronic device with a display, a touch-sensitive surface, optionally one or more sensors for detecting intensities of contacts with the touch-sensitive surface, and optionally one or more tactile output generators, cause the device to perform, or cause performance of, the operations of any of the methods described herein. In accordance with some embodiments, a graphical user interface on an electronic device with a display, a touch-sensitive surface, optionally one or more sensors for detecting intensities of contacts with the touch-sensitive surface, optionally one or more tactile output generators, memory, and one or more processors that execute one or more programs stored in the memory includes one or more of the elements displayed in any of the methods described herein, which are updated in response to inputs, as described in any of the methods described herein. In accordance with some embodiments, an electronic device includes: a display, a touch-sensitive surface, optionally one or more sensors for detecting intensities of contacts with the touch-sensitive surface, and optionally one or more tactile output generators; and means for performing, or causing performance of, the operations of any of the methods described herein. In accordance with some embodiments, an information processing apparatus, for use in an electronic device with a display, a touch-sensitive surface, optionally one or more sensors for detecting intensities of contacts with the touch-sensitive surface, and optionally one or more tactile output generators, includes means for performing, or causing performance of, the operations of any of the methods described herein.
Thus, electronic devices with displays, touch-sensitive surfaces, optionally one or more sensors for detecting intensities of contacts with the touch-sensitive surface, optionally one or more tactile output generators, optionally one or more device orientation sensors, and optionally an audio system are provided with improved methods and interfaces for navigating between user interfaces and displaying a taskbar, thereby increasing the effectiveness, efficiency, and user satisfaction of such devices. Such methods and interfaces may complement or replace conventional methods for navigating between user interfaces and displaying a taskbar.
Brief description of the drawings
For a better understanding of the various described embodiments, reference should be made to the Detailed Description below, in conjunction with the following drawings, in which like reference numerals refer to corresponding parts throughout the figures.
Figure 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.
Figure 1B is a block diagram illustrating exemplary components for event handling in accordance with some embodiments.
Figure 2 illustrates a portable multifunction device having a touch screen in accordance with some embodiments.
Figure 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments.
Figure 4A illustrates an exemplary user interface for a menu of applications on a portable multifunction device in accordance with some embodiments.
Figure 4B illustrates an exemplary user interface for a multifunction device with a touch-sensitive surface that is separate from the display in accordance with some embodiments.
Figures 4C-4E illustrate examples of dynamic intensity thresholds in accordance with some embodiments.
Figures 5A1-5A29 illustrate exemplary user interfaces for displaying a taskbar with a plurality of application icons at variable positions along one or more edges of a touch-sensitive display, in accordance with some embodiments.
Figures 5B1-5B36 illustrate exemplary user interfaces for navigating from a user interface displayed in a split-screen display mode to a different user interface, in accordance with some embodiments.
Figures 5C1-5C59 illustrate exemplary user interfaces for navigating between different user interfaces using multi-contact gestures, in accordance with some embodiments.
Figures 6A-6F are a flow diagram of a process for displaying a taskbar with a plurality of application icons at variable positions along one or more edges of a touch-sensitive display, in accordance with some embodiments.
Figures 7A-7I are a flow diagram of a process for navigating from a user interface displayed in a split-screen display mode to a different user interface, in accordance with some embodiments.
Figure 8 is a flow diagram illustrating a method of navigating between an application user interface, an application-switcher user interface, and a home screen user interface, in accordance with some embodiments.
Figures 9A-9C illustrate exemplary thresholds for navigating between different user interfaces, in accordance with some embodiments.
Figures 10A-10D are flow diagrams illustrating methods of navigating between user interfaces, in accordance with some embodiments.
Figures 11A-11F are a flow diagram of a process for navigating between user interfaces based on multi-contact gestures, in accordance with some embodiments.
Detailed description
Conventional methods of navigating between user interfaces, particularly between an application user interface and a system user interface (e.g., a home screen user interface or an application-switcher user interface), often require multiple separate inputs (e.g., gestures, button presses, and the like) and irreversible, discrete user interface transitions. The embodiments below provide a single gesture that adjusts dynamically and, based on different criteria (e.g., criteria based on the type of gesture performed by the contact, the positions of the contact and/or displayed user interface objects, timing, and movement parameters), navigates to different user interfaces (e.g., a recently opened application, the home screen user interface, or the application-switcher user interface). In addition, the embodiments below provide real-time visual feedback indicating which user interface is being navigated to as the user performs the single-gesture navigation input. This improves the accuracy of the user's navigation by giving the user an opportunity to correct mistakes before the input is complete (e.g., by changing an attribute of the input before lift-off). This in turn avoids unnecessary navigation events, saving time and battery life.
Furthermore, when operating a larger device (e.g., a tablet computer), both of the user's hands are often involved in holding the device (e.g., supporting it from either side), which makes it difficult to perform navigation gestures that must be initiated from a position oriented away from the user's hands. Operating a larger device with one hand is similarly difficult, because that hand is occupied supporting the device. The embodiments below improve user interface navigation on larger devices by allowing the input that displays an application taskbar (e.g., a display of affordances including a plurality of application icons for opening or navigating to specific applications) to be made at user-defined positions along one or more edges of the device. This allows the user to access the application taskbar without repositioning their hands on the device (e.g., the taskbar appears near whatever position on the device the user's hand occupies). This saves time when operating the device (e.g., by eliminating the need for the user to reposition their hands before invoking and/or interacting with the application taskbar), which in turn conserves the device's battery life.
In addition, the embodiments below provide gestures that, based on different criteria (e.g., criteria based on the positions of the contacts and/or displayed user interface objects, timing, and movement parameters), navigate to different user interfaces (e.g., a recently opened application, the home screen user interface, or the application-switcher user interface) either within a sub-portion of a split-screen user interface or across the entire screen. The device's navigation functions are thereby made easily accessible without cluttering the user interface with additional display controls, and the time and number of inputs needed to achieve an intended screen configuration are reduced, which additionally reduces power usage and extends the battery life of the device.
Furthermore, the embodiments below facilitate navigating, based on a gesture initiated from an application user interface (e.g., a gesture performed with multiple concurrently detected contacts), from the application user interface to another user interface outside the application, such as to a different application or to a system user interface (e.g., the home screen), or performing an operation within the application. In these embodiments, the outcome of the gesture depends on which of multiple different sets of criteria the gesture meets (e.g., when the gesture ends), such as criteria based on the type of gesture performed by the contacts, the total number of concurrently detected contacts, the positions, timing, and/or movement parameters of the contacts, and/or the displayed user interface objects. The gesture input is continuously evaluated against the different sets of criteria when determining the destination state of the device (e.g., what operation to perform and/or what user interface to display). Dynamic visual feedback is continuously displayed to indicate the likely destination state of the device based on the input detected so far, giving the user an opportunity to adjust his or her input and thereby modify the actual destination state that the device reaches when the input ends. Using different sets of criteria to determine the final destination state of the device (e.g., the operation performed and/or the user interface ultimately displayed) allows the user to employ a fluid gesture that can change mid-stream (e.g., because the user decides to change the outcome they want to achieve, or because the user realizes, based on the device's feedback, that the input provided will not produce the intended result). This helps avoid the need for the user to cancel an unintended gesture and then start the gesture again, making the user-device interface more efficient (e.g., by helping the user provide the inputs needed to achieve an intended result and reducing user mistakes when operating and interacting with the device), which additionally enables the user to use the device more quickly and efficiently, reducing power usage and extending the battery life of the device.
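To make the idea of continuously re-evaluating the destination state concrete, here is a hedged sketch in which invented distance and velocity thresholds map an in-flight gesture to a likely destination on every touch-move event; none of these values or names come from the patent.

```swift
import CoreGraphics

/// Possible destination states while a navigation gesture is in flight.
enum NavigationDestination {
    case currentApplication   // gesture too short; snap back
    case appSwitcher
    case homeScreen
}

func currentDestination(translation: CGVector, velocity: CGVector) -> NavigationDestination {
    // A fast upward fling commits to home regardless of distance traveled.
    if velocity.dy < -1_000 { return .homeScreen }
    // A long, slow upward drag also ends on the home screen.
    if translation.dy < -300 { return .homeScreen }
    // A medium drag, or a pause mid-gesture, lands in the app switcher.
    if translation.dy < -100 { return .appSwitcher }
    // Anything shorter returns to the application the gesture started in.
    return .currentApplication
}

// Called on every touch-move event so the UI can animate toward the destination
// the user would reach if the contact lifted off right now.
func updateFeedback(translation: CGVector, velocity: CGVector) {
    switch currentDestination(translation: translation, velocity: velocity) {
    case .homeScreen:         print("preview: shrink card toward home screen")
    case .appSwitcher:        print("preview: fan out recent-app cards")
    case .currentApplication: print("preview: slide card back into place")
    }
}
```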
Below, Figures 1A-1B, 2, and 3 provide a description of exemplary devices. Figures 4C-4E illustrate examples of dynamic intensity thresholds. Figures 4A-4B, 5A1-5A29, 5B1-5B36, and 5C1-5C59 illustrate exemplary user interfaces for navigating between user interfaces, displaying an application taskbar, or performing an operation. Figures 6A-6F are a flow diagram of a method of displaying a taskbar with a plurality of application icons at variable positions along one or more edges of a touch-sensitive display. Figures 7A-7I are a flow diagram of a method of navigating from a user interface displayed in a split-screen display mode to a different user interface. Figures 11A-11F are a flow diagram of a process for navigating between user interfaces based on multi-contact gestures. The user interfaces in Figures 5A1-5A29, 5B1-5B36, and 5C1-5C59 are used to illustrate the processes in Figures 6A-6F, 7A-7I, and 11A-11E. Figure 8 is a flow diagram illustrating various criteria for navigating between user interfaces in accordance with some embodiments. Figures 9A-9C illustrate exemplary thresholds for navigating between different user interfaces. Figures 10A-10D are flow diagrams illustrating various criteria for navigating between user interfaces in accordance with some embodiments.
Example devices
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the various described embodiments. However, it will be apparent to one of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms "first," "second," and the like are, in some instances, used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the various described embodiments. The first contact and the second contact are both contacts, but they are not the same contact, unless the context clearly indicates otherwise.
The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "includes," "including," "comprises," and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term "if" is, optionally, construed to mean "when" or "upon" or "in response to determining" or "in response to detecting," depending on the context. Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" is, optionally, construed to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]," depending on the context.
Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the iPhone, iPod Touch, and iPad devices from Apple Inc. of Cupertino, California. Other portable electronic devices, such as laptop or tablet computers with touch-sensitive surfaces (e.g., touch-screen displays and/or touchpads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch-screen display and/or a touchpad).
In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse, and/or a joystick.
The device typically supports a variety of applications, such as one or more of the following: a note taking application, a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
The various applications that are executed on the device optionally use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface, as well as corresponding information displayed on the device, are, optionally, adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device optionally supports the variety of applications with user interfaces that are intuitive and transparent to the user.
Attention is now directed toward embodiments of portable devices with touch-sensitive displays. Figure 1A is a block diagram illustrating portable multifunction device 100 with touch-sensitive display system 112 in accordance with some embodiments. Touch-sensitive display system 112 is sometimes called a "touch screen" for convenience, and is sometimes simply called a touch-sensitive display. Device 100 includes memory 102 (which optionally includes one or more computer-readable storage media), memory controller 122, one or more processing units (CPUs) 120, peripherals interface 118, RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, input/output (I/O) subsystem 106, other input or control devices 116, and external port 124. Device 100 optionally includes one or more optical sensors 164. Device 100 optionally includes one or more intensity sensors 165 for detecting intensities of contacts on device 100 (e.g., a touch-sensitive surface such as touch-sensitive display system 112 of device 100). Device 100 optionally includes one or more tactile output generators 167 for generating tactile outputs on device 100 (e.g., generating tactile outputs on a touch-sensitive surface such as touch-sensitive display system 112 of device 100 or touchpad 355 of device 300). These components optionally communicate over one or more communication buses or signal lines 103.
As used in the specification and claims, the term "tactile output" refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., housing) of the device, or displacement of the component relative to a center of mass of the device that will be detected by a user with the user's sense of touch. For example, in situations where the device or a component of the device is in contact with a surface of the user that is sensitive to touch (e.g., a finger, palm, or other part of the user's hand), the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in a physical characteristic of the device or of a component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a "down click" or "up click" of a physical actuator button. In some cases, the user will feel a tactile sensation such as a "down click" or "up click" even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements. As another example, movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as "roughness" of the touch-sensitive surface, even when there is no change in the smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users. Thus, when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., a "down click," an "up click," or "roughness"), unless otherwise stated, the generated tactile output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user. Using tactile outputs to provide haptic feedback to a user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user errors when operating/interacting with the device), which additionally enables the user to use the device more quickly and efficiently and further reduces power usage and improves battery life of the device.
In some embodiments, a tactile output pattern specifies characteristics of a tactile output, such as the amplitude of the tactile output, the shape of a movement waveform of the tactile output, the frequency of the tactile output, and/or the duration of the tactile output.
When tactile outputs with different tactile output patterns are generated by a device (e.g., via one or more tactile output generators that move a moveable mass to generate tactile outputs), the tactile outputs may invoke different haptic sensations in a user holding or touching the device. While the sensation of the user is based on the user's perception of the tactile output, most users will be able to identify changes in the waveform, frequency, and amplitude of tactile outputs generated by the device. Thus, the waveform, frequency, and amplitude can be adjusted to indicate to the user that different operations have been performed. As such, tactile outputs with tactile output patterns that are designed, selected, and/or engineered to simulate characteristics (e.g., size, material, weight, stiffness, smoothness, etc.), behaviors (e.g., oscillation, displacement, acceleration, rotation, expansion, etc.), and/or interactions (e.g., collision, adhesion, repulsion, attraction, friction, etc.) of objects in a given environment (e.g., a user interface that includes graphical features and objects, a simulated physical environment with virtual boundaries and virtual objects, a real physical environment with physical boundaries and physical objects, and/or a combination of any of the above) will, in some circumstances, provide helpful feedback to users that reduces input errors and increases the efficiency of the user's operation of the device. Additionally, tactile outputs are, optionally, generated to correspond to feedback that is unrelated to a simulated physical characteristic, such as an input threshold or a selection of an object. Such tactile outputs will, in some circumstances, provide helpful feedback to users that reduces input errors and increases the efficiency of the user's operation of the device.
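As a small illustration of the pattern parameters listed above, the sketch below models a tactile output pattern as a simple value type; the waveform cases and numeric values are assumptions, not the device's actual haptics interface.

```swift
import Foundation

/// Minimal sketch of the tactile-output-pattern parameters named above
/// (waveform shape, frequency, amplitude, duration).
struct TactileOutputPattern {
    enum Waveform { case sine, square, decayingTap }
    var waveform: Waveform
    var frequencyHz: Double
    var amplitude: Double        // 0.0 ... 1.0, relative to the actuator's maximum
    var duration: TimeInterval
}

// Two patterns a user could plausibly tell apart by feel: a short, crisp "click"
// for crossing an input threshold versus a longer, low rumble for a failed operation.
let thresholdClick = TactileOutputPattern(waveform: .decayingTap, frequencyHz: 230,
                                          amplitude: 0.8, duration: 0.015)
let failureRumble = TactileOutputPattern(waveform: .sine, frequencyHz: 80,
                                         amplitude: 0.5, duration: 0.25)
```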
In some embodiments, a tactile output with a suitable tactile output pattern serves as a cue for the occurrence of an event of interest in a user interface or behind the scenes in a device. Examples of events of interest include activation of an affordance (e.g., a real or virtual button, or a toggle switch) provided on the device or in a user interface, success or failure of a requested operation, reaching or crossing a boundary in a user interface, entry into a new state, switching of input focus between objects, activation of a new mode, reaching or crossing an input threshold, detection or recognition of a type of input or gesture, and so on. In some embodiments, tactile outputs are provided to serve as a warning or an alert for an impending event or outcome that will occur unless a redirection or interruption input is timely detected. Tactile outputs are also used in other contexts to enrich the user experience, improve the accessibility of the device to users with visual or motor difficulties or other accessibility needs, and/or improve the efficiency and functionality of the user interface and/or the device. Tactile outputs are optionally accompanied by audio outputs and/or visible user interface changes, which further enhance a user's experience when the user interacts with a user interface and/or a device, and facilitate better conveyance of information regarding the state of the user interface and/or the device, and which reduce input errors and increase the efficiency of the user's operation of the device.
It should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in Figure 1A are implemented in hardware, software, firmware, or a combination thereof, including one or more signal processing circuits and/or application-specific integrated circuits.
Memory 102 optionally includes high-speed random access memory and also optionally includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of device 100, such as the one or more CPUs 120 and peripherals interface 118, is optionally controlled by memory controller 122.
Peripherals interface 118 can be used to couple input and output peripherals of the device to memory 102 and the one or more CPUs 120. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data.
In some embodiments, peripherals interface 118, the one or more CPUs 120, and memory controller 122 are, optionally, implemented on a single chip, such as chip 104. In some other embodiments, they are, optionally, implemented on separate chips.
RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a codec chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates by wireless communication with networks, such as the Internet (also referred to as the World Wide Web (WWW)), an intranet, and/or a wireless network (such as a cellular telephone network, a wireless local area network (LAN), and/or a metropolitan area network (MAN)), and other devices. The wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPDA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212, Figure 2). The headset jack provides an interface between audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
I/O subsystem 106 couples the input/output peripherals on device 100, such as touch-sensitive display system 112 and other input or control devices 116, with peripherals interface 118. I/O subsystem 106 optionally includes display controller 156, optical sensor controller 158, intensity sensor controller 159, haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input or control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternative embodiments, the one or more input controllers 160 are, optionally, coupled with any (or none) of the following: a keyboard, an infrared port, a USB port, a stylus, and/or a pointer device such as a mouse. The one or more buttons (e.g., 208, Figure 2) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206, Figure 2).
Touch-sensitive display system 112 provides the input interface and output interface between equipment and user.Display controller 156 Electric signal is received from touch-sensitive display system 112 and/or electric signal is sent to touch-sensitive display system 112.Touch-sensitive display System 112 shows visual output to user.Visual output optionally includes figure, text, icon, video and theirs is any It combines (being referred to as " figure ").In some embodiments, the visual output of some visual outputs or whole corresponds to user circle In face of as.As used herein, term " showing can indicate " refers to user's interactive graphical user interface object (for example, being configured as The graphical user interface object that the input for being led to graphical user interface object is responded).User interactive graphics user The example of interface object includes but is not limited to button, sliding block, icon, optional menu item, switch, hyperlink or other users Interface control.
Touch-sensitive display system 112 has the touch-sensitive table for receiving input from the user based on tactile and/or tactile contact Face, sensor or sensor group.Touch-sensitive display system 112 and display controller 156 are (any related in memory 102 The module and/or instruction set of connection are together) detection touch-sensitive display system 112 on contact (and the contact any movement or in It is disconnected), and the contact that will test be converted to and be displayed in touch-sensitive display system 112 user interface object (for example, One or more soft-key buttons, icon, webpage or image) interaction.In some embodiments, in touch-sensitive 112 He of display system Contact point between user corresponds to the finger or stylus of user.
Touch-sensitive display system 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer displays) Technology or LED (light emitting diode) technology, but other display technologies are used in other embodiments.Touch-sensitive display system 112 and display controller 156 optionally using in the currently known or later a variety of touch-sensing technologies that will be developed appoint What technology and other proximity sensor arrays or for determine the one or more points contacted with touch-sensitive display system 112 its His element is including but not limited to capacitive, electric to detect contact and its any movement or interruption, a variety of touch-sensing technologies Resistive, infrared ray and surface acoustic wave technique.In some embodiments, using projection-type mutual capacitance detection technology, such as From Apple Inc.'s (Cupertino, California)iPodWithThe technology of middle discovery.
Touch-sensitive display system 112 is optionally with the video resolution for being more than 100dpi.In some embodiments, it touches Touching screen video resolution is more than 400dpi (for example, 500dpi, 800dpi or bigger).User optionally uses any suitable object Body or additives stylus, finger etc. are contacted with touch-sensitive display system 112.In some embodiments, by user interface It is designed to work together with contact and gesture based on finger, since the contact area of finger on the touchscreen is larger, this It may be accurate not as good as the input based on stylus.In some embodiments, the rough input based on finger is converted essence by equipment True pointer/cursor position or order is for executing the desired movement of user.
In some embodiments, in addition to a touch, equipment 100 optionally includes specific for activating or deactivating The Trackpad (not shown) of function.In some embodiments, Trackpad is the touch sensitive regions of equipment, different from touch screen, should Touch sensitive regions do not show visual output.Touch tablet is optionally the touch sensitive surface separated with touch-sensitive display system 112, either By the extension for the touch sensitive surface that touch screen is formed.
Equipment 100 further includes the electric system 162 for powering for various parts.Electric system 162 optionally includes electricity Power management system, one or more power supply (for example, battery, alternating current (AC)), recharging system, power failure detection circuit, Power converter or inverter, power supply status indicator (for example, light emitting diode (LED)) and with the electricity in portable device Power generates, managees, and distributes any other associated component.
Device 100 optionally also includes one or more optical sensors 164. Figure 1A shows an optical sensor coupled to optical sensor controller 158 in I/O subsystem 106. The one or more optical sensors 164 optionally include charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) phototransistors. The one or more optical sensors 164 receive light from the environment, projected through one or more lenses, and convert the light into data representing an image. In conjunction with imaging module 143 (also called a camera module), the one or more optical sensors 164 optionally capture still images and/or video. In some embodiments, an optical sensor is located on the back of device 100, opposite touch-sensitive display system 112 on the front of the device, so that the touch screen can be used as a viewfinder for still and/or video image acquisition. In some embodiments, another optical sensor is located on the front of the device so that an image of the user is obtained (e.g., for selfies, or for videoconferencing while the user views the other video conference participants on the touch screen).
Device 100 optionally also includes one or more contact intensity sensors 165. Figure 1A shows a contact intensity sensor coupled to intensity sensor controller 159 in I/O subsystem 106. The one or more contact intensity sensors 165 optionally include one or more piezoresistive strain gauges, capacitive force sensors, electric force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other intensity sensors (e.g., sensors used to measure the force or pressure of a contact on a touch-sensitive surface). The one or more contact intensity sensors 165 receive contact intensity information (e.g., pressure information or a proxy for pressure information) from the environment. In some embodiments, at least one contact intensity sensor is collocated with, or proximate to, a touch-sensitive surface (e.g., touch-sensitive display system 112). In some embodiments, at least one contact intensity sensor is located on the back of device 100, opposite touch-sensitive display system 112 which is located on the front of device 100.
Device 100 optionally also includes one or more proximity sensors 166. Figure 1A shows proximity sensor 166 coupled to peripherals interface 118. Alternatively, proximity sensor 166 is coupled to input controller 160 in I/O subsystem 106. In some embodiments, the proximity sensor turns off and disables touch-sensitive display system 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call).
Device 100 optionally also includes one or more tactile output generators 167. Figure 1A shows a tactile output generator coupled to haptic feedback controller 161 in I/O subsystem 106. In some embodiments, the one or more tactile output generators 167 include one or more electroacoustic devices such as speakers or other audio components, and/or electromechanical devices that convert energy into linear motion such as a motor, solenoid, electroactive polymer, piezoelectric actuator, electrostatic actuator, or other tactile-output-generating component (e.g., a component that converts electrical signals into tactile outputs on the device). Tactile output generator 167 receives tactile feedback generation instructions from haptic feedback module 133 and generates tactile outputs on device 100 that are capable of being sensed by a user of device 100. In some embodiments, at least one tactile output generator is collocated with, or proximate to, a touch-sensitive surface (e.g., touch-sensitive display system 112) and, optionally, generates a tactile output by moving the touch-sensitive surface vertically (e.g., in/out of the surface of device 100) or laterally (e.g., back and forth in the same plane as the surface of device 100). In some embodiments, at least one tactile output generator sensor is located on the back of device 100, opposite touch-sensitive display system 112 which is located on the front of device 100.
Device 100 optionally also includes one or more accelerometers 168. Figure 1A shows accelerometer 168 coupled to peripherals interface 118. Alternatively, accelerometer 168 is, optionally, coupled to input controller 160 in I/O subsystem 106. In some embodiments, information is displayed on the touch-screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers. Device 100 optionally includes, in addition to accelerometer(s) 168, a magnetometer (not shown) and a GPS (or GLONASS or other global navigation system) receiver (not shown) for obtaining information concerning the location and orientation (e.g., portrait or landscape) of device 100.
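To illustrate the portrait/landscape decision described above, the following is a minimal sketch in Swift. The function name, axis convention, and hysteresis margin are assumptions made for illustration and are not part of the disclosed device.

```swift
import Foundation

enum InterfaceOrientation { case portrait, landscape }

/// Hypothetical helper: chooses a display orientation from raw accelerometer
/// readings (in g) by comparing the magnitude of gravity along the x and y axes.
/// The 0.2 g margin is an assumed hysteresis band to avoid rapid flipping.
func orientation(fromAccelX x: Double, accelY y: Double,
                 current: InterfaceOrientation) -> InterfaceOrientation {
    let margin = 0.2
    if abs(y) > abs(x) + margin { return .portrait }
    if abs(x) > abs(y) + margin { return .landscape }
    return current // within the hysteresis band, keep the current view
}
```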
In some embodiments, the software components stored in memory 102 include operating system 126, communication module (or set of instructions) 128, contact/motion module (or set of instructions) 130, graphics module (or set of instructions) 132, haptic feedback module (or set of instructions) 133, text input module (or set of instructions) 134, Global Positioning System (GPS) module (or set of instructions) 135, and applications (or sets of instructions) 136. Furthermore, in some embodiments, memory 102 stores device/global internal state 157, as shown in Figures 1A and 3. Device/global internal state 157 includes one or more of: an active application state, indicating which applications, if any, are currently active; a display state, indicating what applications, views, or other information occupy various regions of touch-sensitive display system 112; a sensor state, including information obtained from the device's various sensors and other input or control devices 116; and location and/or positional information concerning the location and/or attitude of the device.
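As a rough illustration of the bookkeeping that device/global internal state 157 could hold, a minimal sketch follows; all type and property names are assumptions chosen only for this example.

```swift
import CoreGraphics

/// A sketch, under assumed names, of state resembling device/global internal state 157.
struct DeviceGlobalState {
    var activeApplicationIDs: Set<String>       // active application state
    var displayRegions: [String: CGRect]        // display state: which view occupies which region
    var latestSensorReadings: [String: Double]  // sensor state, keyed by sensor name
    var location: (latitude: Double, longitude: Double)?  // positional information
    var isPortrait: Bool                        // attitude / orientation
}
```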
Operating system 126 (e.g., iOS, Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between the various hardware and software components.
Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used in some iPod devices from Apple Inc. (Cupertino, California). In some embodiments, the external port is a Lightning connector that is the same as, or similar to and/or compatible with, the Lightning connector used in some iPod devices from Apple Inc. (Cupertino, California).
Contact/motion module 130 optionally detects contact with touch-sensitive display system 112 (in conjunction with display controller 156) and with other touch-sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact (e.g., by a finger or by a stylus), such as determining whether contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact, or a substitute for the force or pressure of the contact), determining whether there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining whether the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one-finger contacts or stylus contacts) or to multiple simultaneous contacts (e.g., "multitouch"/multiple-finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.
Contact/motion module 130 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a single-finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift-off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event, followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift-off) event. Similarly, taps, swipes, drags, and other gestures are optionally detected for a stylus by detecting a particular contact pattern for the stylus.
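The following minimal sketch shows the kind of pattern matching this paragraph describes, distinguishing a tap from a swipe by the distance between finger-down and lift-off. The event and gesture types, and the 10-point movement threshold, are assumptions made for illustration.

```swift
import Foundation

// Hypothetical event and gesture types used only for this sketch.
enum ContactEvent {
    case fingerDown(x: Double, y: Double)
    case fingerDrag(x: Double, y: Double)
    case fingerUp(x: Double, y: Double)
}
enum Gesture { case tap, swipe, none }

/// Classifies a completed sequence of contact events as a tap or a swipe by
/// comparing total movement against an assumed threshold (in points).
func classify(_ events: [ContactEvent], movementThreshold: Double = 10) -> Gesture {
    guard case let .fingerDown(x0, y0)? = events.first,
          case let .fingerUp(x1, y1)? = events.last else { return .none }
    let distance = ((x1 - x0) * (x1 - x0) + (y1 - y0) * (y1 - y0)).squareRoot()
    // Lift-off at (substantially) the same position as finger-down reads as a tap;
    // finger-down followed by dragging and lift-off elsewhere reads as a swipe.
    return distance < movementThreshold ? .tap : .swipe
}
```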
In some embodiments, detecting a finger tap gesture depends on the length of time between detecting the finger-down event and the finger-up event, but is independent of the intensity of the finger contact between the finger-down event and the finger-up event. In some embodiments, a tap gesture is detected in accordance with a determination that the length of time between the finger-down event and the finger-up event is less than a predetermined value (e.g., less than 0.1, 0.2, 0.3, 0.4, or 0.5 seconds), regardless of whether the intensity of the finger contact during the tap meets a given intensity threshold (greater than a nominal contact-detection intensity threshold), such as a light press or deep press intensity threshold. A finger tap gesture can therefore satisfy particular input criteria that do not require the characteristic intensity of a contact to satisfy a given intensity threshold in order for the particular input criteria to be met. For clarity, the finger contact in a tap gesture typically needs to satisfy the nominal contact-detection intensity threshold, below which no contact is detected, in order for the finger-down event to be detected. A similar analysis applies to detecting a tap gesture by a stylus or other contact. In cases where the device is capable of detecting a finger or stylus contact hovering over the touch-sensitive surface, the nominal contact-detection intensity threshold optionally does not correspond to physical contact between the finger or stylus and the touch-sensitive surface.
The same concepts apply in an analogous manner to other types of gestures. For example, a swipe gesture, a pinch gesture, a depinch gesture, and/or a long press gesture are optionally detected based on the satisfaction of criteria that are either independent of the intensities of the contacts included in the gesture or do not require that the contact(s) performing the gesture reach an intensity threshold in order to be recognized. For example, a swipe gesture is detected based on the amount of movement of one or more contacts; a pinch gesture is detected based on movement of two or more contacts towards each other; a depinch gesture is detected based on movement of two or more contacts away from each other; and a long press gesture is detected based on a duration of the contact on the touch-sensitive surface with less than a threshold amount of movement. As such, the statement that particular gesture recognition criteria do not require that the intensity of the contact(s) meet a respective intensity threshold in order for the particular gesture recognition criteria to be met means that the particular gesture recognition criteria are capable of being satisfied when the contact(s) in the gesture do not reach the respective intensity threshold, and are also capable of being satisfied when one or more contacts in the gesture meet or exceed the respective intensity threshold. In some embodiments, a tap gesture is detected based on a determination that the finger-down and finger-up events are detected within a predefined time period, without regard to whether the contact is above or below the respective intensity threshold during the predefined time period, and a swipe gesture is detected based on a determination that the contact movement is greater than a predefined magnitude, even if the contact is above the respective intensity threshold at the end of the contact movement. Even in implementations where the detection of a gesture is influenced by the intensity of the contacts performing the gesture (e.g., the device detects a long press more quickly when the intensity of the contact is above an intensity threshold, or delays detection of a tap input when the intensity of the contact is higher), the detection of those gestures does not require that the contacts reach a particular intensity threshold so long as the criteria for recognizing the gesture can be met when the contact does not reach that particular intensity threshold (e.g., even if the amount of time it takes to recognize the gesture changes).
Contact intensity thresholds, duration thresholds, and movement thresholds are, in some circumstances, combined in a variety of different combinations in order to create heuristics for distinguishing two or more different gestures directed to the same input element or region, so that multiple different interactions with the same input element are enabled to provide a richer set of user interactions and responses. The statement that a particular set of gesture recognition criteria do not require that the intensity of the contact(s) meet a respective intensity threshold in order for the particular gesture recognition criteria to be met does not preclude the concurrent evaluation of other, intensity-dependent gesture recognition criteria that identify other gestures whose criteria are met when a gesture includes a contact with an intensity above the respective intensity threshold. For example, in some circumstances, first gesture recognition criteria for a first gesture (which do not require that the intensity of the contact meet a respective intensity threshold in order to be met) compete with second gesture recognition criteria for a second gesture (which depend on the contact reaching the respective intensity threshold). In such competitions, the gesture is, optionally, not recognized as meeting the first gesture recognition criteria for the first gesture if the second gesture recognition criteria for the second gesture are met first. For example, if a contact reaches the respective intensity threshold before the contact moves by a predefined amount of movement, a deep press gesture is detected rather than a swipe gesture. Conversely, if the contact moves by the predefined amount of movement before the contact reaches the respective intensity threshold, a swipe gesture is detected rather than a deep press gesture. Even in such circumstances, the first gesture recognition criteria still do not require that the intensity of the contact meet a respective intensity threshold in order to be met, because if the contact stayed below the respective intensity threshold until the end of the gesture (e.g., a swipe gesture with a contact that never increases above the respective intensity threshold), the gesture would have been recognized by the first gesture recognition criteria as a swipe gesture. As such, particular gesture recognition criteria that do not require the intensity of the contact to meet a respective intensity threshold in order to be met will (A) in some circumstances ignore the intensity of the contact with respect to the intensity threshold (e.g., for a tap gesture) and/or (B) in some circumstances still depend on the intensity of the contact with respect to the intensity threshold (e.g., for a long press gesture that competes with a deep press gesture for recognition), in the sense that the particular gesture recognition criteria (e.g., for a long press gesture) will fail if a competing set of intensity-dependent gesture recognition criteria (e.g., for a deep press gesture) recognize the input as corresponding to an intensity-dependent gesture before the particular gesture recognition criteria recognize a gesture corresponding to the input.
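The competition described above can be summarized with a short sketch: whichever criterion is satisfied first wins. The threshold values and type names below are assumptions for illustration, not parameters taken from the text.

```swift
import Foundation

// Hypothetical sample of a contact at one instant; names and units are assumptions.
struct ContactSample { let time: TimeInterval; let movement: Double; let intensity: Double }
enum Recognized { case swipe, deepPress, undecided }

/// Illustrates competing recognition criteria: if the intensity threshold is reached
/// before the movement threshold, a deep press is detected, and vice versa.
func recognize(samples: [ContactSample],
               movementThreshold: Double = 10,
               intensityThreshold: Double = 0.8) -> Recognized {
    for sample in samples {
        if sample.intensity >= intensityThreshold { return .deepPress }  // intensity reached first
        if sample.movement >= movementThreshold { return .swipe }        // movement reached first
    }
    return .undecided  // neither criterion met yet; could still become, e.g., a long press
}
```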
Graphics module 132 includes various known software components for rendering and displaying graphics on touch-sensitive display system 112 or other displays, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual property) of graphics that are displayed. As used herein, the term "graphics" includes any object that can be displayed to a user, including, without limitation, text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations, and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications and the like, one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.
Haptic feedback module 133 includes various software components for generating instructions (e.g., instructions used by haptic feedback controller 161) to produce tactile outputs using tactile output generator(s) 167 at one or more locations on device 100 in response to user interactions with device 100.
Text input module 134, which is, optionally, a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other application that needs text input).
GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing; to camera 143 as picture/video metadata; and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
Applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof:
contacts module 137 (sometimes called an address book or contact list);
telephone module 138;
video conferencing module 139;
e-mail client module 140;
instant messaging (IM) module 141;
workout support module 142;
camera module 143 for still and/or video images;
image management module 144;
browser module 147;
calendar module 148;
widget modules 149, which optionally include one or more of the following: weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, other widgets obtained by the user, and user-created widget 149-6;
widget creator module 150 for making the user-created widget 149-6;
search module 151;
video and music player module 152, which is, optionally, made up of a video player module and a music player module;
notes module 153;
map module 154; and/or
online video module 155.
Examples of other applications 136 that are, optionally, stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, contacts module 137 includes executable instructions to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es), or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers and/or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference 139, e-mail 140, or IM 141; and so forth.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, telephone module 138 includes executable instructions to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As noted above, the wireless communication optionally uses any of a plurality of communication standards, protocols, and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephone module 138, videoconferencing module 139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions. In conjunction with image management module 144, e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages, or using XMPP, SIMPLE, Apple Push Notification Service (APNs), or IMPS for Internet-based instant messages), to receive instant messages, and to view received instant messages. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos, audio files, video files, and/or other attachments as are supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, "instant messaging" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, APNs, or IMPS).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and video and music player module 152, workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie burning goals); communicate with workout sensors (in sports devices and smart watches); receive workout sensor data; calibrate sensors used to monitor a workout; select and play music for a workout; and display, store, and transmit workout data.
In conjunction with touch-sensitive display system 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them in memory 102, modify characteristics of a still image or video, and/or delete a still image or video from memory 102.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, e-mail client module 140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to-do lists, etc.) in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, widget creator module 150 includes executable instructions to create widgets (e.g., turning a user-specified portion of a web page into a widget).
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions to search for text, music, sound, images, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats (such as MP3 or AAC files), and executable instructions to display, present, or otherwise play back videos (e.g., on touch-sensitive display system 112 or on an external display connected wirelessly or via external port 124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to-do lists, and the like in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 includes executable instructions to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data) in accordance with user instructions.
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, online video module 155 includes executable instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on touch screen 112, or on an external display connected wirelessly or via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video.
Each of the above identified modules and applications corresponds to a set of executable instructions for performing one or more of the functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise rearranged in various embodiments. In some embodiments, memory 102 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device on which the operation of a predefined set of functions is performed exclusively through a touch screen and/or a touchpad. By using a touch screen and/or a touchpad as the primary input control device for operation of device 100, the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced.
The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally includes navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 100 from any user interface that is displayed on device 100 to a main menu, home menu, or root menu. In such embodiments, a "menu button" is implemented using the touchpad. In some other embodiments, the menu button is a physical push button or other physical input control device instead of a touchpad.
Figure 1B is a block diagram illustrating example components for event handling in accordance with some embodiments. In some embodiments, memory 102 (in Figure 1A) or memory 370 (Figure 3) includes event sorter 170 (e.g., in operating system 126) and a respective application 136-1 (e.g., any of the aforementioned applications 136, 137 to 155, or 380 to 390).
Event sorter 170 receives event information and determines the application 136-1, and the application view 191 of application 136-1, to which the event information is to be delivered. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch-sensitive display system 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine the application views 191 to which event information is to be delivered.
In some embodiments, application internal state 192 includes additional information, such as one or more of the following: resume information to be used when application 136-1 resumes execution; user interface state information that indicates information being displayed, or that is ready for display, by application 136-1; a state queue of prior states or views for enabling the user to go back to a previous state or view of application 136-1; and a redo/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display system 112 as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or from a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display system 112 or a touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views, when touch-sensitive display system 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected optionally correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected is, optionally, called the hit view, and the set of events that are recognized as proper inputs is, optionally, determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies the hit view as the lowest view in the hierarchy that should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (i.e., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events are entirely confined to the area associated with one particular view, views higher in the hierarchy still remain actively involved views.
Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments that include active event recognizer determination module 173, event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores the event information in an event queue, which is retrieved by a respective event receiver module 182.
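As a rough sketch of the hit-view determination just described, the following recursive search returns the lowest view in a hierarchy whose frame contains the location of a sub-event. The view type and its coordinate-space convention are assumptions made only for this example; the text above does not specify the actual view API.

```swift
import CoreGraphics

/// Hypothetical view node used only for this sketch of hit-view determination.
final class ViewNode {
    let frame: CGRect           // assumed to be in a shared (window) coordinate space
    var subviews: [ViewNode] = []
    init(frame: CGRect) { self.frame = frame }
}

/// Returns the lowest view in the hierarchy whose frame contains the sub-event's
/// location: the "hit view" that should receive all sub-events of that touch.
func hitView(in root: ViewNode, at point: CGPoint) -> ViewNode? {
    guard root.frame.contains(point) else { return nil }
    for child in root.subviews {
        if let deeper = hitView(in: child, at: point) { return deeper }
    }
    return root  // no child contains the point, so this view is the hit view
}
```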
In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130.
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of the event recognizers 180 are part of a separate module, such as a user interface kit (not shown) or a higher-level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of the following: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177, or GUI updater 178 to update application internal state 192. Alternatively, one or more of the application views 191 include one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170 and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of metadata 183 and event delivery instructions 188 (which optionally include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as the location of the sub-event. When the sub-event concerns motion of a touch, the event information optionally also includes the speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event 187 include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first lift-off (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second lift-off (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display system 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
In some embodiments, event definitions 187 include a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display system 112, when a touch is detected on touch-sensitive display system 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
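To make the comparison against an event definition concrete, here is a minimal sketch that checks an observed sub-event sequence against a double-tap definition. The sub-event names, the 0.5-second budget, and the matching rule are assumptions used only for illustration; the actual event definitions 186 are not specified at this level of detail.

```swift
import Foundation

// Hypothetical sub-event and definition types for illustrating event comparator 184.
enum SubEvent: Equatable { case touchBegin, touchEnd, touchMove, touchCancel }

struct EventDefinition {
    let name: String
    let sequence: [SubEvent]
    let maxDuration: TimeInterval
}

let doubleTap = EventDefinition(name: "double tap",
                                sequence: [.touchBegin, .touchEnd, .touchBegin, .touchEnd],
                                maxDuration: 0.5)  // assumed overall time budget

/// Returns true if the observed sub-events match a definition's sequence and arrive
/// within its time budget, mirroring the comparison performed by event comparator 184.
func matches(_ observed: [(subEvent: SubEvent, time: TimeInterval)],
             definition: EventDefinition) -> Bool {
    guard observed.map({ $0.subEvent }) == definition.sequence,
          let first = observed.first, let last = observed.last else { return false }
    return (last.time - first.time) <= definition.maxDuration
}
```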
In some embodiments, the definition for a respective event 187 also includes delayed actions that delay delivery of the event information until it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.
When a respective event recognizer 180 determines that the series of sub-events does not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers that remain active for the hit view, if any, continue to track and process sub-events of the ongoing touch-based gesture.
In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
In some embodiments, a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of the event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (and deferring the sending of) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates a telephone number used in contacts module 137, or stores a video file used in video and music player module 152. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on the touch-sensitive display.
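The division of work among the three updaters can be sketched as follows. The protocols, method signatures, and the example event are hypothetical stand-ins invented for this illustration; the text does not define an API for data updater 176, object updater 177, or GUI updater 178.

```swift
import Foundation

// Hypothetical protocols standing in for data updater 176, object updater 177,
// and GUI updater 178.
protocol DataUpdating   { func update(phoneNumber: String, forContact id: String) }
protocol ObjectUpdating { func moveObject(id: String, to position: (x: Double, y: Double)) }
protocol GUIUpdating    { func refreshDisplay() }

/// An event handler that reacts to a recognized event by delegating to the three
/// updaters, mirroring how event handler 190 updates application internal state.
struct ExampleEventHandler {
    let data: DataUpdating
    let objects: ObjectUpdating
    let gui: GUIUpdating

    func handleContactEdited(contactID: String, newNumber: String,
                             newPosition: (x: Double, y: Double)) {
        data.update(phoneNumber: newNumber, forContact: contactID)  // update stored data
        objects.moveObject(id: contactID, to: newPosition)          // update its UI object
        gui.refreshDisplay()                                        // push the new GUI state
    }
}
```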
In some embodiments, event handler(s) 190 include, or have access to, data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
It should be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs used to operate multifunction device 100 with input devices, not all of which are initiated on touch screens. For example, mouse movements and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements on a touchpad, such as taps, drags, scrolls, and the like; stylus inputs; movement of the device; spoken commands; detected eye movements; biometric inputs; and/or any combination thereof are optionally used as inputs corresponding to sub-events that define an event to be recognized.
Figure 2 illustrates a portable multifunction device 100 having a touch screen (e.g., touch-sensitive display system 112, Figure 1A) in accordance with some embodiments. The touch screen optionally displays one or more graphics within user interface (UI) 200. In these embodiments, as well as others described below, a user is enabled to select one or more of the graphics by making a gesture on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure) or one or more styluses 203 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (from left to right, from right to left, upward, and/or downward), and/or a rolling of a finger (from right to left, from left to right, upward, and/or downward) that has made contact with device 100. In some implementations or circumstances, inadvertent contact with a graphic does not select the graphic. For example, a swipe gesture that sweeps over an application icon optionally does not select the corresponding application when the gesture corresponding to selection is a tap.
Device 100 optionally also includes one or more physical buttons, such as a "home" button or menu button 204. As described previously, menu button 204 is, optionally, used to navigate to any application 136 in a set of applications that are, optionally, executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on the touch-screen display.
In some embodiments, device 100 includes the touch-screen display, menu button 204 (sometimes called home button 204), push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, Subscriber Identity Module (SIM) card slot 210, headset jack 212, and docking/charging external port 124. Push button 206 is, optionally, used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In some embodiments, device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113. Device 100 also, optionally, includes one or more contact intensity sensors 165 for detecting intensities of contacts on touch-sensitive display system 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100.
Figure 3 is a block diagram of an example multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. Device 300 need not be portable. In some embodiments, device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child's learning toy), a gaming system, or a control device (e.g., a home or industrial controller). Device 300 typically includes one or more processing units (CPUs) 310, one or more network or other communication interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. Communication buses 320 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. Device 300 includes an input/output (I/O) interface 330 comprising display 340, which is typically a touch-screen display. I/O interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350 and touchpad 355, tactile output generator 357 for generating tactile outputs on device 300 (e.g., similar to tactile output generator(s) 167 described above with reference to Figure 1A), and sensors 359 (e.g., optical, acceleration, proximity, touch-sensitive, and/or contact intensity sensors similar to contact intensity sensor(s) 165 described above with reference to Figure 1A). Memory 370 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid-state memory devices, and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. Memory 370 optionally includes one or more storage devices remotely located from CPU(s) 310. In some embodiments, memory 370 stores programs, modules, and data structures analogous to the programs, modules, and data structures stored in memory 102 of portable multifunction device 100 (Figure 1A), or a subset thereof. Furthermore, memory 370 optionally stores additional programs, modules, and data structures not present in memory 102 of portable multifunction device 100. For example, memory 370 of device 300 optionally stores drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk authoring module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (Figure 1A) optionally does not store these modules.
Each of the above identified elements in Figure 3 is optionally stored in one or more of the previously mentioned memory devices. Each of the above identified modules corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise rearranged in various embodiments. In some embodiments, memory 370 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 370 optionally stores additional modules and data structures not described above.
Attention is now directed toward embodiments of user interfaces ("UI") that are, optionally, implemented on portable multifunction device 100.
Figure 4A illustrates an example user interface for a menu of applications on portable multifunction device 100 in accordance with some embodiments. Similar user interfaces are, optionally, implemented on device 300. In some embodiments, user interface 400 includes the following elements, or a subset or superset thereof:
Signal strength indicator(s) for wireless communication(s), such as cellular and Wi-Fi signals;
Time;
Bluetooth indicator;
Battery status indicator;
Tray 408 with icons for frequently used applications, such as:
o Icon 416 for telephone module 138, labeled "Phone," which optionally includes an indicator 414 of the number of missed calls or voicemail messages;
o Icon 418 for e-mail client module 140, labeled "Mail," which optionally includes an indicator 410 of the number of unread e-mails;
o Icon 420 for browser module 147, labeled "Browser"; and
o Icon 422 for video and music player module 152, labeled "Music"; and
Icons for other applications, such as:
o Icon 424 for IM module 141, labeled "Messages";
o Icon 426 for calendar module 148, labeled "Calendar";
o Icon 428 for image management module 144, labeled "Photos";
o Icon 430 for camera module 143, labeled "Camera";
o Icon 432 for online video module 155, labeled "Online Video";
o Icon 434 for stocks widget 149-2, labeled "Stocks";
o Icon 436 for map module 154, labeled "Maps";
o Icon 438 for weather widget 149-1, labeled "Weather";
o Icon 440 for alarm clock widget 149-4, labeled "Clock";
o Icon 442 for workout support module 142, labeled "Workout Support";
o Icon 444 for notes module 153, labeled "Notes"; and
o Icon 446 for a settings application or module, which provides access to settings for device 100 and its various applications 136.
It should be noted that icon label shown in Fig. 4 A is only exemplary.For example, other labels are optionally for each Kind application icon.In some embodiments, the label of corresponding application programs icon includes and the corresponding application programs figure Mark the title of corresponding application program.In some embodiments, the label of application-specific icon is different from specific with this The title of the corresponding application program of application icon.
Fig. 4 B is shown with the touch sensitive surface 451 separated with display 450 (for example, plate or Trackpad in Fig. 3 355) the exemplary user interface in equipment (for example, equipment 300 in Fig. 3).Although touch-screen display 112 will be referred to Input on (being wherein combined with touch sensitive surface and display) provides subsequent many examples, but in some embodiments, The input on touch sensitive surface that equipment detection is separated with display, as shown in Figure 4 B.In some embodiments, touch sensitive surface (for example, 451 in Fig. 4 B) have master corresponding with main shaft (for example, 453 in Fig. 4 B) on display (for example, 450) Axis (for example, 452 in Fig. 4 B).According to these embodiments, equipment detects position corresponding with corresponding position on display Place the contact with touch sensitive surface 451 (for example, 460 in Fig. 4 B and 462) (for example, in figure 4b, 460 correspond to 468 and 470) 462 correspond to.In this way, in the display of touch sensitive surface (for example, 451 in Fig. 4 B) and multifunctional equipment (for example, Fig. 4 B In 450) marquis when being separated, by equipment user's input detected on touch sensitive surface (for example, contact 460 and 462 with And their movement) be used to manipulate the user interface on display by the equipment.It should be appreciated that similar method optionally for Other users interface as described herein.
As used herein, term " focus selector " refers to the user interface for being used to indicate that user is just interacting therewith The input element of current portions.In some specific implementations for including cursor or other positions label, cursor serves as " focus selection Device ", so that when cursor is above particular user interface element (for example, button, window, sliding block or other users interface element) Detect input (for example, pressing on touch sensitive surface (for example, touch sensitive surface 451 in Trackpad 355 or Fig. 4 B in Fig. 3) Input) in the case where, which is adjusted according to detected input.It is including making it possible to realize With the touch-screen display of the user interface element on touch-screen display directly interacted (for example, the touch-sensitive display in Figure 1A Touch screen in device system 112 or Fig. 4 A) some specific implementations in, " focus choosing is served as in the contact detected on the touchscreen Select device " so that working as on touch-screen display in particular user interface element (for example, button, window, sliding block or other users Interface element) position at detect input (for example, by contact pressing input) when, adjusted according to detected input Whole particular user interface element.In some specific implementations, focus is moved to user interface from a region of user interface Another region, the movement of the contact in correspondence movement or touch-screen display without cursor is (for example, by using tabulation Focus is moved to another button from a button by key or arrow key);In these specific implementations, focus selector is according to coke It puts the movement between the different zones of user interface and moves.The concrete form that focus selector is taken, focus are not considered Selector is usually from user's control to transmit with the desired interaction of the user of user interface (for example, by indicating to equipment The user of user interface it is expected the element interacted) user interface element (or contact on touch-screen display). For example, when detecting pressing input on touch sensitive surface (for example, touch tablet or touch screen), focus selector (for example, cursor, Contact or choice box) position above the corresponding button will indicate that user it is expected to activate the corresponding button (rather than device display On the other users interface element that shows).
In some embodiments, the response of the device to inputs detected by the device depends on criteria based on the intensity of the contact during the input. For example, for some "light press" inputs, the intensity of a contact exceeding a first intensity threshold during the input triggers a first response. In some embodiments, the response of the device to inputs detected by the device depends on criteria that include both the intensity of the contact during the input and time-based criteria. For example, for some "deep press" inputs, the intensity of a contact exceeding a second intensity threshold, greater than the first intensity threshold for a light press, triggers a second response only if a delay time has elapsed between meeting the first intensity threshold and meeting the second intensity threshold. This delay time is typically less than 200 ms (milliseconds) in duration (e.g., 40 ms, 100 ms, or 120 ms, depending on the magnitude of the second intensity threshold, with the delay time increasing as the second intensity threshold increases). The delay time helps to avoid accidental recognition of deep press inputs. As another example, for some "deep press" inputs, there is a reduced-sensitivity time period that occurs after the time at which the first intensity threshold is met. During the reduced-sensitivity time period, the second intensity threshold is increased. This temporary increase of the second intensity threshold also helps to avoid accidental deep press inputs. For other deep press inputs, the response to detecting a deep press input does not depend on time-based criteria.
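To make the interplay of the two thresholds and the delay time concrete, the following Swift sketch classifies a press from time-stamped intensity samples. The threshold values, the 100 ms delay, and all names are illustrative assumptions, not values taken from this description.

```swift
import Foundation

// Illustrative sketch only: the threshold values, the delay, and the names
// below are assumptions, not values used by any particular device.
struct PressThresholds {
    var light: Double = 0.3              // first intensity threshold
    var deep: Double = 0.6               // second intensity threshold
    var deepDelay: TimeInterval = 0.10   // required delay between meeting the two thresholds
}

enum PressResponse { case none, light, deep }

/// Classifies a press from time-stamped intensity samples of a single contact.
func classifyPress(samples: [(time: TimeInterval, intensity: Double)],
                   thresholds: PressThresholds = PressThresholds()) -> PressResponse {
    guard let lightCrossing = samples.first(where: { $0.intensity > thresholds.light }) else {
        return .none
    }
    // The "deep press" response fires only if the second threshold is exceeded
    // after the delay time has elapsed since the first threshold was met.
    let deepEligibleAt = lightCrossing.time + thresholds.deepDelay
    let deepMet = samples.contains { $0.time >= deepEligibleAt && $0.intensity > thresholds.deep }
    return deepMet ? .deep : .light
}
```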
In some embodiments, one or more of the input intensity thresholds and/or the corresponding outputs vary based on one or more factors, such as user settings, contact motion, input timing, applications running, rate at which the intensity is applied, number of concurrent inputs, user history, environmental factors (e.g., ambient noise), focus selector position, and the like. Exemplary factors are described in U.S. Patent Application Serial Nos. 14/399,606 and 14/624,296, which are incorporated by reference herein in their entireties.
For example, Fig. 4C illustrates a dynamic intensity threshold 480 that changes over time based in part on the intensity of touch input 476 over time. Dynamic intensity threshold 480 is a sum of two components: a first component 474 that decays over time after a predefined delay time p1 from when touch input 476 is initially detected, and a second component 478 that trails the intensity of touch input 476 over time. The initial high intensity threshold of first component 474 reduces accidental triggering of a "deep press" response, while still allowing an immediate "deep press" response if touch input 476 provides sufficient intensity. Second component 478 reduces unintentional triggering of a "deep press" response by gradual intensity fluctuations of a touch input. In some embodiments, when touch input 476 satisfies dynamic intensity threshold 480 (e.g., at point 481 in Fig. 4C), the "deep press" response is triggered.
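A rough way to picture the two components of Fig. 4C in code is shown below. The decay curve, the trailing fraction, and the concrete constants are assumptions chosen only to illustrate the shape described in the text.

```swift
import Foundation

// Sketch of a dynamic intensity threshold built from two components, as
// described for Fig. 4C. All constants here are illustrative assumptions.
struct DynamicIntensityThreshold {
    var initialCeiling: Double = 0.8     // starting value of the first component
    var delayP1: TimeInterval = 0.10     // the first component holds for p1, then decays
    var decayRate: Double = 4.0          // exponential decay rate after p1
    var trailingFraction: Double = 0.5   // the second component trails recent input intensity

    /// Threshold value at time `t` since the touch was first detected, given a
    /// smoothed recent intensity of the touch input.
    func value(at t: TimeInterval, recentIntensity: Double) -> Double {
        let first = t <= delayP1
            ? initialCeiling
            : initialCeiling * exp(-decayRate * (t - delayP1))
        let second = trailingFraction * recentIntensity
        return first + second
    }
}

// A "deep press" response would be triggered at the first sample whose intensity
// exceeds value(at:recentIntensity:), corresponding to point 481 in Fig. 4C.
```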
Fig. 4D illustrates another dynamic intensity threshold 486 (e.g., intensity threshold ID). Fig. 4D also illustrates two other intensity thresholds: a first intensity threshold IH and a second intensity threshold IL. In Fig. 4D, although touch input 484 satisfies the first intensity threshold IH and the second intensity threshold IL prior to time p2, no response is provided until delay time p2 has elapsed at time 482. Also in Fig. 4D, dynamic intensity threshold 486 decays over time, with the decay starting at time 488, after a predefined delay time p1 has elapsed from time 482 (when the response associated with the second intensity threshold IL was triggered). This type of dynamic intensity threshold reduces accidental triggering of a response associated with the dynamic intensity threshold ID immediately after, or concurrently with, triggering a response associated with a lower intensity threshold, such as the first intensity threshold IH or the second intensity threshold IL.
Fig. 4E illustrates yet another dynamic intensity threshold 492 (e.g., intensity threshold ID). In Fig. 4E, a response associated with intensity threshold IL is triggered after delay time p2 has elapsed from when touch input 490 is initially detected. Concurrently, dynamic intensity threshold 492 decays after predefined delay time p1 has elapsed from when touch input 490 is initially detected. Thus, a decrease in the intensity of touch input 490 after triggering the response associated with intensity threshold IL, followed by an increase in the intensity of touch input 490 without releasing touch input 490, can trigger a response associated with intensity threshold ID (e.g., at time 494), even when the intensity of touch input 490 is below another intensity threshold (e.g., intensity threshold IL).
User interfaces and associated processes
Attention is now directed toward embodiments of user interfaces ("UI") and associated processes that are implemented on an electronic device, such as portable multifunction device 100 or device 300, with a display, a touch-sensitive surface, (optionally) one or more tactile output generators for generating tactile outputs, and (optionally) one or more sensors for detecting intensities of contacts with the touch-sensitive surface.
Figs. 5A1-5A29 illustrate, in accordance with some embodiments, exemplary user interfaces for displaying a taskbar with multiple application icons at variable locations along one or more edges of a touch-sensitive display; this, for example, allows a user to call up the taskbar at a location close to the user's current hand position and interact with it (e.g., without significant displacement of the current hand position). The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figs. 6A-6F. For ease of explanation, some of the embodiments will be discussed with reference to operations performed on a device with a touch-sensitive display system 112. In such embodiments, the focus selector is, optionally: a respective finger or stylus contact, a representative point corresponding to a finger or stylus contact (e.g., a centroid of a respective contact or a point associated with a respective contact), or a centroid of two or more contacts detected on the touch-sensitive display system 112. However, analogous operations are, optionally, performed on a device with a display 450 and a separate touch-sensitive surface 451, in response to detecting the contacts on the touch-sensitive surface 451 while displaying the user interfaces shown in the figures, together with a focus selector, on the display 450.
For ease of explanation, some of the embodiments will be discussed with reference to operations performed on a device that does not have a home button, in which a gesture that meets predefined criteria is used to dismiss a currently displayed user interface and display the home screen user interface. Although shown as optional in Figs. 5A1-5A77, in some embodiments the device includes a home button (e.g., a mechanical button, a solid-state button, or a virtual button), and the home button is used to dismiss the currently displayed user interface and display the home screen user interface (e.g., in response to a single press input) and/or to display a multitasking user interface (e.g., in response to a double press input).
The home screen user interface includes multiple application icons corresponding to different applications installed on the device. When activated by a user (e.g., by a tap input), each application icon causes the device to launch the corresponding application and display its user interface (e.g., a default initial user interface or the last displayed user interface of the application). The taskbar is a user interface object that contains a subset of the application icons available on the home screen user interface, providing quick access to a small number of frequently used applications. The application icons included in the taskbar are optionally selected by the user (e.g., via a settings user interface), or are selected automatically by the device based on various criteria (e.g., frequency of use, or time since last use). In some embodiments, the taskbar is displayed as part of the home screen user interface (e.g., overlaying a bottom portion of the home screen user interface, as shown in Fig. 4A). In some embodiments, in response to a user request (e.g., a gesture that meets taskbar-display criteria), the taskbar is displayed over a portion of another user interface (e.g., an application user interface) independently of the home screen user interface. The application switcher user interface displays representations of multiple recently opened applications (e.g., arranged in order based on the time the applications were last displayed). When selected (e.g., by a tap input), a representation of a respective recently opened application (e.g., a snapshot of the last displayed user interface of that application) causes the device to redisplay, on the screen, the last displayed user interface of the respective recently opened application.
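As a rough illustration of the automatic selection just described, the Swift sketch below ranks applications by recency and frequency of use to fill the taskbar slots that the user has not pinned. The ranking rule, the four-icon capacity, and the type names are assumptions made for this example, not details taken from this description.

```swift
import Foundation

// Minimal sketch of automatic taskbar icon selection, assuming a record of launch
// counts and last-use timestamps. The ranking rule, the four-icon capacity, and
// the type names are assumptions made for this example.
struct AppUsage {
    var bundleID: String
    var launchCount: Int
    var lastUsed: Date
}

func taskbarIcons(userPinned: [String], usage: [AppUsage], capacity: Int = 4) -> [String] {
    var icons = Array(userPinned.prefix(capacity))
    let remaining = capacity - icons.count
    guard remaining > 0 else { return icons }
    // Fill the remaining slots automatically: rank by recency of use, then by frequency.
    let automatic = usage
        .filter { !icons.contains($0.bundleID) }
        .sorted { ($0.lastUsed, $0.launchCount) > ($1.lastUsed, $1.launchCount) }
        .prefix(remaining)
        .map { $0.bundleID }
    icons.append(contentsOf: automatic)
    return icons
}
```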
Figs. 5A1-5A5 illustrate exemplary embodiments in which the electronic device displays the taskbar at different locations along an edge of the device, depending on the location of the input that calls it up (e.g., a long press along the edge). Fig. 5A1 shows an interactive maps user interface displayed in a full-screen display mode. A long press gesture detected at a location on the left side of the bottom edge of the display (the bottom edge being defined, e.g., relative to the current orientation of the interactive maps user interface) — for example, contact 4202 being maintained at a fixed location (e.g., its touch-down location) with less than a threshold amount of movement for at least a threshold amount of time TT1 — causes taskbar 4204 to be displayed at a corresponding location along the left side of the bottom edge of the device (e.g., centered on contact 4202), as shown in Figs. 5A1-5A2. After lift-off of contact 4202 in Fig. 5A3, the taskbar remains displayed, because the contact did not move substantially during the input (e.g., it remained substantially stationary). In contrast, a long press gesture detected at a location on the right side of the bottom edge of the display (e.g., by contact 4206) causes taskbar 4204 to be displayed at a corresponding location along the right side of the bottom edge of the device (e.g., centered on contact 4206), as shown in Figs. 5A4-5A5. The taskbar is displayed on the right side of the bottom edge of the display in Fig. 5A5, as opposed to the left side of the bottom edge in Fig. 5A2, because the long press input that called up the taskbar was located on the right-hand side of the bottom edge; this allows the user to interact with the taskbar at a location that is easy for the user to reach (e.g., without requiring the user to move their hand to a preset location on the device). In some embodiments, instead of requiring a long press gesture in the edge region of the touch screen (e.g., requiring the contact to be maintained at a fixed location for at least the threshold amount of time TT1, and, optionally, requiring the intensity of the contact to remain below a first threshold intensity that is greater than a contact-detection intensity threshold) to call up the taskbar, the device requires a light press gesture in the edge region of the touch screen (e.g., requiring the intensity of the contact to increase above the first threshold intensity, which is greater than the contact-detection intensity threshold, and, optionally, without requiring the contact to be maintained at a fixed location for at least the threshold amount of time TT1) to call up the taskbar.
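The position-dependent behavior of Figs. 5A1-5A5 can be summarized as two small checks: does the touch qualify as a long press, and where along the edge should the taskbar be centered? The Swift sketch below captures this under assumed values for TT1, the movement tolerance, and the taskbar size; none of these numbers come from this description.

```swift
import CoreGraphics
import Foundation

// Sketch of the long-press test and taskbar placement described for Figs. 5A1-5A5.
// TT1, the movement tolerance, and the taskbar size are illustrative values only.
struct EdgeTouch {
    var downPoint: CGPoint
    var currentPoint: CGPoint
    var duration: TimeInterval
}

let TT1: TimeInterval = 0.5            // assumed long-press time threshold
let movementTolerance: CGFloat = 10    // assumed "substantially stationary" tolerance

func shouldShowTaskbar(for touch: EdgeTouch) -> Bool {
    let dx = touch.currentPoint.x - touch.downPoint.x
    let dy = touch.currentPoint.y - touch.downPoint.y
    let moved = (dx * dx + dy * dy).squareRoot()
    return touch.duration >= TT1 && moved < movementTolerance
}

/// Taskbar frame along the bottom edge, centered on the contact (cf. Figs. 5A2 and 5A5).
func taskbarFrame(centeredOn contact: CGPoint, screen: CGRect,
                  taskbarSize: CGSize = CGSize(width: 320, height: 90)) -> CGRect {
    return CGRect(x: contact.x - taskbarSize.width / 2,
                  y: screen.maxY - taskbarSize.height,
                  width: taskbarSize.width,
                  height: taskbarSize.height)
}
```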
Figs. 5A4-5A8 illustrate an exemplary embodiment in which a single input (e.g., a multi-part input by a continuously maintained contact 4206) displays the taskbar and then navigates to the application user interface associated with an application icon shown in the taskbar. Fig. 5A4 shows an interactive maps user interface displayed in a full-screen display mode. A long press gesture by contact 4206 at a location on the right side of the bottom edge of the display causes taskbar 4204 to be displayed at a corresponding location along the right side of the bottom edge of the device (e.g., centered on contact 4206), as shown in Figs. 5A4-5A5. Movement of contact 4206 over mail application icon 218 in taskbar 4204 selects the icon, and the icon is displayed at a larger size in Fig. 5A6 as a result of being selected. While mail application icon 218 is selected, contact 4206 lifts off, causing navigation to the email user interface, as shown in Figs. 5A7-5A8. As shown in Figs. 5A7-5A8, the display of the email user interface is animated so that it appears to emerge from the selected mail application icon 218 and cover the interactive maps user interface. In Figs. 5A7-5A8, after navigating to the email user interface, the taskbar ceases to be displayed, because the input that called up the taskbar included movement and caused a navigation event. If lift-off of contact 4206 is not detected while contact 4206 is over mail application icon 218, and the movement of contact 4206 continues to a location corresponding to phone application icon 216 in the taskbar, the mail application icon ceases to be selected and the phone application icon becomes selected. If lift-off of contact 4206 is detected after contact 4206 has moved away from taskbar 4204, the device optionally ceases to display the taskbar while maintaining display of the interactive maps user interface.
Figs. 5A9-5A10 illustrate an exemplary embodiment in which a long press input on a different edge of the device also causes the taskbar to be displayed at a location close to the input. Fig. 5A9 shows an email user interface. A long press gesture detected at a location in the lower half of the left edge of the device (e.g., by contact 4208) causes taskbar 4204 to be displayed at a corresponding location along the lower half of the left edge of the device (e.g., centered on the contact), as shown in Figs. 5A9-5A10. Compared with Figs. 5A2 and 5A5, the taskbar is displayed on a different edge of the device in Fig. 5A10, because the long press input that called up the display is located on a different edge. In addition, compared with Figs. 5A2 and 5A5, the taskbar is displayed in a different orientation, because it is displayed along a vertical edge of the device rather than a horizontal edge.
Figs. 5A9-5A12 illustrate an exemplary embodiment in which, even though no navigation event occurred as a result of the input, display of the taskbar is dismissed upon lift-off of the contact 4208 that called it up. Fig. 5A9 shows an email user interface. A long press gesture on the lower half of the left edge of the device, including contact 4208 over the MobileFinder email header in Fig. 5A9, causes taskbar 4204 to be displayed under the contact, along the lower half of the left edge of the device, in Fig. 5A10. In Fig. 5A12, after lift-off of the contact, the taskbar ceases to be displayed, because the contact moved away from the taskbar in Figs. 5A10-5A11 (e.g., the contact was not located over the taskbar at the time of lift-off).
Figs. 5A13-5A14 illustrate an exemplary embodiment in which a gesture detected in the edge region of the touch screen (e.g., a tap or a light press) causes an operation within the displayed application user interface, rather than display of the taskbar, because the gesture does not meet long press criteria (e.g., lift-off of the contact is detected before the contact has been maintained, without substantial movement, for at least a threshold amount of time). Fig. 5A13 shows an email user interface. A tap gesture or light press gesture on the lower half of the left edge of the device, including contact 4209 over the MobileFinder email header in Fig. 5A13, causes the MobileFinder email to be selected/displayed in Fig. 5A14, rather than causing the taskbar to be displayed as in Fig. 5A12, because the time threshold (e.g., TT1) required to invoke the system-wide taskbar display operation (which preempts the corresponding mail-specific email selection/display operation of the mail application) was not met before lift-off of the contact.
Figs. 5A15-5A18 illustrate an exemplary embodiment in which the taskbar is hidden by a downward swipe. Fig. 5A15 shows an interactive maps user interface displayed in a full-screen display mode. A long press gesture on the right side of the bottom edge of the display, including contact 4212 in Fig. 5A15, causes taskbar 4204 to be displayed under contact 4212, along the right side of the bottom edge of the device, in Fig. 5A16. In Fig. 5A17, the contact moves downward, causing the taskbar to slide off the bottom edge of the display. In Fig. 5A18, after lift-off of the contact, the taskbar ceases to be displayed, because the contact pushed the taskbar off the display in Figs. 5A16-5A17. In Fig. 5A16, the taskbar is displayed at a location below contact 4212, but is not centered on contact 4212, because the contact is located close to an adjacent vertical edge of the display (e.g., the right edge of the display). In this case, the taskbar is displayed abutting the adjacent vertical edge of the display.
Figs. 5A19-5A21 illustrate an exemplary embodiment in which lift-off of the contact causes the taskbar to expand and move to a predetermined location on the display. Fig. 5A19 shows an interactive maps user interface displayed in a full-screen display mode. A long press gesture on the left side of the bottom edge of the display, including contact 4216 in Fig. 5A19, causes taskbar 4204 to be displayed under contact 4216, along the left side of the bottom edge of the device, in Fig. 5A20. After lift-off of contact 4216, in Fig. 5A21, the taskbar moves from position 4204-a in Fig. 5A20 to a predefined position 4204-b in the middle of the bottom edge of the display. Compared with its display at the position defined by the input that called it up, the taskbar also optionally expands when displayed at the predefined position.
Figs. 5A22-5A23 illustrate an exemplary embodiment in which the taskbar is displayed at a default location when the long press gesture is too close to the end of a display edge. Fig. 5A22 shows an interactive maps user interface displayed in a full-screen display mode. A long press gesture on the right side of the bottom edge of the display, including contact 4218 in Fig. 5A22, causes taskbar 4204 to be displayed at a default location at the right end of the bottom edge of the display, rather than centered on contact 4218 in Fig. 5A23, because if the taskbar were centered on contact 4218 (e.g., such that the right-hand portion of the taskbar would extend rightward off the display), not all of the taskbar could be displayed on the display.
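The fallback behavior of Fig. 5A23 amounts to clamping the centered position so the taskbar stays fully on screen. A minimal sketch, assuming a horizontal bottom edge and an arbitrary margin:

```swift
import CoreGraphics

// Sketch of the placement rule in Figs. 5A22-5A23: center the taskbar on the contact,
// but if that would push part of it off screen, fall back to a default position flush
// with the nearer end of the edge. The margin value is an assumption.
func taskbarOriginX(contactX: CGFloat, taskbarWidth: CGFloat, screenWidth: CGFloat,
                    edgeMargin: CGFloat = 8) -> CGFloat {
    let centered = contactX - taskbarWidth / 2
    let minX = edgeMargin
    let maxX = screenWidth - taskbarWidth - edgeMargin
    if centered < minX { return minX }   // too close to the left end of the edge
    if centered > maxX { return maxX }   // too close to the right end (Fig. 5A23)
    return centered
}
```

With a 320-point taskbar on a 768-point-wide screen, for example, a contact at x = 740 yields the right-flush default origin of 440 rather than the off-screen centered origin of 580.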
Figs. 5A22-5A27 illustrate an exemplary embodiment in which a single gesture initiated from the edge of the display causes an application to be displayed in a split-screen display mode. Fig. 5A22 shows an interactive maps user interface displayed in a full-screen display mode. A long press gesture on the right side of the bottom edge of the display, including contact 4218 in Fig. 5A22, causes taskbar 4204 to be displayed at a default location close to the right end of the bottom edge of the display, rather than centered on contact 4218 in Fig. 5A23. Movement of the contact over mail application icon 218 selects the icon, and the icon is displayed at a larger size in Fig. 5A24 as a result of being selected. In Fig. 5A25, while the mail application icon is selected, the contact moves upward away from the edge of the display, dragging the icon out of the taskbar; the icon is displayed at a larger size as a result of being moved out of the taskbar, indicating that the application will be launched when contact 4218 lifts off. In Fig. 5A26, further movement of the contact past boundary 4223 (e.g., an invisible boundary, or a boundary that is temporarily displayed in response to detecting the upward and rightward movement of icon 218 outside the taskbar) causes the icon to transition into a view of the email user interface, indicating that, upon lift-off of the contact, the mail application will be launched in a split-screen display mode (e.g., displayed side by side with the interactive maps user interface). In Fig. 5A27, lift-off of contact 4218 causes the device to switch from the full-screen display mode to the split-screen display mode, displaying the user interface for the mail application on the right half of the display and the interactive maps user interface on the left half of the display. The mail application user interface is displayed in the split-screen mode because the icon was dragged out of the taskbar before lift-off of the contact, in contrast to Fig. 5A8, in which the email user interface is displayed in a full-screen display mode because lift-off of the contact occurred while the mail icon was selected within the taskbar in Figs. 5A6-5A7.
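The lift-off outcome in this sequence depends on whether the selected icon was dragged past the boundary before the contact lifted off. The following sketch encodes that decision; the type names are assumptions, and the intermediate case (icon dragged out of the taskbar but not past the boundary) is treated as a full-screen launch only for illustration.

```swift
// Sketch of the lift-off decision contrasted in Figs. 5A6-5A8 and 5A22-5A27.
enum LaunchStyle { case none, fullScreen, splitScreen }

struct TaskbarDragState {
    var iconSelected: Bool      // an application icon in the taskbar is currently selected
    var crossedBoundary: Bool   // the icon has been dragged past the boundary (4223 in Fig. 5A26)
}

func launchStyle(onLiftOff state: TaskbarDragState) -> LaunchStyle {
    guard state.iconSelected else { return .none }
    if state.crossedBoundary {
        return .splitScreen     // Fig. 5A27: shown side by side with the current application
    }
    return .fullScreen          // Fig. 5A8: lift-off while the icon is still selected in the taskbar
}
```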
Figs. 5A28-5A29 illustrate an exemplary embodiment in which a gesture initiated at the edge of the display causes navigation to a transitional navigation state, rather than display of the taskbar, because the contact moved away from the edge of the display before the time requirement for a long press gesture was met. Fig. 5A28 shows an interactive maps user interface displayed in a full-screen display mode. In Fig. 5A29, the user interface selection process is activated by moving contact 5222 upward from the bottom edge of the display, because the contact moved by a sufficient amount before the long press criteria were met. In contrast, the taskbar is displayed in Fig. 5A23 because the long press criteria were met before contact 4218 started substantial movement. In Fig. 5A29, the interactive maps user interface is replaced by (e.g., transitions into) card 4014, which represents the interactive maps user interface. After the user interface selection process is activated, e.g., as shown in Fig. 5A29, the device selects among multiple possible target user interfaces (e.g., the user interface of a previously displayed application, the application switcher user interface, or the home screen user interface), depending on which target user interface state is currently selected when lift-off of the contact is detected. The target user interface state is selected dynamically, which helps the user navigate to different user interfaces (e.g., a recently opened application, the home screen user interface, or the application switcher user interface) based on various criteria (e.g., different criteria based on the position, timing, and movement parameters of the contact and/or the displayed user interface objects). In addition, real-time visual feedback is provided to indicate which user interface the navigation is heading toward as the contact moves on the touch screen. The respective criteria for navigating to the different user interfaces are described, for example, with respect to Fig. 8.
In some embodiments, when the currently displayed user interface is displayed in a full-screen display mode (e.g., as shown in Figs. 5A28-5A29), the device follows a first set of criteria for navigating to different user interfaces in the full-screen display mode; and when the currently displayed user interface is displayed in a split-screen display mode, the device follows a second set of criteria for navigating to different user interfaces within the split-screen display mode (e.g., navigating to a recently opened application user interface, or to the application switcher user interface, within a sub-portion of the split screen) or for navigating to different user interfaces in the full-screen display mode (e.g., an application switcher user interface that includes the split-screen user interface as a single selectable user interface, an application switcher user interface that includes the application user interfaces within the split-screen user interface as individually selectable user interfaces, or the home screen user interface). More details about navigating to different user interfaces (e.g., including different full-screen user interfaces and different user interface configurations within split-screen user interfaces (e.g., different combinations of user interfaces within a split-screen user interface)) are provided below with reference to, for example, Figs. 5B1-5B36 and the flowcharts in Figs. 7A-7I.
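One way to read the two criteria sets is as a single decision keyed first on the display mode and then on properties of the gesture at lift-off. The sketch below is only a schematic of that structure: the distance cutoffs, the target names, and the use of a single horizontal/vertical flag are assumptions, not the actual criteria, which are described with reference to Figs. 7A-7I and Fig. 8.

```swift
import CoreGraphics

enum DisplayMode { case fullScreen, splitScreen }

enum NavigationTarget {
    case previousApp, homeScreen
    case splitScreenAppSwitcher, fullScreenAppSwitcher
    case stayOnCurrentUI
}

struct GestureEnd {
    var distanceFromEdge: CGFloat   // how far the contact has traveled from the bottom edge
    var mostlyHorizontal: Bool      // true for an arc swipe that runs along the edge
}

func navigationTarget(for gesture: GestureEnd, mode: DisplayMode) -> NavigationTarget {
    switch mode {
    case .fullScreen:
        // First criteria set (cutoff values are invented for illustration).
        if gesture.mostlyHorizontal { return .previousApp }
        if gesture.distanceFromEdge > 400 { return .homeScreen }
        if gesture.distanceFromEdge > 150 { return .fullScreenAppSwitcher }
        return .stayOnCurrentUI
    case .splitScreen:
        // Second criteria set: short swipes act within the originating half of the
        // split screen, longer swipes leave split-screen mode (cf. Figs. 5B13-5B21).
        if gesture.mostlyHorizontal { return .previousApp }
        if gesture.distanceFromEdge > 400 { return .homeScreen }
        if gesture.distanceFromEdge > 250 { return .fullScreenAppSwitcher }
        if gesture.distanceFromEdge > 150 { return .splitScreenAppSwitcher }
        return .stayOnCurrentUI
    }
}
```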
Figs. 5B1-5B36 illustrate, in accordance with some embodiments, exemplary user interfaces for navigating from a user interface displayed in a split-screen display mode to a different user interface.
The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figs. 7A-7I. For ease of explanation, some of the embodiments will be discussed with reference to operations performed on a device with a touch-sensitive display system 112. In such embodiments, the focus selector is, optionally: a respective finger or stylus contact, a representative point corresponding to a finger or stylus contact (e.g., a centroid of a respective contact or a point associated with a respective contact), or a centroid of two or more contacts detected on the touch-sensitive display system 112. However, analogous operations are, optionally, performed on a device with a display 450 and a separate touch-sensitive surface 451, in response to detecting the contacts on the touch-sensitive surface 451 while displaying the user interfaces shown in the figures, together with a focus selector, on the display 450.
For ease of explanation, some of the embodiments will be discussed with reference to operations performed on a device that does not have a home button, in which a gesture that meets predefined criteria is used to dismiss a currently displayed user interface and display the home screen user interface. Although shown as optional in Figs. 5B1-5B36, in some embodiments the device includes a home button (e.g., a mechanical button, a solid-state button, or a virtual button), and the home button is used to dismiss the currently displayed user interface and display the home screen user interface (e.g., in response to a single press input) and/or to display a multitasking user interface (e.g., in response to a double press input).
The exemplary user interfaces shown in Figs. 5B1-5B36 relate, in accordance with some embodiments, to methods for efficiently navigating between multiple user interfaces in a split-screen display mode on an electronic device, for example, quickly switching between different applications and system user interfaces. The exemplary user interfaces for the user interface selection process include an application switcher user interface that includes representations of multiple user interfaces of applications associated with the electronic device (e.g., recently opened applications, a currently displayed application, and, optionally, a system control panel), displayed as a virtual stack of cards (e.g., a "card stack"), where each card in the stack represents the user interface of a different application. These cards are also referred to herein as "application views" (when corresponding to the user interfaces of recently opened applications) or as a "control panel view" (when corresponding to the user interface of a control panel). User inputs detected on touch screen 112 (e.g., a touch-sensitive surface), such as contacts, swipe/drag gestures, and flick gestures, are used to display the application taskbar overlaid on the currently displayed user interface and to navigate between the different user interfaces that can be selected for display on the screen. In some embodiments, the home screen user interface is optionally displayed as a "card" in the virtual stack of cards. In some embodiments, the home screen user interface is displayed in a display layer underlying the stack of cards.
While the device displays a user interface (e.g., a user interface for an application), a gesture that begins from the bottom of the screen (e.g., within a predefined region of the device near an edge of the display, such as an edge region that includes a predefined portion of the display (e.g., 20 pixels wide) adjacent to the bottom edge of the device) invokes the user interface selection process (e.g., displays a transitional navigation user interface) and directs navigation between multiple user interfaces based on the speed and direction of the input and, optionally, based on movement parameters and characteristics of the currently displayed user interface objects (e.g., cards). The device replaces display of the current user interface with a card that represents that user interface (e.g., in some embodiments, the user interface appears to shrink into the card in accordance with the movement of the input). In accordance with some embodiments, the user can choose, using different gestures, to (i) navigate to the full-screen home screen, (ii) navigate to the application that was displayed on the screen immediately before the user interface selection process was invoked (e.g., in either portion of a split-screen display), (iii) navigate to a split-screen application switcher user interface that allows the user to select from among applications previously displayed on the screen (e.g., for display within one portion of a display operating in a split-screen mode), (iv) navigate to a full-screen application switcher user interface that allows the user to select from among applications previously displayed on the screen (e.g., displayed in a full-screen display mode or in a split-screen display mode), or (v) navigate back to the user interface that was displayed when the user interface selection process was invoked (e.g., in the split-screen display mode). During the input, the device provides dynamic visual feedback indicating which navigation selection will be made when the input ends, facilitating effective user navigation between the multiple choices. In some embodiments, the visual feedback and the user interface responses are fluid and reversible. In some embodiments, the user can also choose to navigate to a control panel user interface using a gesture. In other embodiments, a different input (e.g., one initiated from a different edge of the display) is required to navigate to the control panel user interface. In some embodiments, the user can also choose to display, over the displayed user interface, a taskbar with multiple application icons.
Figs. 5B1-5B9 illustrate exemplary split-screen user interfaces in which the user interface displayed in one portion of the split-screen display can be changed by way of the application switcher user interface. Fig. 5B1 shows an interactive maps user interface displayed in the left half of a display operating in a split-screen display mode, and an email user interface simultaneously displayed in the right half of the display. A home affordance 4400 is displayed in each of the two portions of the display, overlaid on the corresponding user interface, indicating that an input can be initiated in either portion of the display to direct navigation (e.g., for navigating only within that portion of the display, or for navigating to a full-screen user interface). After the user interface selection process is activated by moving contact 4402 upward from the left side of the bottom edge of the display, in Fig. 5B2 the interactive maps user interface is replaced by (e.g., transitions into) card 4014, which represents the interactive maps user interface. Display of the email user interface is maintained in the right half of the display, however, because the transitional navigation state was initiated only in the left half of the display. When contact 4402 moves upward past a threshold position on the screen, a second card 4406, representing a web browser user interface, is also partially displayed in the left half of the display (e.g., sliding in from the left edge of the display), indicating that navigation will proceed to the split-screen application switcher user interface if the contact lifts off at that point in time. The criteria for navigating to the split-screen application switcher user interface in the left half of the display are optionally determined dynamically based on movement parameters of contact 4402 (e.g., position, speed, path, or a combination thereof) and its movement history. Upon lift-off of contact 4402 in Fig. 5B3, the device navigates to the application switcher user interface in the left half of the display, in Fig. 5B4. The device animates the transition by sliding cards representing previously displayed user interfaces in from the left side of the display, one under another, forming a stack of previously displayed user interfaces. Beginning in Fig. 5B5, a swipe gesture navigates through the stack of cards, revealing web browsing card 4406 in Figs. 5B6 and 5B7. Selecting web browsing card 4406 with a tap gesture in Fig. 5B8 causes the user interface of the web browsing application to be displayed on the left side of the display in Fig. 5B9. The email user interface remains displayed in the right half of the display in Fig. 5B9, because the navigation operation acted only on the user interface displayed in the left half of the display.
Figs. 5B1-5B12 illustrate exemplary split-screen user interfaces in which navigation occurs within one portion of the split-screen display (e.g., rather than in the other portion of the display or across the whole display), because the transitional navigation gesture starts from the bottom edge of that portion of the display (e.g., rather than from the bottom edge of the other portion of the display). Fig. 5B1 shows an interactive maps user interface displayed in the left half of a display operating in a split-screen display mode, and an email user interface simultaneously displayed in the right half of the display. When an upward swipe gesture starts from the bottom edge of the left half of the display, as shown in Fig. 5B1, the user interface selection process is activated in the left half of the screen, as shown by the transitional navigation user interface displayed in the left half of the display in Fig. 5B2. Conversely, when an upward swipe gesture starts from the bottom edge of the right half of the display, as shown in Fig. 5B10, the user interface selection process is activated in the right half of the screen, as shown by the transitional navigation user interface displayed in the right half of the display in Fig. 5B11. In both cases, the user interface displayed in the opposite portion of the display is maintained while navigation occurs in the portion of the display in which the gesture was initiated (e.g., the email user interface remains displayed in the right half of the display while navigation to the application switcher user interface, and then to the web browsing user interface, occurs in the left half of the display in Figs. 5B2-5B9); likewise, the web browsing user interface remains displayed in the left half of the display while navigation to the application switcher user interface occurs in the right half of the display in Figs. 5B10-5B12.
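The half-by-half independence shown in Figs. 5B1-5B12 can be captured by routing each edge gesture to the pane whose bottom edge it started from. A minimal sketch, with pane contents reduced to plain strings for illustration:

```swift
import CoreGraphics

enum SplitPane { case left, right }

func pane(forGestureStartingAt start: CGPoint, in screen: CGRect) -> SplitPane {
    return start.x < screen.midX ? .left : .right
}

struct SplitScreenState {
    var leftUI: String
    var rightUI: String

    /// Replaces only the user interface of the pane whose bottom edge the gesture started from.
    mutating func navigate(gestureStart: CGPoint, in screen: CGRect, to newUI: String) {
        switch pane(forGestureStartingAt: gestureStart, in: screen) {
        case .left:  leftUI = newUI    // the right half keeps its user interface
        case .right: rightUI = newUI   // the left half keeps its user interface
        }
    }
}
```

For example, a swipe starting at a point in the right half would update only rightUI, mirroring Figs. 5B10-5B12.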
In the examples shown in Figs. 5B1-5B12, an edge swipe gesture starting on either side of the split screen meets the criteria for navigating to the split-screen application switcher user interface on the corresponding side of the split screen, but does not meet the criteria for navigating to the full-screen application switcher user interface.
Figs. 5B13-5B17 illustrate an exemplary process in which the device navigates from a user interface displayed in the split-screen display mode to the full-screen application switcher user interface (e.g., rather than the split-screen application switcher user interface), because the input meets the criteria for navigating to the full-screen application switcher user interface (e.g., because the transitional navigation gesture travels farther from the edge of the display). Fig. 5B13 shows an interactive maps user interface displayed in the left half of a display operating in a split-screen display mode, and an email user interface simultaneously displayed in the right half of the display. When an upward swipe gesture starts from the bottom edge of the left half of the display, as shown in Fig. 5B13, the user interface selection process is activated in the left half of the screen, as shown by the transitional navigation user interface displayed in the left half of the display in Fig. 5B14. As the contact continues away from the bottom edge of the display, the email user interface displayed in the right half of the display is replaced by (e.g., transitions into) card 4015, which represents the email user interface, in Fig. 5B15, indicating to the user that the device will switch to a full-screen display mode when the contact lifts off (e.g., unless the user modifies the gesture to redirect the navigation back to the split-screen display mode). In addition, if lift-off of the contact were detected at the point shown in Fig. 5B15, the application switcher user interface displayed in the full-screen display mode would include card 4014 and card 4015 as user interfaces that are individually selectable within the application switcher user interface; and when the user selects one of the cards shown in the full-screen application switcher user interface, the device displays the user interface corresponding to the selected card in the full-screen display mode. In other words, if lift-off of contact 4424 were detected in the state shown in Fig. 5B15 (e.g., with the visual feedback indicating that the criteria for navigating to the full-screen application switcher user interface are met), the device would exit the split-screen mode as a result of the navigation gesture by contact 4424.
As shown in Fig. 5B16, as contact 4424 continues to move upward, the cards for the previously displayed interactive maps user interface and email user interface are animatedly merged into a single card 4017, which represents a split-screen display state in which the user interfaces of the interactive maps application and the mail application are displayed simultaneously. The presence on the display of the second card 4406, representing the web browsing user interface, indicates that the device will navigate to the full-screen application switcher user interface, in a different configuration, when the contact lifts off. The display of the full-screen transitional user interface (e.g., including a card associated with two applications) indicates that the application switcher user interface will be displayed in the full-screen display mode. This contrasts with the display of the split-screen transitional user interface (e.g., including only a card associated with a single application, as shown in Figs. 5B2 and 5B11), which indicates that the application switcher user interface will be displayed in the split-screen mode when the gesture terminates (e.g., as shown in Figs. 5B4 and 5B12). The device then displays the full-screen application switcher user interface after lift-off of the contact in Fig. 5B17. Selection of card 4017, which represents the split-screen state, causes the device to redisplay the split-screen user interface, including the interactive maps user interface and the email user interface.
Figs. 5B18-5B21 illustrate an exemplary process in which the device navigates from a user interface displayed in the split-screen display mode to the full-screen home screen (e.g., rather than the split-screen application switcher user interface or the full-screen application switcher user interface), because the input meets the criteria for navigating to the full-screen home screen user interface (e.g., because the transitional navigation gesture travels even farther from the edge of the display than shown in Fig. 5B16). Fig. 5B18 shows an interactive maps user interface displayed in the left half of a display operating in a split-screen display mode, and an email user interface simultaneously displayed in the right half of the display. When an upward swipe gesture starts from the bottom edge of the left half of the display, as shown in Fig. 5B18, and travels far enough away from the bottom edge of the display, the full-screen user interface selection process is activated, as shown by the full-screen transitional navigation user interface displayed on the display in Fig. 5B19, which includes card 4017 associated with both the interactive maps application and the mail application. The presence on the display of the second card 4406, representing the web browsing user interface, in Fig. 5B19 indicates that the device will navigate to the application switcher user interface when the contact lifts off. As the contact continues away from the bottom edge of the display, the web browsing card disappears in Fig. 5B20, and the home screen user interface begins to come into focus behind the transitional navigation user interface, indicating that the device will navigate to the home screen when the contact lifts off (e.g., unless the user modifies the gesture to direct the navigation to a different user interface). The device then displays the full-screen home screen after lift-off of the contact in Fig. 5B21.
Figs. 5B22-5B24 illustrate exemplary split-screen user interfaces in which the device navigates to a previously displayed user interface in one portion of the display (e.g., rather than to the application switcher user interface or the home screen), while maintaining display of the user interface in the other portion of the display, because the input meets the criteria for navigating to a previously displayed user interface (e.g., the input moves substantially horizontally along the bottom edge of the display (e.g., the input is an arc swipe that starts from the bottom edge of one portion of the display)). Fig. 5B22 shows a web browsing user interface displayed in the left half of a display operating in a split-screen display mode, and an email user interface simultaneously displayed in the right half of the display. When a substantially sideways swipe gesture starts from the bottom edge of the left half of the display, as shown in Fig. 5B22, the user interface selection process is activated in the left half of the screen, as shown by the transitional navigation user interface displayed in the left half of the display in Fig. 5B23. The arc swipe appears to drag the web browsing user interface (e.g., application view 4406 of the web browsing user interface) rightward out of the first portion of the display, while dragging the interactive maps user interface (e.g., application view 4014 of the interactive maps user interface in Fig. 5B23) onto the display from the left. The cards appear to move over the home screen, which is blurred in the background. Display of the email user interface in the right half of the display is unaffected by the gesture, because the gesture began in the left portion of the display and never invoked the full-screen display mode (e.g., as shown in Figs. 5B15 and 5B19). After lift-off of the contact, the interactive maps user interface is displayed in the left half of the split-screen display in Fig. 5B24.
Figs. 5B25-5B36 illustrate exemplary split-screen user interfaces in which the device navigates through the previously displayed user interfaces in the card stack within one portion of the display and then, in response to a series of arc swipe gestures, activates the full-screen display mode, because there are no other previously displayed user interfaces left in the card stack. Fig. 5B25 shows an interactive maps user interface displayed in the left half of a display operating in a split-screen display mode, and an email user interface simultaneously displayed in the right half of the display. When a substantially sideways swipe gesture starts from the bottom edge of the right half of the display, as shown in Fig. 5B25, the user interface selection process is activated in the right half of the screen (in contrast to the transitional navigation user interface shown in the left half of the display in Fig. 5B23, where the arc swipe was initiated from the bottom edge of the left half of the display). The arc swipe pushes the email user interface rightward off the display, while dragging the web browsing user interface (e.g., application view 4406 of the web browsing user interface) onto the right half of the display (e.g., alongside the interactive maps user interface that remains displayed), as shown in Fig. 5B27. The web browsing user interface is the first previously displayed user interface navigated to in the right half of the display, because it is the last user interface that was navigated away from on the display. In accordance with some embodiments, although the web browsing user interface was previously displayed in the left half of the display, it is still the first previously displayed user interface navigated to in the right half of the display, because the two portions of the display share a single stack of previously displayed user interfaces.
As shown in Figs. 5B28-5B29, a first subsequent arc swipe in the right half of the display causes navigation back to the email user interface in Fig. 5B30, e.g., because the stack of previously displayed cards was reset before the gesture started, as indicated by the redisplay of home affordance 4400-2 when the input began, as shown in Fig. 5B28. In contrast, as shown in Figs. 5B31-5B33, a second subsequent arc swipe in the right half of the display navigates to an older previously displayed user interface, that of the instant messaging application in Fig. 5B33 (e.g., rather than navigating back to the web browsing user interface that was displayed in the right portion of the display before the email user interface), because the stack of previously displayed cards was not reset before the gesture started, as indicated by the absence of the home affordance, as shown in Fig. 5B31. Finally, in Figs. 5B34-5B35, a third subsequent arc swipe in the right portion of the display, started before the card stack was reset, causes navigation to a full-screen display of the interactive maps user interface that was previously displayed in the left half of the display, as shown in Fig. 5B36, because there are no more previously displayed user interfaces in the card stack. In contrast to the split-screen display mode, which shows two home affordances 4400 (e.g., one over each application user interface shown in the right and left halves of the display in Fig. 5B25, indicating that navigation may occur separately in either portion of the display), only one home affordance is shown over the full-screen interactive maps user interface in Fig. 5B36.
Figs. 5C1-5C59 illustrate, in accordance with some embodiments, exemplary user interfaces for navigating between different user interfaces using multi-contact gestures, which, for example, take into account both the translation of the contacts as a group and the movement of the contacts relative to one another (e.g., "pinch" and "spread" movements), and which provide dynamic feedback during the gesture to indicate which user interface will be navigated to when the gesture is completed; this allows the user to alter characteristics of the gesture to avoid unintended navigation and/or to account for a change in the intended navigation during the gesture.
The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figs. 11A-11F. For ease of explanation, some of the embodiments will be discussed with reference to operations performed on a device with a touch-sensitive display system 112. In such embodiments, the focus selector is, optionally: a respective finger or stylus contact, a representative point corresponding to a finger or stylus contact (e.g., a centroid of a respective contact or a point associated with a respective contact), or a centroid of two or more contacts detected on the touch-sensitive display system 112. However, analogous operations are, optionally, performed on a device with a display 450 and a separate touch-sensitive surface 451, in response to detecting the contacts on the touch-sensitive surface 451 while displaying the user interfaces shown in the figures, together with a focus selector, on the display 450.
For ease of explanation, some of the embodiments will be discussed with reference to operations performed on a device that does not have a home button, in which a gesture that meets predefined criteria is used to dismiss a currently displayed user interface and display the home screen user interface. Although shown as optional in Figs. 5C1-5C59, in some embodiments the device includes a home button (e.g., a mechanical button, a solid-state button, or a virtual button), and the home button is used to dismiss the currently displayed user interface and display the home screen user interface (e.g., in response to a single press input) and/or to display a multitasking user interface (e.g., in response to a double press input).
The exemplary user interfaces shown in Figs. 5C1-5C59 relate, in accordance with some embodiments, to methods for efficiently navigating between multiple user interfaces, for example, quickly switching between different applications and system user interfaces. The exemplary user interfaces include a home screen user interface that includes multiple application launch icons, for example, as described with respect to Figs. 5A1-5A29, and a full-screen application switcher user interface that includes representations of multiple user interfaces of applications associated with the electronic device (e.g., recently opened applications, a currently displayed application, and, optionally, a system control panel), displayed as cards laid out on a virtual plane (e.g., in contrast to the cards shown in the virtual stack described with reference to Figs. 5B1-5B36), where each card represents the user interface of a different application. These cards are also referred to herein as "application views" (when corresponding to the user interfaces of recently opened applications) or as a "control panel view" (when corresponding to the user interface of a control panel). In some embodiments, an application view shows a snapshot of the recent state, or a live view, of the application corresponding to the application view, unlike the application launch icons shown in the home screen user interface, which show a preset design that is independent of the recent or live state of the application.
While the device displays a user interface (e.g., a user interface for an application, or a system user interface such as the application switcher user interface), a gesture that includes at least three contacts (e.g., 3, 4, 5, or more contacts), starts anywhere on the screen, and includes at least a threshold amount of movement within a predefined period of time invokes the user interface selection process (e.g., displays the transitional navigation user interface) and directs navigation between multiple user interfaces based on the speed and direction of the input and, optionally, based on movement parameters and characteristics of the currently displayed user interface objects (e.g., cards). The device replaces display of the current user interface with a card that represents that user interface (e.g., in some embodiments, the user interface appears to shrink into the card in accordance with the movement of the input). In accordance with some embodiments, the user can choose, using translation and pinch/spread gestures, to (i) navigate to the full-screen home screen, (ii) navigate to the application that was displayed on the screen immediately before the user interface selection process was invoked (e.g., in either portion of a split-screen display), (iii) navigate to a split-screen application switcher user interface that allows the user to select from among applications previously displayed on the screen (e.g., for display within one portion of a display operating in a split-screen mode), (iv) navigate to a full-screen application switcher user interface that allows the user to select from among applications previously displayed on the screen (e.g., displayed in a full-screen display mode or in a split-screen display mode), or (v) navigate back to the user interface that was displayed when the user interface selection process was invoked (e.g., in the split-screen display mode). During the input, the device provides dynamic visual feedback indicating which navigation selection will be made when the input ends, facilitating effective user navigation between the multiple choices. In some embodiments, the visual feedback and the user interface responses are fluid and reversible. In some embodiments, the user can also choose to navigate to a control panel user interface using a gesture. In other embodiments, a different input (e.g., one initiated from a different edge of the display) is required to navigate to the control panel user interface. In some embodiments, the user can also choose to display, over the displayed user interface, a taskbar with multiple application icons.
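The gating rule in this paragraph — at least three contacts plus a threshold amount of group movement within a predefined window — might be sketched as follows. The contact-count minimum follows the text; the movement threshold, the time window, and the use of the centroid to measure group translation are assumptions.

```swift
import CoreGraphics
import Foundation

// Sketch only: a gesture is routed to system-wide navigation if it has at least
// three contacts and those contacts, as a group, move at least a threshold amount
// within a predefined time window; otherwise the gesture stays with the application.
struct ContactSample {
    var start: CGPoint
    var current: CGPoint
}

enum GestureRoute { case systemNavigation, application }

func route(contacts: [ContactSample], elapsed: TimeInterval,
           minContacts: Int = 3,
           movementThreshold: CGFloat = 30,
           timeWindow: TimeInterval = 0.3) -> GestureRoute {
    guard contacts.count >= minContacts, elapsed <= timeWindow else {
        return .application
    }
    // Translation of the group, measured as displacement of the centroid of the contacts.
    let n = CGFloat(contacts.count)
    let startX = contacts.map { $0.start.x }.reduce(0, +) / n
    let startY = contacts.map { $0.start.y }.reduce(0, +) / n
    let currentX = contacts.map { $0.current.x }.reduce(0, +) / n
    let currentY = contacts.map { $0.current.y }.reduce(0, +) / n
    let dx = currentX - startX
    let dy = currentY - startY
    let translation = (dx * dx + dy * dy).squareRoot()
    return translation >= movementThreshold ? .systemNavigation : .application
}
```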
Figs. 5C1-5C3, Figs. 5C4-5C6, and Figs. 5C7-5C9 illustrate exemplary embodiments in which a gesture that includes two contacts (e.g., a two-finger touch) performs an operation specific to the application, for example, rather than a system-wide user interface selection (e.g., UI navigation) operation. Figs. 5C1-5C3 and Figs. 5C4-5C6 show two-contact swipe gestures that cause the interactive map to be translated, and Figs. 5C7-5C9 show a pinch gesture that causes the interactive map to be resized.
Fig. 5C1 shows an interactive maps user interface displayed in a full-screen display mode. A two-contact swipe gesture, including movements 4504 and 4508 of contacts 4502 and 4506 rightward from positions 4502-a and 4506-a, as shown in Fig. 5C1, to positions 4502-b and 4506-b, as shown in Fig. 5C2, causes the interactive map to be translated horizontally to the right (e.g., showing eastern Oregon), because the gesture meets application-specific translation criteria (e.g., including translational movement of the contacts in the gesture, with fewer than three total contacts), rather than the criteria for invoking the user interface selection process (e.g., including translational movement of the contacts in the gesture, with at least three contacts). Upon lift-off of the contacts, the interactive maps application user interface remains displayed, as shown in Fig. 5C3, because the gesture met the application-specific criteria rather than the system-wide user interface navigation criteria.
Fig. 5C4 shows the interactive map user interface displayed in full-screen display mode. A two-contact swipe gesture includes contact 4662 and contact 4666 moving upward (movements 4664 and 4668) from positions 4662-a and 4666-a, as shown in Fig. 5C4, to positions 4662-b and 4666-b, as shown in Fig. 5C5. The gesture causes the interactive map to translate vertically upward (e.g., hiding southern Montana), because the gesture meets application-specific translation criteria (e.g., including translational movement of the contacts in a gesture that includes fewer than three total contacts), rather than the criteria for invoking the user-interface selection process (e.g., including translational movement of contacts in a gesture that includes at least three contacts). When the contacts lift off, the interactive map application user interface remains displayed, as shown in Fig. 5C6, because the gesture met the application-specific criteria rather than the system-wide user-interface navigation criteria.
Fig. 5C7 shows the interactive map user interface displayed in full-screen display mode. A two-contact pinch gesture includes contact 4594 and contact 4598 moving toward each other (movements 4596 and 4600) from positions 4594-a and 4598-a, as shown in Fig. 5C7, to positions 4594-b and 4598-b, as shown in Fig. 5C8. The gesture causes the interactive map to zoom out (e.g., displaying both eastern Oregon and western Illinois), because the gesture meets application-specific resizing criteria (e.g., including pinch movement of the contacts in a gesture that includes fewer than three total contacts), rather than the criteria for invoking the user-interface selection process (e.g., including pinch movement of contacts in a gesture that includes at least three contacts). When the contacts lift off, the interactive map application user interface remains displayed, as shown in Fig. 5C9, because the gesture met the application-specific criteria rather than the system-wide user-interface navigation criteria.
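The following sketch illustrates, under stated assumptions, how an application might interpret the two-contact gestures of Figs. 5C1-5C9 once the system has declined to claim them for navigation. The names `MapViewport`, `pan(by:)`, and `zoom(by:)`, and all thresholds, are hypothetical and introduced only for this sketch.

```swift
import CoreGraphics

// A toy model of the interactive map's visible region.
struct MapViewport {
    var center: CGPoint
    var scale: CGFloat

    mutating func pan(by delta: CGVector) {
        // Dragging the content moves the visible region the opposite way.
        center.x -= delta.dx / scale
        center.y -= delta.dy / scale
    }

    mutating func zoom(by factor: CGFloat) {
        scale = max(0.1, min(scale * factor, 20.0))
    }
}

private func distance(_ a: CGPoint, _ b: CGPoint) -> CGFloat {
    let dx = b.x - a.x
    let dy = b.y - a.y
    return (dx * dx + dy * dy).squareRoot()
}

func applyTwoContactGesture(start: (CGPoint, CGPoint),
                            end: (CGPoint, CGPoint),
                            to viewport: inout MapViewport) {
    let spanRatio = distance(end.0, end.1) / max(distance(start.0, start.1), 1)

    if abs(spanRatio - 1) > 0.1 {
        // Contacts moving toward or away from each other: resize (Figs. 5C7-5C9).
        viewport.zoom(by: spanRatio)
    } else {
        // Contacts moving together in the same direction: pan (Figs. 5C1-5C6).
        let delta = CGVector(dx: end.0.x - start.0.x, dy: end.0.y - start.0.y)
        viewport.pan(by: delta)
    }
}
```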
Figs. 5C10-5C12, Figs. 5C13-5C16, Figs. 5C17-5C19, and Figs. 5C20-5C22 illustrate exemplary embodiments in which a swipe gesture that includes at least three contacts (e.g., a three-, four-, or five-finger touch) performs a system-wide user-interface selection (e.g., UI navigation) operation, rather than an application-specific operation. The user interface navigated to in response to the gesture in each series of figures depends on the properties of the gesture. During the gesture, the device provides dynamic visual feedback indicating which user interface will be navigated to when the gesture terminates (e.g., when all contacts lift off).
Figs. 5C10-5C12 illustrate a horizontal swipe gesture that includes four contacts and causes navigation to the previously displayed application user interface. Fig. 5C10 shows the interactive map user interface displayed in full-screen display mode. The four-contact swipe gesture includes contacts 4510, 4514, 4518, and 4522 moving rightward (movements 4512, 4516, 4520, and 4524) from positions 4510-a, 4514-a, 4518-a, and 4522-a, as shown in Fig. 5C10, to positions 4510-b, 4514-b, 4518-b, and 4522-b, as shown in Fig. 5C11. The gesture invokes the user-interface selection process, because it meets system-wide user-interface navigation criteria (e.g., including translational movement of contacts in a gesture that includes at least three contacts, where at least a threshold amount of movement occurs within a threshold amount of time (e.g., TT1) after the device first detects the contacts), rather than application-specific translation criteria (e.g., including translational movement of the contacts in a gesture that includes fewer than three total contacts (e.g., as shown in Figs. 5C1-5C3), or where the threshold amount of movement does not occur within the threshold amount of time (e.g., TT1) after the device first detects the contacts). The device replaces the display of the interactive map user interface with a representation (e.g., card) 4526 of the interactive map user interface and begins to slide the card off the right side of the screen (e.g., in accordance with the rightward movement of the contacts), while dragging a representation (e.g., card) 4528 of the previously displayed email user interface onto the screen from the left side, as shown in Fig. 5C11. During the gesture, cards 4526 and 4528 remain large, indicating that when the gesture terminates the device will navigate to the next/previously displayed application (e.g., because when the properties of the input/application views meet the "side swipe to next/previous app" criteria (100x4) and/or the "vertical swipe to next/previous app" criteria (100x5), the device designates the next/previously displayed application as the current target state, as shown in Figs. 10A-10B). The email user interface is displayed after the contacts lift off, as shown in Fig. 5C12.
Figs. 5C13-5C16 illustrate a vertical swipe gesture that includes four contacts and causes navigation to the home screen user interface. Fig. 5C13 shows the email user interface displayed in full-screen display mode. The four-contact swipe gesture includes contacts 4530, 4534, 4538, and 4542 moving upward (movements 4532, 4536, 4540, and 4544) from positions 4530-a, 4534-a, 4538-a, and 4542-a, as shown in Fig. 5C13, to positions 4530-b, 4534-b, 4538-b, and 4542-b, as shown in Fig. 5C14. The gesture invokes the user-interface selection process, because it meets system-wide user-interface navigation criteria (e.g., including translational movement of contacts in a gesture that includes at least three contacts, where at least a threshold amount of movement occurs within a threshold amount of time (e.g., TT1) after the device first detects the contacts), rather than application-specific translation criteria (e.g., including translational movement of the contacts in a gesture that includes fewer than three total contacts (e.g., as shown in Figs. 5C4-5C6), or where the threshold amount of movement does not occur within the threshold amount of time (e.g., TT1) after the device first detects the contacts). The device replaces the display of the email user interface with a representation (e.g., card) 4528 of the email user interface and begins to shrink card 4528 and translate it upward (e.g., in accordance with the upward movement of the contacts). A representation (e.g., card) 4526 of the previously displayed interactive map user interface is also displayed with a size and vertical translation similar to those of email card 4528, indicating that if the gesture ended at this point the device would navigate to the application-switcher user interface. As the contacts continue to move upward to positions 4530-c, 4534-c, 4538-c, and 4542-c, as shown in Fig. 5C15, email card 4528 continues to shrink and move upward, interactive map card 4526 disappears, and the home screen user interface begins to come into focus behind email card 4528, indicating that when the gesture terminates the device will navigate to the home screen user interface (e.g., because when the properties of the input/application views meet the "quick resize/translate to home" criteria (100x2) and/or the "large resize/translate to home" criteria (100x3), the device designates the home screen as the current target state, as shown in Figs. 10A-10B). The home screen user interface is displayed after the contacts lift off, as shown in Fig. 5C16.
Figs. 5C17-5C19 illustrate a vertical swipe gesture that includes four contacts and causes navigation to the application-switcher user interface. Fig. 5C17 shows the email user interface displayed in full-screen display mode. The four-contact swipe gesture includes contacts 4546, 4550, 4554, and 4558 moving upward (movements 4548, 4552, 4556, and 4560) from positions 4546-a, 4550-a, 4554-a, and 4558-a, as shown in Fig. 5C17, to positions 4546-b, 4550-b, 4554-b, and 4558-b, as shown in Fig. 5C18. The gesture invokes the user-interface selection process, because it meets system-wide user-interface navigation criteria (e.g., including translational movement of contacts in a gesture that includes at least three contacts, where at least a threshold amount of movement occurs within a threshold amount of time (e.g., TT1) after the device first detects the contacts), rather than application-specific translation criteria (e.g., including translational movement of the contacts in a gesture that includes fewer than three total contacts, or where the threshold amount of movement does not occur within the threshold amount of time (e.g., TT1) after the device first detects the contacts). The device replaces the display of the email user interface with a representation (e.g., card) 4528 of the email user interface and begins to shrink card 4528 and translate it upward (e.g., in accordance with the upward movement of the contacts). A representation (e.g., card) 4526 of the previously displayed interactive map user interface is also displayed with a size and vertical translation similar to those of email card 4528, indicating that when the gesture terminates the device will navigate to the application-switcher user interface (e.g., because when the properties of the input/application views meet the "pause for app switcher" criteria (100x6) and/or the "short, slow movement to app switcher" criteria (100x8), the device designates the application switcher as the current target state, as shown in Figs. 10A-10B). The application-switcher user interface is displayed after the contacts lift off, as shown in Fig. 5C19. The device navigates to the application-switcher user interface in Fig. 5C19, rather than the home screen user interface (e.g., as navigated to in Figs. 5C13-5C16), because the gesture meets application-switcher navigation criteria but not home-screen navigation criteria (e.g., the upward movement of the contacts meets a first vertical translation and/or first vertical velocity threshold corresponding to navigating to the application-switcher user interface, but does not meet a second vertical translation and/or second vertical velocity threshold corresponding to navigating to the home screen user interface).
Figs. 5C20-5C22 illustrate a horizontal swipe gesture that includes four contacts and causes navigation back to the same application user interface. Fig. 5C20 shows the interactive map user interface displayed in full-screen display mode. The four-contact swipe gesture includes contacts 4562, 4566, 4570, and 4574 moving rightward (movements 4564, 4568, 4572, and 4576) from positions 4562-a, 4566-a, 4570-a, and 4574-a, as shown in Fig. 5C20, to positions 4562-b, 4566-b, 4570-b, and 4574-b, as shown in Fig. 5C21. The gesture invokes the user-interface selection process, because it meets system-wide user-interface navigation criteria (e.g., including translational movement of contacts in a gesture that includes at least three contacts, where at least a threshold amount of movement occurs within a threshold amount of time (e.g., TT1) after the device first detects the contacts), rather than application-specific translation criteria (e.g., including translational movement of the contacts in a gesture that includes fewer than three total contacts (e.g., as shown in Figs. 5C1-5C3), or where the threshold amount of movement does not occur within the threshold amount of time (e.g., TT1) after the device first detects the contacts). The device replaces the display of the interactive map user interface with a representation (e.g., card) 4526 of the interactive map user interface and begins to slide the card off the right side of the screen (e.g., in accordance with the rightward movement of the contacts), while dragging a representation (e.g., card) 4528 of the previously displayed email user interface onto the screen from the left side, as shown in Fig. 5C21. During the gesture, cards 4526 and 4528 remain large; however, the cards slide so far to the right that when the gesture terminates the device will navigate back to the interactive map user interface (e.g., because when the properties of the input/application views meet the "resize/translate to cancel" criteria (100x7), the device designates the current application as the current target state, as shown in Figs. 10A-10B). The interactive map user interface is displayed after the contacts lift off, as shown in Fig. 5C22.
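The target-state selection sketched in Figs. 5C10-5C22 and Figs. 10A-10B can be illustrated, in highly simplified form, by the following sketch. The enum cases mirror the destinations described above, but the structure of the snapshot and every numeric threshold are assumptions introduced here; the real criteria (100x2 through 100x8) are more nuanced.

```swift
import CoreGraphics

// Possible destinations of the user-interface selection process.
enum NavigationTarget {
    case previousApplication
    case homeScreen
    case applicationSwitcher
    case currentApplication   // cancel: return to the interface that was shown
}

struct GestureSnapshot {
    let translation: CGVector   // net movement of the centroid of the contacts
    let velocity: CGVector      // points per second at the time of the snapshot
}

func currentTargetState(for s: GestureSnapshot) -> NavigationTarget {
    let horizontal = abs(s.translation.dx)
    let vertical = -s.translation.dy            // upward movement is positive here

    if vertical > 250 || -s.velocity.dy > 800 {
        return .homeScreen                      // large or fast upward movement
    }
    if vertical > 60 {
        return .applicationSwitcher             // moderate upward movement, then pause
    }
    if horizontal > 80 {
        return .previousApplication             // mostly sideways swipe
    }
    return .currentApplication                  // too little movement: cancel
}
```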
Figs. 5C23-5C26 illustrate an exemplary embodiment in which a swipe gesture that includes at least four contacts (e.g., a four- or five-finger touch) performs an application-specific operation, rather than a system-wide user-interface selection (e.g., UI navigation) operation, when the threshold amount of movement does not occur within the threshold amount of time. Fig. 5C23 shows the interactive map user interface displayed in full-screen display mode. A four-contact input including contacts 4578, 4582, 4586, and 4590 is detected, as shown in Fig. 5C24. However, the contacts do not move until after a threshold amount of time (e.g., TT1) has passed since the contacts were first detected, as shown in Fig. 5C24. Contacts 4578, 4582, 4586, and 4590 then move rightward (movements 4580, 4584, 4588, and 4592) from positions 4578-a, 4582-a, 4586-a, and 4590-a, as shown in Fig. 5C24, to positions 4578-b, 4582-b, 4586-b, and 4590-b, as shown in Fig. 5C25. This causes the interactive map to translate horizontally to the right (e.g., displaying eastern Oregon), rather than invoking the user-interface selection process (e.g., as in Figs. 5C10-5C12), because the gesture meets application-specific translation criteria (e.g., including less than the threshold amount of movement within the threshold amount of time (e.g., TT1) after the device first detects the contacts), rather than the criteria for invoking the user-interface selection process (e.g., including more than the threshold amount of movement within the threshold amount of time (e.g., TT1) after the device first detects the contacts (e.g., as shown in Figs. 5C10-5C12)). When the contacts lift off, the interactive map application user interface remains displayed, as shown in Fig. 5C26, because the gesture met the application-specific criteria rather than the system-wide user-interface navigation criteria.
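The timing criterion illustrated in Figs. 5C23-5C26 can be sketched as follows: if the contacts stay essentially still for longer than TT1 after touch-down, the gesture is handed to the application even if it later includes enough contacts and movement. The concrete value of TT1 and the type names are assumptions introduced for this sketch.

```swift
import Foundation

struct GestureTiming {
    let firstContactTime: TimeInterval
    let firstSignificantMoveTime: TimeInterval?   // nil if no threshold movement has occurred yet
}

func satisfiesSystemTiming(_ timing: GestureTiming,
                           tt1: TimeInterval = 0.15) -> Bool {
    guard let moveTime = timing.firstSignificantMoveTime else { return false }
    // Threshold movement must occur within TT1 of the first detected contact.
    return moveTime - timing.firstContactTime <= tt1
}
```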
Figs. 5C27-5C29, Figs. 5C30-5C32, Figs. 5C33-5C36, and Figs. 5C37-5C42 illustrate exemplary embodiments in which a pinch gesture that includes at least three contacts (e.g., a three-, four-, or five-finger touch) performs a system-wide user-interface selection (e.g., UI navigation) operation, rather than an application-specific operation. The user interface navigated to in response to the gesture in each series of figures depends on the properties of the gesture, which in some embodiments include translational movement instead of and/or in addition to the pinch/spread movement. During the gesture, the device provides dynamic visual feedback indicating which user interface will be navigated to when the gesture terminates (e.g., when all contacts lift off).
Figs. 5C27-5C29 illustrate a pinch gesture that includes five contacts and causes navigation to the home screen user interface. Fig. 5C27 shows the interactive map user interface displayed in full-screen display mode. The five-contact pinch gesture includes contacts 4602, 4606, 4610, 4614, and 4618 moving toward each other (movements 4604, 4608, 4612, 4616, and 4620) from positions 4602-a, 4606-a, 4610-a, 4614-a, and 4618-a, as shown in Fig. 5C27, to positions 4602-b, 4606-b, 4610-b, 4614-b, and 4618-b, as shown in Fig. 5C28. The gesture invokes the user-interface selection process, because it meets system-wide user-interface navigation criteria (e.g., including pinch movement of contacts in a gesture that includes at least three contacts, where at least a threshold amount of movement occurs within a threshold amount of time (e.g., TT1) after the device first detects the contacts), rather than application-specific resizing criteria (e.g., including pinch/spread movement of the contacts in a gesture that includes fewer than three total contacts (e.g., as shown in Figs. 5C7-5C9), or where the threshold amount of movement does not occur within the threshold amount of time (e.g., TT1) after the device first detects the contacts). The device replaces the display of the interactive map user interface with a representation (e.g., card) 4526 of the interactive map user interface and begins to shrink card 4526 and translate it toward the position between the contacts (e.g., in accordance with the pinching movement of the contacts). The smaller size of interactive map card 4526 and the appearance of the home screen user interface behind interactive map card 4526 indicate that when the gesture terminates the device will navigate to the home screen user interface (e.g., because when the properties of the input/application views meet the "quick resize/translate to home" criteria (100x2) and/or the "large resize/translate to home" criteria (100x3), the device designates the home screen as the current target state, as shown in Figs. 10A-10B). The home screen user interface is displayed after the contacts lift off, as shown in Fig. 5C29.
Figs. 5C30-5C32 illustrate a pinch gesture that includes five contacts and causes navigation to the application-switcher user interface. Fig. 5C30 shows the interactive map user interface displayed in full-screen display mode. The five-contact pinch gesture includes contacts 4642, 4646, 4650, 4654, and 4658 moving toward each other (movements 4644, 4648, 4652, 4656, and 4660) from positions 4642-a, 4646-a, 4650-a, 4654-a, and 4658-a, as shown in Fig. 5C30, to positions 4642-b, 4646-b, 4650-b, 4654-b, and 4658-b, as shown in Fig. 5C31. The gesture invokes the user-interface selection process, because it meets system-wide user-interface navigation criteria (e.g., including pinch movement of contacts in a gesture that includes at least three contacts, where at least a threshold amount of movement occurs within a threshold amount of time (e.g., TT1) after the device first detects the contacts), rather than application-specific resizing criteria (e.g., including pinch/spread movement of the contacts in a gesture that includes fewer than three total contacts (e.g., as shown in Figs. 5C7-5C9), or where the threshold amount of movement does not occur within the threshold amount of time (e.g., TT1) after the device first detects the contacts). The device replaces the display of the interactive map user interface with a representation (e.g., card) 4526 of the interactive map user interface and begins to shrink interactive map card 4526 and translate it toward the position between the contacts (e.g., in accordance with the pinching movement of the contacts). A representation (e.g., card) 4528 of the previously displayed email user interface is also displayed with a size and vertical translation similar to those of interactive map card 4526, indicating that when the gesture terminates the device will navigate to the application-switcher user interface (e.g., because when the properties of the input/application views meet the "pause for app switcher" criteria (100x6) and/or the "short, slow movement to app switcher" criteria (100x8), the device designates the application switcher as the current target state, as shown in Figs. 10A-10B). The application-switcher user interface is displayed after the contacts lift off, as shown in Fig. 5C32. The device navigates to the application-switcher user interface in Fig. 5C32, rather than the home screen user interface (e.g., as navigated to in Figs. 5C27-5C29), because the gesture meets application-switcher navigation criteria but not home-screen navigation criteria (e.g., the pinch movement of the contacts meets a first pinch translation and/or first velocity threshold corresponding to navigating to the application-switcher user interface, but does not meet a second pinch translation and/or second velocity threshold corresponding to navigating to the home screen user interface).
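A pinch-based variant of the earlier target-selection sketch could look like the following: a large or fast pinch commits to the home screen, while a smaller pinch followed by a pause leaves the device at the application switcher. This reuses the hypothetical `NavigationTarget` enum from the earlier sketch, and the thresholds are again assumptions, not values from the embodiments.

```swift
import CoreGraphics

// pinchScale: 1.0 = unchanged span between contacts, < 1.0 = pinched in.
// pinchVelocity: scale units per second; negative values mean the contacts are closing.
func pinchTargetState(pinchScale: CGFloat, pinchVelocity: CGFloat) -> NavigationTarget {
    if pinchScale < 0.4 || pinchVelocity < -2.0 {
        return .homeScreen              // large or fast pinch in (Figs. 5C27-5C29)
    }
    if pinchScale < 0.85 {
        return .applicationSwitcher     // moderate pinch, then pause (Figs. 5C30-5C32)
    }
    return .currentApplication          // barely pinched: cancel
}
```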
Figs. 5C33-5C36, Figs. 5C37-5C42, and Figs. 5C43-5C47 illustrate exemplary embodiments in which user-interface navigation is controlled by a combination of translational and pinch movements in a gesture that includes at least three contacts (e.g., a three-, four-, or five-finger touch). The user interface navigated to in response to the gesture in each series of figures depends on the properties of the gesture just before it terminates (e.g., the last measured set of properties of the gesture). During the gesture, the device provides dynamic visual feedback indicating which user interface will be navigated to when the gesture terminates (e.g., when all contacts lift off).
Figs. 5C33-5C36 illustrate an exemplary embodiment in which the pinch movement of a five-contact gesture invokes the user-interface selection process, and translational movement of the gesture before termination causes navigation to the previously displayed application user interface. Figs. 5C33-5C36 also illustrate an exemplary embodiment in which, after the user-interface selection process has been invoked, user-interface navigation continues after lift-off of some, but not all, of the contacts. A five-contact pinch movement, including contacts 4622, 4626, 4630, 4634, and 4638 moving toward each other (movements 4624, 4628, 4632, 4636, and 4640) from positions 4622-a, 4626-a, 4630-a, 4634-a, and 4638-a, as shown in Fig. 5C33, to positions 4622-b, 4626-b, 4630-b, 4634-b, and 4638-b, as shown in Fig. 5C34, invokes the user-interface selection process. The device replaces the display of the interactive map user interface with a representation (e.g., card) 4526 of the interactive map user interface and begins to shrink interactive map card 4526 and translate it toward the position between the contacts (e.g., in accordance with the pinching movement of the contacts). A representation (e.g., card) 4528 of the previously displayed email user interface is also displayed with a size and vertical translation similar to those of interactive map card 4526, indicating that if the gesture ended at this point the device would navigate to the application-switcher user interface (e.g., because when the properties of the input/application views meet the "pause for app switcher" criteria (100x6) and/or the "short, slow movement to app switcher" criteria (100x8), the device designates the application switcher as the current target state, as shown in Figs. 10A-10B), for example, as shown in Figs. 5C30-5C32. After contacts 4622 and 4626 lift off, the user-interface selection process continues, as shown in Fig. 5C35. The remaining contacts 4630, 4634, and 4638 translate horizontally from positions 4630-b, 4634-b, and 4638-b, as shown in Fig. 5C34, to positions 4630-c, 4634-c, and 4638-c, as shown in Fig. 5C35, pushing interactive map card 4526 off the display to the right while dragging email card 4528 further onto the display from the left, indicating that when the gesture terminates the device will navigate to the next/previously displayed application (e.g., because when the properties of the input/application views meet the "side swipe to next/previous app" criteria (100x4) and/or the "vertical swipe to next/previous app" criteria (100x5), the device designates the next/previously displayed application as the current target state, as shown in Figs. 10A-10B). The email user interface is displayed after the contacts lift off, as shown in Fig. 5C36.
Figs. 5C37-5C42 illustrate an exemplary embodiment in which a navigation gesture that includes a pinch movement is reversed by a spread movement. A five-contact pinch movement, including contacts 4670, 4674, 4678, 4682, and 4686 moving toward each other (movements 4672, 4676, 4680, 4684, and 4688) from positions 4670-a, 4674-a, 4678-a, 4682-a, and 4686-a, as shown in Fig. 5C37, to positions 4670-b, 4674-b, 4678-b, 4682-b, and 4686-b, as shown in Fig. 5C38, invokes the user-interface selection process. The device replaces the display of the interactive map user interface with a representation (e.g., card) 4526 of the interactive map user interface and begins to shrink interactive map card 4526 and translate it toward the position between the contacts (e.g., in accordance with the pinching movement of the contacts). A representation (e.g., card) 4528 of the previously displayed email user interface is also displayed with a size and vertical translation similar to those of interactive map card 4526, indicating that if the gesture ended at this point the device would navigate to the application-switcher user interface (e.g., because when the properties of the input/application views meet the "pause for app switcher" criteria (100x6) and/or the "short, slow movement to app switcher" criteria (100x8), the device designates the application switcher as the current target state, as shown in Figs. 10A-10B). As the contacts continue to pinch together to positions 4670-c, 4674-c, 4678-c, 4682-c, and 4686-c, as shown in Fig. 5C39, interactive map card 4526 continues to shrink and move toward the virtual palm of the gesture, email card 4528 disappears, and the home screen user interface begins to come into focus behind interactive map card 4526, indicating that if the gesture ended at this point the device would navigate to the home screen user interface (e.g., because when the properties of the input/application views meet the "quick resize/translate to home" criteria (100x2) and/or the "large resize/translate to home" criteria (100x3), the device designates the home screen as the current target state, as shown in Figs. 10A-10B). The pinch movement of the contacts is then reversed (e.g., becomes a spread movement) to positions 4670-d, 4674-d, 4678-d, 4682-d, and 4686-d, as shown in Fig. 5C40, enlarging interactive map card 4526 and causing email card 4528 to reappear, indicating that if the gesture ended at this point the device would navigate to the application-switcher user interface (e.g., because when the properties of the input/application views meet the "pause for app switcher" criteria (100x6) and/or the "short, slow movement to app switcher" criteria (100x8), the device designates the application switcher as the current target state, as shown in Figs. 10A-10B). After the spread movement, the contacts translate horizontally to the right to positions 4670-e, 4674-e, 4678-e, 4682-e, and 4686-e, as shown in Fig. 5C41, pushing interactive map card 4526 off the display to the right while dragging email card 4528 further onto the display from the left, indicating that when the gesture terminates the device will navigate to the next/previously displayed application (e.g., because when the properties of the input/application views meet the "side swipe to next/previous app" criteria (100x4) and/or the "vertical swipe to next/previous app" criteria (100x5), the device designates the next/previously displayed application as the current target state, as shown in Figs. 10A-10B). The email user interface is displayed after the contacts lift off, as shown in Fig. 5C42.
Figs. 5C43-5C47 illustrate an exemplary embodiment in which an upward swipe and a pinch movement both contribute to a gesture that causes navigation to the home screen user interface. Fig. 5C43 shows the interactive map user interface displayed in full-screen display mode. A four-contact swipe gesture, including contacts 4690, 4694, 4698, and 4702 moving rightward (movements 4692, 4696, 4700, and 4704) from positions 4690-a, 4694-a, 4698-a, and 4702-a, as shown in Fig. 5C43, to positions 4690-b, 4694-b, 4698-b, and 4702-b, as shown in Fig. 5C44, invokes the user-interface selection process. The device replaces the display of the interactive map user interface with a representation (e.g., card) 4526 of the interactive map user interface and begins to slide the card off the right side of the screen (e.g., in accordance with the rightward movement of the contacts), while dragging a representation (e.g., card) 4528 of the previously displayed email user interface onto the screen from the left side, as shown in Fig. 5C44. In Fig. 5C44, cards 4526 and 4528 remain large, indicating that if the gesture ended at this point the device would navigate to the next/previously displayed application (e.g., because when the properties of the input/application views meet the "side swipe to next/previous app" criteria (100x4) and/or the "vertical swipe to next/previous app" criteria (100x5), the device designates the next/previously displayed application as the current target state, as shown in Figs. 10A-10B). The contacts then move upward to positions 4690-c, 4694-c, 4698-c, and 4702-c, as shown in Fig. 5C45, causing the cards to shrink and move upward (e.g., in accordance with the upward movement of the contacts), indicating that if the gesture ended at this point the device would navigate to the application-switcher user interface (e.g., because when the properties of the input/application views meet the "pause for app switcher" criteria (100x6) and/or the "short, slow movement to app switcher" criteria (100x8), the device designates the application switcher as the current target state, as shown in Figs. 10A-10B). As shown in Fig. 5C46, as the contacts begin to pinch together to positions 4690-d, 4694-d, 4698-d, and 4702-d, interactive map card 4526 continues to shrink and begins to move down toward the virtual palm of the gesture, email card 4528 disappears, and the home screen user interface begins to come into focus behind interactive map card 4526, indicating that when the gesture terminates the device will navigate to the home screen user interface (e.g., because when the properties of the input/application views meet the "quick resize/translate to home" criteria (100x2) and/or the "large resize/translate to home" criteria (100x3), the device designates the home screen as the current target state, as shown in Figs. 10A-10B). The home screen user interface is displayed after the contacts lift off, as shown in Fig. 5C47. Although the cards move down in response to the pinch movement in Fig. 5C46 (e.g., opposite to the upward movement in response to the upward swipe in Figs. 5C13-5C16), the predicted navigation state is the home screen user interface, because both the upward movement and the pinching of the contacts are associated with that navigation (e.g., both the upward swipe and the pinch contribute to increasing a "simulated Y position" and/or shrinking the cards, either or both of which correspond to navigating to the application switcher or the home screen user interface).
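The "simulated Y position" idea mentioned above, in which upward translation and pinching both feed a single quantity that drives card shrinkage and target selection, can be sketched as follows. The function name and the weighting between the two components are assumptions introduced for illustration.

```swift
import CoreGraphics

// pinchScale: 1.0 = unchanged span between contacts, < 1.0 = pinched in.
func simulatedYPosition(upwardTranslation: CGFloat,
                        pinchScale: CGFloat,
                        pinchWeight: CGFloat = 300) -> CGFloat {
    // Pinching in contributes as if the contacts had also moved upward, so either
    // gesture component, or both together, can reach the home-screen threshold.
    return upwardTranslation + (1 - pinchScale) * pinchWeight
}
```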
Figs. 5C48-5C50 illustrate an exemplary embodiment in which an upward swipe gesture that includes at least three contacts (e.g., a three-, four-, or five-finger touch) on a home screen user interface that is not the default home screen user interface (e.g., a second or subsequent page of application-launch icons) causes navigation to the default home screen user interface. Fig. 5C48 shows a secondary home screen user interface that includes application-launch icons for multiple applications (e.g., a clock, an application store, voice memos, a calculator, and a notepad). A four-contact swipe gesture, including contacts 4710, 4714, 4718, and 4722 moving upward (movements 4712, 4716, 4720, and 4724) from positions 4710-a, 4714-a, 4718-a, and 4722-a, as shown in Fig. 5C48, to positions 4710-b, 4714-b, 4718-b, and 4722-b, as shown in Fig. 5C49, causes the device to navigate to the primary (e.g., default) home screen user interface, as shown in Fig. 5C50. In some embodiments, an animation is displayed in which the primary home screen user interface slides in (e.g., from the left side of the display) and pushes the secondary home screen user interface off the display (e.g., to the right). In some embodiments, a four-contact pinch gesture in which contacts 4710, 4714, 4718, and 4722 move toward each other likewise causes the device to navigate to the primary (e.g., default) home screen user interface.
Figs. 5C51-5C54 illustrate an exemplary embodiment in which an upward swipe gesture that includes at least three contacts (e.g., a three-, four-, or five-finger touch) in the application-switcher user interface causes navigation to the home screen user interface. A four-contact swipe gesture, including contacts 4726, 4730, 4734, and 4738 moving upward (movements 4728, 4732, 4736, and 4740) from positions 4726-a, 4730-a, 4734-a, and 4738-a, as shown in Fig. 5C51, to positions 4726-b, 4730-b, 4734-b, and 4738-b, as shown in Fig. 5C52, causes the device to navigate to the home screen user interface, as shown in Fig. 5C54. In some embodiments, an animation is displayed in which the application-switcher user interface slides upward with the movement of the contacts, thereby revealing the home screen user interface beneath the application-switcher user interface. In some embodiments, in response to an initial portion of the upward multi-contact swipe gesture, representations of recently used applications are displayed side by side (e.g., as shown in Fig. 5C52), and when criteria for navigating to the home screen are met (e.g., the same criteria as for navigating to the home screen user interface from an application user interface, as described with respect to Figs. 9A-9C and Figs. 10A-10D), the device displays only the representation of the most recently used application on the display as visual feedback indicating the current target state of the user-interface navigation before the contacts lift off (e.g., as shown in Fig. 5C53), and displays the home screen user interface after the gesture terminates (e.g., as shown in Fig. 5C54).
Figs. 5C55-5C59 illustrate an exemplary embodiment in which the user interface for the user-interface selection process is dynamic and reversible. A five-contact pinch movement, including contacts 4742, 4746, 4750, 4754, and 4758 moving toward each other (movements 4744, 4748, 4752, 4756, and 4760) from positions 4742-a, 4746-a, 4750-a, 4754-a, and 4758-a, as shown in Fig. 5C55, to positions 4742-b, 4746-b, 4750-b, 4754-b, and 4758-b, as shown in Fig. 5C56, invokes the user-interface selection process. The device replaces the display of the interactive map user interface with a representation (e.g., card) 4526 of the interactive map user interface and begins to shrink interactive map card 4526 and translate it toward the position between the contacts (e.g., in accordance with the pinching movement of the contacts). A representation (e.g., card) 4528 of the previously displayed email user interface is also displayed with a size and vertical translation similar to those of interactive map card 4526, indicating that if the gesture ended at this point the device would navigate to the application-switcher user interface (e.g., because when the properties of the input/application views meet the "pause for app switcher" criteria (100x6) and/or the "short, slow movement to app switcher" criteria (100x8), the device designates the application switcher as the current target state, as shown in Figs. 10A-10B). The contacts then translate diagonally upward and to the right (e.g., movement including both horizontal and vertical components) to positions 4742-c, 4746-c, 4750-c, 4754-c, and 4758-c, as shown in Fig. 5C57, causing the cards to shrink and move upward (e.g., in accordance with the vertical component of the contact movement) and to the right (e.g., in accordance with the horizontal component of the contact movement). As shown in Fig. 5C58, the contacts then move downward to positions 4742-d, 4746-d, 4750-d, 4754-d, and 4758-d, causing interactive map card 4526 to increase in size and pushing email card 4528 off the display to the right, indicating that when the gesture terminates the device will navigate back to the interactive map user interface (e.g., because when the properties of the input/application views meet the "resize/translate to cancel" criteria (100x7), the device designates the current application as the current target state, as shown in Figs. 10A-10B). The interactive map user interface is displayed after the contacts lift off, as shown in Fig. 5C59.
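The reversible, "last measured property set" behavior of Figs. 5C33-5C59 amounts to recomputing a provisional target on every touch event and committing whatever the most recent snapshot selected at lift-off. The following minimal sketch builds on the hypothetical `NavigationTarget`, `GestureSnapshot`, and `currentTargetState(for:)` names from the earlier sketch; the class and its method names are likewise assumptions.

```swift
// Each new snapshot can change the provisional target, and the visual feedback
// (card size and position) is driven from it, so pinching, spreading, and
// translating simply replace the provisional target until all contacts lift off.
final class UserInterfaceSelectionSession {
    private(set) var provisionalTarget: NavigationTarget = .currentApplication

    func update(with snapshot: GestureSnapshot) {
        provisionalTarget = currentTargetState(for: snapshot)
    }

    func commitOnLiftOff() -> NavigationTarget {
        return provisionalTarget
    }
}
```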
Figs. 6A-6F are flow diagrams illustrating a method 600 of displaying a taskbar with multiple application icons at a variable location along one or more edges of a touch-sensitive display, in accordance with some embodiments. Method 600 is performed at an electronic device (e.g., device 300 in Fig. 3, or portable multifunction device 100 in Fig. 1A) with a display, a touch-sensitive surface, and one or more sensors for detecting intensity of contacts with the touch-sensitive surface. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 600 are, optionally, combined, and/or the order of some operations is, optionally, changed.
As described below, method 600 provides an intuitive way to display a taskbar with multiple application icons at a variable location along one or more edges of a touch-sensitive display. The method reduces the number, extent, and/or nature of the inputs from a user when displaying a taskbar with multiple application icons at a variable location along one or more edges of the touch-sensitive display, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to display a taskbar with multiple application icons at a variable location along one or more edges of the touch-sensitive display faster and more efficiently conserves power, increases the time between battery charges, and enhances the operability of the device (e.g., by helping the user provide proper inputs and reducing user mistakes when operating/interacting with the device).
Method 600 relates to displaying a taskbar with multiple application icons at a variable location along an edge of a touch-sensitive display (e.g., along any of multiple edges of the display, such as the bottom, right-side, or left-side edge relative to the current display orientation of the device) in response to an input (e.g., a long-press gesture initiated within a predefined distance of an edge of the display) whose location determines where the taskbar appears (e.g., the edge of the device along which the taskbar is displayed is based on the edge along which the input is detected, and/or the position of the taskbar along the edge depends on the proximity of the input). For example, in some embodiments, in response to a long-press input along an edge of the display, the device displays the taskbar along that particular edge of the display. In some embodiments, in response to a long-press input, the device displays the taskbar at a position along the edge of the display near the location of the long-press gesture (e.g., overlapping, centered on, or to the side of that location). In some embodiments, when the long-press input is detected at a first region of the display edge, the device displays the taskbar at a predetermined location (e.g., in the middle of the edge of the display, or at an end portion of the edge of the display) (e.g., the taskbar is displayed at the center of the edge whenever the input is detected anywhere within a central portion of the display, and/or the taskbar is displayed at an end of the edge when the input is detected within a predetermined proximity of that end of the edge), and when the long-press input is detected at a second region of the display edge (e.g., not in the central region and/or not within the predetermined proximity of an end of the edge), the device displays the taskbar at a user-specified location (e.g., overlapping, centered on, or to the side of the input). Allowing the user to display the taskbar at a selected location, rather than only at a predefined location, enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of the position of the user's hand relative to the display; by helping the user achieve an intended outcome with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which, additionally, reduces power usage and improves the battery life of the device by enabling the user to use the device more quickly and efficiently.
The device displays (602) a first user interface (e.g., an application user interface) on the display, where the first user interface is different from a home screen user interface that includes a plurality of application icons corresponding to different applications of a plurality of applications installed on the device (e.g., the interactive map user interface in Figs. 5A1, 5A4, 5A15, 5A19, 5A22, and 5A28, and the email user interface in Figs. 5A9 and 5A13). In some embodiments, the taskbar is also displayed by default in the home screen user interface (e.g., as shown in Fig. 5B21).
While displaying the first user interface on the display, the device detects (604) a first input by a first contact on a first edge of the display (e.g., contacts 4202, 4206, 4208, 4209, 4212, 4216, 4218, and 4222, as shown in Figs. 5A1, 5A4, 5A9, 5A13, 5A15, 5A19, 5A22, and 5A28, respectively).
In response to (606) detecting the first input (e.g., a long press) on the edge of the display, and while continuing to detect the first contact on the first edge of the display (e.g., while the first contact remains substantially stationary at the initial touch location of the touch input (e.g., with less than a threshold amount of movement)): in accordance with a determination that the first input is detected on a first portion of the first edge of the display (e.g., the first contact remains substantially stationary at a corresponding location on the first portion of the first edge for at least a threshold amount of time with less than a threshold amount of movement) and that the first input meets taskbar-display criteria (e.g., the first input is a long-press input or a deep-press intensity input without movement of the first contact), the device displays (608) a taskbar with a plurality of application icons at a first location along the first edge of the display. For example, in response to continuously detecting contact 4202 at a position on the left side of the bottom edge of the display for a period of time that meets long-press input criteria (e.g., meeting time threshold TT1), the device displays taskbar 4204 along the left side of the bottom edge of the display, below contact 4202, in Fig. 5A2. In some embodiments, the first location is selected to include the first portion of the first edge of the display (e.g., the taskbar is centered on the location of the first touch, such as taskbar 4204 being centered on contact 4202 in Fig. 5A2). In some embodiments, the first location is a predetermined location (e.g., when the first touch is detected within a middle portion of the first edge, the taskbar is displayed at a default location at the center of the display, regardless of whether the contact is at the center of the display).
In response to (606) detecting the first input (e.g., a long press) on the edge of the display, and while continuing to detect the first contact on the first edge of the display (e.g., while the first contact remains substantially stationary at the initial touch location of the touch input (e.g., with less than a threshold amount of movement)): in accordance with a determination that the first input is detected on a second portion of the first edge of the display, different from the first portion of the first edge (e.g., the first contact remains substantially stationary at a corresponding location on the second portion of the first edge for at least a threshold amount of time with less than a threshold amount of movement), and that the first input meets the taskbar-display criteria (e.g., the first input is a long-press input or a deep-press input without movement of the first contact), the device displays (610) the taskbar at a second location along the first edge of the display, selected to include the second portion of the first edge of the display (e.g., the taskbar is centered on the location of the first touch), where the second location is different from the first location. For example, in response to continuously detecting contact 4206 at a position on the right side of the bottom edge of the display for a period of time that meets long-press input criteria (e.g., meeting time threshold TT1), the device displays taskbar 4204 along the right side of the bottom edge of the display, below contact 4206, in Fig. 5A5, which is a different location from where taskbar 4204 is displayed in Fig. 5A2.
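The positioning just described, in which the taskbar is centered on the long-press location along whichever edge was pressed, can be sketched as follows. The `dockWidth` value and the clamping behavior at the ends of the edge are illustrative choices introduced for this sketch, not requirements taken from the embodiments.

```swift
import CoreGraphics

enum DisplayEdge { case bottom, left, right }

struct TaskbarPlacement {
    let edge: DisplayEdge
    let centerAlongEdge: CGFloat
}

func taskbarPlacement(forLongPressAt location: CGPoint,
                      on edge: DisplayEdge,
                      displaySize: CGSize,
                      dockWidth: CGFloat = 320) -> TaskbarPlacement {
    // Center the taskbar on the contact's position along the edge it touched...
    let rawCenter: CGFloat
    let edgeLength: CGFloat
    switch edge {
    case .bottom:
        rawCenter = location.x
        edgeLength = displaySize.width
    case .left, .right:
        rawCenter = location.y
        edgeLength = displaySize.height
    }
    // ...but keep it fully within that edge.
    let half = dockWidth / 2
    let clamped = min(max(rawCenter, half), edgeLength - half)
    return TaskbarPlacement(edge: edge, centerAlongEdge: clamped)
}
```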
Displaying the taskbar at the first location when first criteria (e.g., first-location criteria) are met, and displaying the taskbar at the second location when second criteria (e.g., second-location criteria) are met, enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of the position of the user's hand relative to the display; by helping the user achieve an intended outcome with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which, additionally, reduces power usage and improves the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, the first location along the first edge of the display does not include (612) the second portion of the first edge of the display (e.g., when the taskbar is displayed centered on the first portion of the first edge (e.g., the bottom edge), such as at a touch location near the left edge, and the width of the taskbar does not span the entire length of the first edge, the location of the taskbar does not include the second portion of the first edge (e.g., a touch location near the right edge)). For example, the location at which taskbar 4204 is displayed in Fig. 5A2 (e.g., on the left side of the bottom edge of the display) does not overlap the portion of the bottom edge at which contact 4212 is detected in Fig. 5A15 (e.g., the right portion of the bottom edge of the display). Displaying the taskbar at the first location such that it does not overlap the second portion of the first edge, where the second portion is associated with display of the taskbar at the second location (e.g., the second location overlaps the second portion of the edge), enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of the position of the user's hand relative to the display; by helping the user achieve an intended outcome with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which, additionally, reduces power usage and improves the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, the second location along the first edge of the display does not include (614) the first portion of the first edge of the display. For example, when the taskbar is displayed centered on the second portion of the first edge (e.g., the bottom edge), such as at a touch location near the right edge, and the width of the taskbar does not span the entire length of the first edge, the location of the taskbar does not include the first portion of the first edge (e.g., a touch location near the left edge). For example, the location at which taskbar 4204 is displayed in Fig. 5A16 (e.g., on the right side of the bottom edge of the display) does not overlap the portion of the bottom edge at which contact 4202 is detected in Fig. 5A1 (e.g., the left portion of the bottom edge of the display). Displaying the taskbar at the second location such that it does not overlap the first portion of the first edge, where the first portion is associated with display of the taskbar at the first location (e.g., the first location overlaps the first portion of the edge), enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of the position of the user's hand relative to the display; by helping the user achieve an intended outcome with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which, additionally, reduces power usage and improves the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, while displaying the first user interface on the display without displaying the taskbar (e.g., after the first contact lifts off from the first edge and the first input by the first contact is no longer detected, the taskbar ceases to be displayed), the device detects (616) a second input by a second contact (e.g., a long-press input) on a second edge (e.g., a left or top edge) of the display, the second edge being different from the first edge (e.g., the bottom edge) of the display. In response to detecting the second input (e.g., a long press) on the second edge of the display, and while continuing to detect the second contact on the second edge of the display (e.g., while the second contact remains substantially stationary at the initial touch location of the touch input (e.g., with less than a threshold amount of movement)), in accordance with a determination that the second input meets the taskbar-display criteria (e.g., the second input is a long-press input or a deep-press input without movement of the second contact), the device displays (618) the taskbar with the plurality of application icons at a third location along the second edge of the display (e.g., the third location is selected in accordance with the location of the second contact, in the same manner in which the location of the taskbar is selected based on the location of the first contact on the first edge) (e.g., the taskbar is displayed centered on the touch location of the second contact on the second edge). For example, in Fig. 5A9, contact 4208 is detected on the left edge of the display rather than on the bottom edge of the display, unlike contact 4202 in Fig. 5A1. In response to the input including contact 4208 meeting the taskbar-display criteria (e.g., its position being substantially maintained for at least a period of time TT1), taskbar 4204 is displayed along the left edge of the display in Fig. 5A10, rather than along the bottom edge as taskbar 4204 is in Fig. 5A2. In some embodiments, when the first user interface is in a portrait orientation, the terms "top edge," "left edge," "right edge," and "side edge" are defined by the top, left, right, and side positions of the first user interface. Displaying the taskbar along the first edge of the display (e.g., the bottom edge relative to the display orientation of the device) when the input is detected on the first edge of the display, and displaying the taskbar along the second edge of the display (e.g., a side edge relative to the display orientation of the device) when the input is detected on the second edge of the display, enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of the position of the user's hand relative to the display; by helping the user achieve an intended outcome with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which, additionally, reduces power usage and improves the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, while the first user interface is displayed on the display without displaying the taskbar (e.g., after the first contact has lifted off from the first edge and the second contact has lifted off from the second edge, and the first input by the first contact and the second input by the second contact are no longer detected, the taskbar ceases to be displayed near the first edge and ceases to be displayed near the second edge), the device detects (620) a third input by a third contact on a third edge (e.g., a right edge) of the display, the third edge being different from the first edge of the display and the second edge of the display. In response to detecting the third input (e.g., a long press) on the third edge of the display, and while continuing to detect the third contact on the third edge of the display (e.g., while the third contact remains substantially stationary at the initial touch location of the touch input (e.g., with less than a threshold amount of movement)): in accordance with a determination that the third input meets the taskbar-display criteria (e.g., the third input is a long-press input or a deep-press input without movement of the third contact), the device displays (622) the taskbar with the plurality of application icons at a fourth position along the third edge of the display (e.g., the fourth position is selected in accordance with the position of the third contact, in the same manner in which the position of the taskbar is selected based on the position of the first contact on the first edge) (e.g., the taskbar is displayed centered on the touch location of the third contact on the third edge). For example, in contrast to displaying taskbar 4204 along the bottom edge of the display in Fig. 5A2 and along the left edge of the display in Fig. 5A10, a long-press input on the right edge of the display in Fig. 5A1 would cause the taskbar to be displayed along the right edge of the display. In some embodiments, the taskbar is displayed centered on the second edge and on the third edge without regard to the exact locations of the contacts (e.g., regardless of the exact position of the finger contact, the taskbar is centered on a shorter side edge, while being displaced along a longer bottom edge based on the touch location of the finger contact; or the taskbar is centered on a shorter bottom edge without regard to the exact location of the finger contact, while being displaced along a longer side edge based on the touch location). Displaying the taskbar along the first edge of the display (e.g., a bottom edge relative to the display orientation of the device) when an input is detected on the first edge of the display, along the second edge of the display (e.g., a first side edge relative to the display orientation of the device) when an input is detected on the second edge of the display, and along the third edge of the display (e.g., a second side edge, opposite the first side edge, relative to the display orientation of the device) when an input is detected on the third edge of the display, enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, while the taskbar is displayed at the first position along the first edge of the display, while continuing to detect the first contact (e.g., after the first contact has moved somewhat along the first edge while the taskbar is displayed, either at the first portion of the first edge of the display or at a different portion of the first edge of the display), the device detects (624) liftoff of the first contact from the display, and in response to detecting the liftoff of the first contact (626): in accordance with a determination that the first contact moved less than a threshold amount while the taskbar was displayed, the device maintains (628) display of the taskbar over the first user interface on the display after liftoff of the first contact. For example, after liftoff of contact 4202 as shown in Fig. 5A2, the device maintains display of taskbar 4204 in Fig. 5A3, because contact 4202 did not move substantially on the display. Displaying the taskbar along the first edge of the display (e.g., a bottom edge relative to the display orientation of the device) when an input is detected on the first edge of the display, and then maintaining display of the taskbar upon liftoff of the input if the movement of the input is less than a threshold amount, enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, in response to detecting the liftoff of the first contact (626): in accordance with a determination that the first contact moved less than the threshold amount while the taskbar was displayed, the device expands (630) the size of the taskbar displayed over the first user interface after liftoff of the first contact (e.g., the size of the taskbar as initially displayed is smaller than the size of the taskbar in its final display state). For example, after liftoff of contact 4216 shown in Fig. 5A20, the device expands the size of taskbar 4204 in Fig. 5A21, because contact 4216 did not move substantially on the display. Displaying the taskbar along the first edge of the display (e.g., a bottom edge relative to the display orientation of the device) when an input is detected on the first edge of the display, and then expanding the size of the taskbar upon liftoff of the input if the movement of the input is less than a threshold amount, enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, in response to detecting the liftoff of the first contact (626): in accordance with a determination that the first contact moved less than the threshold amount while the taskbar was displayed, the device moves (632) the display of the taskbar from the first position along the first edge of the display to a third, predetermined position along the first edge of the display (e.g., the center of the first edge). For example, after liftoff of contact 4216 shown in Fig. 5A20, the device moves the display of taskbar 4204 from the left side of the bottom edge of the display, as shown in Fig. 5A20, to the center of the bottom edge of the display, as shown in Fig. 5A21, because contact 4216 did not move substantially on the display. In some embodiments, the predetermined position to which the taskbar moves after liftoff of the contact is along a predetermined edge of the device (e.g., the "bottom edge" of the display, relative to the current display orientation of the device), regardless of the edge along which the taskbar was initially displayed (e.g., a side edge). In some embodiments, the predetermined position to which the taskbar moves after liftoff of the contact is along the same edge on which the first contact was detected (e.g., each edge of the device is associated with a corresponding predetermined taskbar region). Displaying the taskbar at the first position along the first edge of the display (e.g., a bottom edge relative to the display orientation of the device) when an input is detected on the first edge of the display, and then moving the taskbar along the first edge of the display from the first position to the third, predetermined position of the first edge if the movement of the input is less than a threshold amount, enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
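The following is a schematic Swift sketch of the liftoff handling in operations 626-632 and of the dismissal case discussed next: if the contact moved less than the threshold amount, the taskbar is kept, expanded, and settled at the predetermined position; if the contact ended outside the taskbar, the taskbar is dismissed. All type names and numeric values are illustrative assumptions.

```swift
// Sketch of taskbar liftoff handling (hypothetical types and thresholds).
struct TaskbarState {
    var centerX: Double   // current center of the taskbar along the bottom edge
    var scale: Double     // 1.0 = final (expanded) size
    var isVisible: Bool
}

func handleLiftoff(taskbar: TaskbarState,
                   contactMovement: Double,          // total movement while the taskbar was shown
                   movementThreshold: Double = 10.0, // assumed threshold amount
                   edgeCenterX: Double = 512.0,      // assumed predetermined position (edge center)
                   endedOutsideTaskbar: Bool = false) -> TaskbarState {
    var taskbar = taskbar
    if endedOutsideTaskbar {
        taskbar.isVisible = false        // cf. Figs. 5A11-5A12: dismiss on liftoff
    } else if contactMovement < movementThreshold {
        taskbar.isVisible = true         // 628: keep the taskbar displayed
        taskbar.scale = 1.0              // 630: expand to its final size
        taskbar.centerX = edgeCenterX    // 632: settle at the predetermined position
    }
    return taskbar
}

let shown = TaskbarState(centerX: 300, scale: 0.85, isVisible: true)
print(handleLiftoff(taskbar: shown, contactMovement: 3))
```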
In some embodiments, in accordance with a determination that the first contact has moved to a location outside of and near the taskbar, the taskbar ceases to be displayed upon liftoff of the first contact. For example, after liftoff of contact 4208 as shown in Fig. 5A11, the device ceases to display taskbar 4204 in Fig. 5A12, because before liftoff contact 4208 moved from position 4208-a, on taskbar 4204 in Fig. 5A10, to position 4208-b, outside of taskbar 4204 in Fig. 5A11. In some embodiments, in response to detecting the liftoff of the first contact, in accordance with a determination that the first contact has moved more than the threshold amount: after the first contact first moves along the first edge, the device selects a respective application icon on the taskbar in accordance with the current location of the first contact (e.g., contact 4218 moves from position 4218-a in Fig. 5A23 to position 4218-b over mail application icon 218 in Fig. 5A24, selecting (e.g., enlarging) the mail application icon); after the first contact moves along the first edge, a respective application icon is dragged out of the taskbar in accordance with the current location of the first contact, by selecting the respective application icon and then moving in a direction away from the taskbar (e.g., upward, away from the taskbar) (e.g., after mail application icon 218 is selected, contact 4218 moves away from the edge of the display, from position 4218-b in Fig. 5A24 to position 4218-c in Fig. 5A25, pulling mail application icon 218 out of the taskbar in Fig. 5A25 (e.g., enlarging the display of mail application icon 218)). In some embodiments, if liftoff of the first contact is detected while a respective application icon is selected, the device launches the first application corresponding to the currently selected application icon and replaces the first user interface with a respective application user interface of the first application. For example, after liftoff of contact 4206 while mail application icon 218 is selected in taskbar 4204 in Fig. 5A6, the device launches the associated mail application, displaying the email application user interface in Figs. 5A7 through 5A8 (e.g., with a transition animation that makes the email application user interface appear to pop out of application icon 218).
In some embodiments, while the taskbar is displayed at the first position along the first edge of the display, the device detects (634) a first movement of the first contact along the taskbar (e.g., along the first edge). For example, movement 4208 of contact 4206 proceeds from position 4206-a in Fig. 5A5 to position 4206-b in Fig. 5A6. In response to detecting the first movement of the first contact, the device selects (636) a respective application icon in the taskbar in accordance with the current location of the first contact (e.g., visually indicating the selection of the respective application icon by enlarging, highlighting, and/or animating the respective application icon relative to the other application icons in the taskbar). For example, after movement 4208 of contact 4206 to position 4206-b, the device selects (e.g., displays enlarged) mail application icon 218 in Fig. 5A6, because contact 4206 is located over mail application icon 218. After detecting the first movement of the first contact along the first edge, the device detects (638) liftoff of the first contact from the display (e.g., liftoff of contact 4206 in Fig. 5A6). In response to (640) detecting the liftoff of the first contact: in accordance with a determination that a first application icon in the taskbar is currently selected when the liftoff of the first contact is detected, the device launches (642) a first application corresponding to the first application icon in the taskbar, and replaces (644) display of the first user interface with display of a second user interface of the first application. For example, after liftoff of contact 4206 in Fig. 5A6, the device displays an animated transition to the email application user interface in Figs. 5A7-5A8. In some embodiments, when the first contact moves along the first edge below the taskbar, and in response to detecting the liftoff of the first contact: in accordance with a determination that a second application icon in the taskbar is currently selected when the liftoff of the first contact is detected, the device launches a second application corresponding to the second application icon in the taskbar, and replaces display of the first user interface with a third user interface of the second application. Displaying the taskbar along the first edge of the display (e.g., a bottom edge relative to the display orientation of the device) when an input is detected on the first edge of the display, and then opening an application upon liftoff of the input if that application's icon is selected in the taskbar at the time of liftoff, enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, while the taskbar is displayed at the first position along the first edge of the display, the device detects (646) movement of the first contact on the display (e.g., in a direction substantially parallel to the first edge of the display). For example, movement 4208 of contact 4206 proceeds from position 4206-a in Fig. 5A5 to position 4206-b in Fig. 5A6. In response to detecting that the contact is located at a position on the display that corresponds to where a first application icon is displayed in the taskbar (e.g., in accordance with a determination that the x-coordinate of the first contact corresponds to the x-coordinate of the first application icon, and the y-coordinate of the first contact is at or below the top edge of the taskbar), the device selects (648) the first application icon (e.g., and changes a display property of the application icon (e.g., size, color, highlighting, animation) to indicate its selected state). For example, after movement 4208 of contact 4206 to position 4206-b, the device selects and displays enlarged mail application icon 218 in Fig. 5A6, because contact 4206 is located over mail application icon 218. In some embodiments, a tactile output is generated whenever a new application icon in the taskbar is selected in accordance with the current location of the first contact during movement of the contact. In some embodiments, if liftoff of the first contact is detected while the first application icon is selected, the device launches the first application. In some embodiments, when the first contact moves away from the taskbar beyond the side or bottom of the taskbar, the currently selected application icon ceases to be selected. In some embodiments, when the x-coordinate of the first contact is at a position between two application icons in the taskbar, the currently selected application icon ceases to be selected and no other application icon is selected. In some embodiments, if no application icon is selected when the liftoff of the first contact is detected, no application is launched; and the taskbar optionally remains on the display (e.g., if the contact was stationary and near the taskbar when liftoff was detected) or ceases to be displayed (e.g., if liftoff was detected immediately after a prior movement of the first contact). Displaying the taskbar along the first edge of the display (e.g., a bottom edge relative to the display orientation of the device) when an input is detected on the first edge of the display, and then selecting an application icon when the contact is detected at a position on the display corresponding to that application icon, enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
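A minimal Swift sketch of the icon-selection rule in operations 646-648 follows: while the contact stays at or below the top of the taskbar, the icon whose horizontal span contains the contact's x-coordinate is selected; between icons, nothing is selected. The icon names and geometry are illustrative assumptions.

```swift
// Sketch of x-coordinate based icon selection in the taskbar (assumed geometry).
struct TaskbarIcon { var name: String; var minX: Double; var maxX: Double }

func selectedIcon(icons: [TaskbarIcon], contactX: Double, contactY: Double,
                  taskbarTopY: Double) -> TaskbarIcon? {
    // y grows downward in this sketch; a contact above the taskbar's top edge selects nothing.
    guard contactY >= taskbarTopY else { return nil }
    return icons.first { contactX >= $0.minX && contactX <= $0.maxX }
}

let icons = [TaskbarIcon(name: "Mail", minX: 400, maxX: 460),
             TaskbarIcon(name: "Maps", minX: 470, maxX: 530)]
let hit = selectedIcon(icons: icons, contactX: 430, contactY: 720, taskbarTopY: 700)
print(hit?.name ?? "none selected")   // "Mail", analogous to contact 4206 over icon 218
```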
In some embodiments, while the first application icon is selected, the device detects (650) movement of the first contact on the display away from the first edge of the display (e.g., in a direction perpendicular to the first edge). In response to detecting the movement of the first contact on the display away from the first edge of the display: in accordance with a determination that the first contact is detected at a position on the display that does not correspond to the taskbar (e.g., the y-coordinate of the first contact is above the top edge of the taskbar), the device displays (652) the first application icon, or a representation thereof, at a position on the display that corresponds to the first contact and does not correspond to the taskbar (e.g., the first application icon is lifted out of the taskbar by the vertical movement of the first contact away from the first edge). For example, after mail application icon 218 is selected, movement 4222 of contact 4218 away from the edge of the display, from position 4218-b in Fig. 5A24 to position 4218-c in Fig. 5A25, pulls mail application icon 218 in Fig. 5A25 out of the taskbar (e.g., and enlarges the display of mail application icon 218). In some embodiments, the movement of the first application icon corresponds to the movement of the first contact. In some embodiments, when the first application icon is dragged entirely out of the taskbar, or past a predefined threshold y-coordinate on the display beyond the edge of the taskbar, the first application icon changes its appearance and moves underneath the first contact on the display. For example, in Fig. 5A25, mail application icon 218 is enlarged when pulled out of taskbar 4204. In some embodiments, the change in appearance of the first application icon is accompanied by a split-screen separator indicator on the display, prompting the user to drop the first application icon on the other side of the split-screen separator indicator in order to split the screen between the application user interface in the first user interface and the application user interface corresponding to the first application icon. Moving display of the application icon from the taskbar onto the screen to a position that does not correspond to the taskbar, in response to detecting movement of the contact away from the edge of the display (e.g., away from the taskbar) while an application icon is selected (e.g., while the contact is over an application icon displayed in the taskbar), enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, while the first application icon, or the representation thereof, is displayed at the position on the display that corresponds to the first contact but does not correspond to the display of the taskbar (e.g., after the first application icon has been pulled out of the taskbar by the upward movement of the first contact), the device detects (654) liftoff of the first contact, and in response to (656) detecting the liftoff of the first contact while the first application icon is displayed at the position on the display that corresponds to the first contact but does not correspond to the display of the taskbar: the device replaces (658) display of the first user interface in a first portion of the display with display of a second user interface corresponding to the application associated with the first application icon (e.g., opening the second application in a split-screen mode), and maintains (660) display of the first user interface in a second portion of the display, the second portion not overlapping the first portion of the display. For example, in response to detecting liftoff of contact 4218 while mail application icon 218 is displayed over the interactive map user interface and outside of taskbar 4204 in Fig. 5A26, the device displays an email user interface in the right half of the display while maintaining display of the interactive map user interface in the left half of the display in Fig. 5A27. In some embodiments, the first user interface is resized to fill the second portion of the display (e.g., the objects displayed in the UI shrink in proportion to the shrinking of the display area). In some embodiments, the first user interface is cropped to fill the second portion of the display (e.g., the objects displayed in the UI remain the same size, but the size of the display area shrinks). In some embodiments, the taskbar ceases to be displayed on the split screen. In some embodiments, the taskbar remains displayed at its original position on the split screen. Replacing display of the first user interface in the first portion of the display with display of a second user interface corresponding to the application associated with the application icon, while maintaining display of the first user interface in the second portion of the display (e.g., opening the application in a split-screen mode), in response to detecting liftoff of the contact while the contact is at a display position corresponding to the application icon outside of the taskbar (e.g., after the application icon has been pulled out of the taskbar), enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
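The following rough Swift sketch illustrates operations 654-660: if the contact lifts off while a dragged icon is outside the taskbar, the dropped application replaces the user interface in one portion of the screen while the other portion is preserved. The drop rule (left or right of an assumed separator) and the portion names are assumptions made only for illustration.

```swift
// Sketch of drag-out-to-split-screen handling (assumed drop rule and names).
struct SplitScreen { var left: String; var right: String }

func dropDraggedIcon(app: String, dropX: Double, separatorX: Double,
                     current: SplitScreen, iconIsOutsideTaskbar: Bool) -> SplitScreen {
    guard iconIsOutsideTaskbar else { return current }   // dropped back into the taskbar: no change
    var next = current
    if dropX < separatorX { next.left = app } else { next.right = app }   // 658: replace one half
    return next                                                            // 660: the other half is kept
}

let before = SplitScreen(left: "Maps", right: "Maps")   // conceptually, a full-screen map UI
let after = dropDraggedIcon(app: "Mail", dropX: 900, separatorX: 512,
                            current: before, iconIsOutsideTaskbar: true)
print(after)   // Mail opens on the right, Maps is maintained on the left (cf. Fig. 5A27)
```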
In some embodiments, while the taskbar is displayed at the first position along the first edge of the display, the device detects (662) movement of the first contact toward the first edge of the display, and in response to detecting the movement of the first contact toward the first edge of the display: in accordance with a determination that the movement of the first contact toward the first edge of the display meets taskbar-removal criteria (e.g., the contact moves entirely off of the display or past a threshold position), the device ceases to display (664) the taskbar (e.g., by sliding the taskbar off of the first edge of the display, in accordance with the movement of the first contact toward the outer edge of the device, to hide the taskbar). For example, in response to movement 4214 of contact 4212 toward the edge of the display, from position 4212-a in Fig. 5A16 to position 4212-b in Fig. 5A17, taskbar 4204 begins to slide off of the bottom of the display. In response to liftoff of contact 4212 in Fig. 5A17, the device ceases to display taskbar 4204 in Fig. 5A18. Displaying the taskbar along the first edge of the display (e.g., a bottom edge relative to the display orientation of the device) when an input is detected on the first edge of the display, and then ceasing to display the taskbar in response to detecting movement of the contact toward the first edge of the display that meets the taskbar-removal criteria (e.g., hiding the taskbar as the contact approaches the edge of the display), enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, the first portion of the first edge of the display is within (615) a first predefined sub-range of the first edge of the display (e.g., the center one-third portion), and the first position is a first predetermined position within the first predefined sub-range of the first edge (e.g., when the touch contact is located in the middle section of the edge, the taskbar is displayed at the first predetermined position, centered on the display) (e.g., the second predefined sub-range of the first edge is outside of the first predefined sub-range, and the second position is different from the first predetermined position and is dynamically selected in accordance with the position of the first contact outside of the first predefined sub-range of the first edge). Displaying the taskbar at the first predetermined position (e.g., the center of the edge) within the first predefined sub-range of the first edge of the display (e.g., the center one-third portion) in response to detecting an input within the first predefined sub-range of the first edge of the display enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, the second portion of the first edge of the display is within (617) a second predefined sub-range of the first edge of the display (e.g., in the left or right one-third of the first edge), and: when the first contact is at least a threshold distance away from the first adjacent edge that is closer to the first contact, the taskbar displayed at the second position is centered on the position of the first contact (e.g., the first contact is in the left or right one-third portion of the first edge and is far enough away that the entire taskbar can be displayed while centered on the touch); and when the first contact is less than the threshold distance away from the first adjacent edge of the first edge, the taskbar displayed at the second position is displayed adjoining the first adjacent edge of the first edge (e.g., off-center from the first edge and a fixed number of pixels (e.g., 5 pixels) away from the first adjacent edge of the first edge that is closer to the first contact) (e.g., left- or right-aligned relative to the first edge of the display). For example, in Fig. 5A5, taskbar 4204 is displayed centered on contact 4206, because contact 4206 is at least the threshold distance away from the right edge of the display. In contrast, in Fig. 5A16, taskbar 4204 is displayed at a default position adjoining the right edge of the display, rather than centered on contact 4212, because the distance between contact 4212 and the right edge of the display is less than the threshold distance. Displaying the taskbar at the second position centered on the position of the first contact when the contact is within the second predefined sub-range of the first edge of the display (e.g., the left or right one-third of the first edge) and more than a threshold distance away from the nearest adjacent edge of the display, and displaying the taskbar at a second position adjoining the nearest adjacent edge of the display when the contact is within the second predefined sub-range of the first edge of the display and less than the threshold distance away from the nearest adjacent edge of the display (e.g., when the contact is too close to the nearest end of the display edge for the entire taskbar to be displayed centered on the contact, the taskbar is displayed at a predefined position that substantially minimizes the distance between the center of the taskbar and the contact while keeping the entire taskbar displayed), enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
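The positioning rule in operations 615 and 617 can be summarized by the short Swift sketch below: near the center of the edge, the taskbar snaps to a predetermined centered position; elsewhere it centers on the contact, unless that would push it past an adjacent edge, in which case it is clamped a fixed inset away from that edge. All dimensions are assumed values chosen only to make the example runnable.

```swift
// Sketch of taskbar placement along the bottom edge (assumed dimensions).
func taskbarCenterX(contactX: Double,
                    edgeLength: Double = 1024,
                    taskbarWidth: Double = 400,
                    edgeInset: Double = 5) -> Double {
    let third = edgeLength / 3
    if contactX >= third && contactX <= 2 * third {
        return edgeLength / 2                        // central sub-range: predetermined center
    }
    let halfWidth = taskbarWidth / 2
    let minCenter = halfWidth + edgeInset            // clamp so the whole taskbar stays on screen
    let maxCenter = edgeLength - halfWidth - edgeInset
    return min(max(contactX, minCenter), maxCenter)  // otherwise follow the contact
}

print(taskbarCenterX(contactX: 512))   // 512.0: centered, as in Fig. 5A21
print(taskbarCenterX(contactX: 280))   // 280.0: centered on the contact, as in Fig. 5A5
print(taskbarCenterX(contactX: 990))   // 819.0: clamped near the right edge, as in Fig. 5A16
```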
In some embodiments, the size of the taskbar when it is displayed at the first position (e.g., when displayed at a default position, such as centered on the display) (e.g., when the first portion of the first edge of the display is within the predefined central range of the first edge (e.g., the center one-third portion) and the first position is the first predetermined position (e.g., as described above regarding displaying the taskbar at a predetermined position when the contact is within the first sub-range of the first edge of the display)) is larger (623) than the size of the taskbar when it is displayed at the second position (e.g., centered on the first contact or adjoining an edge (e.g., as described above regarding displaying the taskbar when the contact is within the second sub-range of the first edge of the display)). For example, taskbar 4204 is larger when positioned at the default position at the center of the bottom edge of the display in Fig. 5A21 than when positioned along the left side of the bottom edge of the display in Fig. 5A20. Displaying a larger taskbar when the taskbar is displayed at the first position (e.g., a predefined or default position) than when the taskbar is displayed at the second position along the first edge of the display (e.g., a position that depends on the position of the contact within a sub-range of the edge of the display) enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, in response to detecting the first input (e.g., a swipe toward the top edge) by the first contact on the first edge of the display, and while continuing to detect the first contact: in accordance with a determination that the first input meets navigation-gesture criteria, wherein the navigation-gesture criteria include a requirement that the first contact move on the display by more than a threshold amount away from the first edge of the display in order for the navigation-gesture criteria to be met (without requiring that the first input meet the taskbar-display criteria), the device enters (625) a transitional user interface mode in which a plurality of different user interface states are available to be selected based on a comparison of a set of one or more properties of the first input against a corresponding set of one or more thresholds (e.g., optionally forgoing displaying the taskbar along the first edge of the display if the first input does not meet the taskbar-display criteria). For example, in response to movement 4224 of contact 4222 away from the bottom edge of the display, from position 4222-a in Fig. 5A28 to position 4222-b in Fig. 5A29, before long-press gesture criteria (e.g., requiring limited movement for a time period TT1) are met, the device enters a transitional navigation state, i.e., the display of the interactive map user interface in Fig. 5A28 is replaced with application view 4014 corresponding to the interactive map user interface in Fig. 5A29. Entering a transitional user interface mode that allows the user to navigate to different user interfaces depending on whether certain preset movement conditions are met (e.g., one or more of: (i) navigating to a home screen, (ii) navigating to the application that was displayed on the screen immediately before the user interface displayed when the swipe gesture began, (iii) navigating to a control-panel user interface, (iv) navigating to an application-switcher user interface, or (v) navigating back to the user interface that was displayed when the swipe gesture began) enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device (e.g., by helping the user use the device more quickly and efficiently).
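The sketch below, in Swift, illustrates how an edge input can resolve into either of the two behaviors described in this section: holding substantially still long enough meets the taskbar-display criteria, while moving far enough away from the edge meets the navigation-gesture criteria and enters the transitional user interface mode (operation 625). The threshold values are placeholders, not values used by the embodiments.

```swift
// Sketch of edge-gesture disambiguation (assumed thresholds).
enum EdgeGestureOutcome { case showTaskbar, enterTransitionalUI, undecided }

func resolveEdgeGesture(holdDuration: Double, distanceFromEdge: Double,
                        taskbarHoldTime: Double = 0.5,     // stand-in for TT1
                        navigationDistance: Double = 40) -> EdgeGestureOutcome {
    if distanceFromEdge > navigationDistance { return .enterTransitionalUI } // operation 625
    if holdDuration >= taskbarHoldTime { return .showTaskbar }               // taskbar-display criteria
    return .undecided
}

print(resolveEdgeGesture(holdDuration: 0.6, distanceFromEdge: 4))    // showTaskbar
print(resolveEdgeGesture(holdDuration: 0.1, distanceFromEdge: 90))   // enterTransitionalUI (cf. Fig. 5A29)
```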
It should be understood that the particular order in which the operations in Figs. 6A through 6F are described is merely an example and is not intended to indicate that this is the only order in which the operations could be performed. One of ordinary skill in the art will recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (e.g., methods 700, 800, 900, 1000, and 1100) are likewise applicable, in an analogous manner, to method 600 described above with respect to Figs. 6A through 6F. For example, the contacts, gestures, user interface objects, tactile outputs, intensity thresholds, focus selectors, and animations described above with reference to method 600 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, tactile outputs, intensity thresholds, focus selectors, and animations described herein with reference to other methods described herein (e.g., methods 700, 800, 900, 1000, 1100). For brevity, these details are not repeated here.
The operations described above with reference to Figs. 6A through 6F are, optionally, implemented by the components depicted in Figs. 1A and 1B. For example, display operations 602, 608, 610, 618, 622, and 652; detection operations 604, 616, 620, 624, 634, 638, 646, 650, 654, and 662; entering operation 625; expansion operation 630; moving operation 632; selection operations 636 and 648; opening operation 642; and ceasing-display operation 664 are, optionally, implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive display 112, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information against respective event definitions 186 and determines whether a first contact at a first location on the touch-sensitive surface (or whether rotation of the device) corresponds to a predefined event or sub-event, such as selection of an object on a user interface, or rotation of the device from one orientation to another. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally uses or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it will be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in Figs. 1A and 1B.
Figs. 7A through 7I are flow diagrams illustrating a method 700 of navigating from a user interface displayed in a split-screen display mode to a different user interface, in accordance with some embodiments. Method 700 is performed at an electronic device (e.g., device 300 in Fig. 3, or portable multifunction device 100 in Fig. 1A) with a display, a touch-sensitive surface, and one or more sensors for detecting intensities of contacts with the touch-sensitive surface. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 700 are, optionally, combined and/or the order of some operations is, optionally, changed.
As described below, method 700 provides an intuitive way to navigate from a user interface displayed in a split-screen display mode to a different user interface. The method reduces the number, extent, and/or nature of the inputs from a user when navigating between user interfaces within and outside of a split-screen display mode, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to navigate between user interfaces within and outside of a split-screen mode more quickly and more efficiently conserves power and increases the time between battery charges.
The device displays (702) a user interface of a first application (e.g., a first application user interface) in a first portion of the display (e.g., a left portion of the display) (e.g., an interactive map user interface is displayed in the left portion of the display in Figs. 5B1 and 5B18, and a web-browsing user interface is displayed in the left portion of the display in Fig. 5B10), while displaying, on a second portion of the display that is different from the first portion (e.g., a right portion of the display), a second application user interface (e.g., an application user interface that is different from the first application user interface) (e.g., an email user interface is displayed in the right portion of the display in Figs. 5B1, 5B10, and 5B18). In some embodiments, the first application user interface and the second application user interface are two separate user interfaces of the same application, or different user interfaces of different applications, or a system user interface and an application user interface, etc. While the first user interface and the second user interface are displayed simultaneously on the display, both interfaces are responsive to, and able to receive, touch inputs from the user. The user interfaces allow objects to be dragged and dropped between the two user interfaces.
While displaying the first application user interface in the first portion of the display and displaying the second application user interface on the second portion of the display, the device detects (714) a first input by a first contact (e.g., an input that starts within a first edge region of the display (e.g., within a predefined distance of the bottom edge of the display, as defined by the current display orientation on the display)) that includes movement in a first direction (e.g., upward or sideways) (e.g., movement of the first contact across the display). For example, Figs. 5B1, 5B10, and 5B18 respectively show upward movement 4404 by contact 4402, upward movement 4420 by contact 4418, and upward movement 4427 by contact 4425.
In response to (716) detecting the first input: in accordance with a determination that the first input meets first criteria, wherein the first criteria include a requirement that the first input include more than a first threshold amount of movement (e.g., more than a threshold distance and/or speed) in the first direction (e.g., movement of the first contact across the display) in order for the first criteria to be met, the device replaces (718) the display of the first user interface and the second user interface with a full-screen home screen. For example, movement 4427 by contact 4425, from position 4425-a in Fig. 5B18 to position 4425-c in Fig. 5B20, includes at least the threshold amount of movement away from the bottom edge of the display, so that after liftoff of contact 4425 in Fig. 5B20, the device replaces the display of the web-browsing user interface and the email user interface (displayed in the split-screen mode in Fig. 5B18) with display of a full-screen home screen in Fig. 5B21. In some embodiments, after the first contact is first detected, and before it is determined that the first input meets the first criteria, the display of the first user interface is replaced with a replacement user interface on the portion of the display on which the input was first detected (e.g., a transitional user interface that allows the user to navigate to multiple different user interfaces on that portion of the display, such as an application-switcher user interface or a previous/next application user interface; or a transitional user interface that allows the user to navigate to multiple different user interfaces on the whole display, such as a full-screen application-switcher user interface or the home screen, depending on an evaluation of the first input against different navigation criteria corresponding to the different user interfaces, e.g., a comparison of a set of one or more properties of the first input against corresponding sets of thresholds corresponding to the different user interfaces). For example, after the user interface selection process is activated in Fig. 5B18 by the upward movement of contact 4425 away from the bottom edge of the display, the device enters a transitional navigation state, i.e., the two user interfaces are replaced with card 4017 representing the interactive map user interface and the email user interface.
In response to (716) detecting the first input: in accordance with a determination that the first input meets second criteria, wherein the second criteria include a requirement that the first input include less than the first threshold amount of movement (e.g., less than a threshold distance and/or speed) in the first direction (e.g., movement of the first contact across the display) in order for the second criteria to be met, and a determination that the first input started within a first edge region of the display that corresponds to the first application user interface, the device replaces (720) the display of the first application user interface with a first replacement user interface while maintaining display of the second application user interface in the second portion of the display. For example, movement 4404 by contact 4402, from position 4402-a in Fig. 5B1 to position 4402-b in Fig. 5B2, meets the second movement criteria but does not meet the first movement criteria, because it includes less than the threshold amount of movement away from the bottom edge of the display, so that after liftoff of contact 4402 in Fig. 5B2, the device replaces (through a transition) the display of the interactive map user interface in the left portion of the display with display of an application-switcher user interface in Figs. 5B3 through 5B4.
In response to (716) detecting the first input: in accordance with a determination that the first input meets the second criteria, and a determination that the first input started within a second edge region of the display that corresponds to the second application user interface, the device replaces (742) the display of the second application user interface with a second replacement user interface while maintaining display of the first application user interface in the first portion of the display. For example, movement 4420 by contact 4418, from position 4420-a in Fig. 5B10 to position 4420-b in Fig. 5B11, meets the second movement criteria but does not meet the first movement criteria, because it includes less than the threshold amount of movement away from the bottom edge of the display, so that after liftoff of contact 4418 in Fig. 5B11, the device replaces (through a transition) the display of the email user interface in the right portion of the display with display of an application-switcher user interface in Fig. 5B12.
Displaying the home screen in a full-screen display mode when the first criteria (e.g., a first distance and/or speed threshold) are met, and displaying a replacement application user interface in the first portion of the display while maintaining display of an application user interface on the second portion of the display (e.g., or vice versa) when the second criteria (e.g., a second distance and/or speed threshold) are met, depending on where the invoking input starts, enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by reducing user mistakes when operating or interacting with the device; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device (e.g., by helping the user use the device more quickly and efficiently).
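A minimal Swift sketch of the dispatch in operations 716-742 follows: a long upward edge swipe goes to the full-screen home screen, while a shorter one replaces only the half of the split screen in whose edge region the swipe began. The threshold value and the half-screen test are illustrative assumptions.

```swift
// Sketch of split-screen edge-swipe dispatch (assumed threshold).
enum SplitHalf { case first, second }
enum SwipeResult { case fullScreenHome, replaceFirstHalf, replaceSecondHalf }

func resolveSplitScreenSwipe(upwardMovement: Double, startHalf: SplitHalf,
                             firstThreshold: Double = 120) -> SwipeResult {
    if upwardMovement > firstThreshold { return .fullScreenHome }        // 718, cf. Fig. 5B21
    return startHalf == .first ? .replaceFirstHalf : .replaceSecondHalf  // 720 / 742
}

print(resolveSplitScreenSwipe(upwardMovement: 200, startHalf: .first))   // fullScreenHome
print(resolveSplitScreenSwipe(upwardMovement: 60,  startHalf: .second))  // replaceSecondHalf (cf. Fig. 5B12)
```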
In some embodiments, the second criteria include (722) application-switcher-interface-navigation criteria, wherein the application-switcher-interface-navigation criteria require that the first input include movement of the first contact (e.g., movement of the first contact across the display) having a magnitude of a movement parameter (e.g., distance and/or speed) in a direction away from the respective edge region of the display in which the first input started, in order for the application-switcher-interface-navigation criteria to be met. In some embodiments, the application-switcher-interface-navigation criteria require that liftoff of the contact be detected while the currently assigned target state of the transitional user interface is the application-switcher user interface (e.g., as determined with reference to Fig. 8). For example, in some embodiments, the application-switcher-interface-navigation criteria include a requirement that the input meet a first x-velocity threshold and be substantially horizontal while not meeting a y-position threshold, e.g., criterion 80x4 in Fig. 8 is met when criteria 80x2 and 80x3 are not met before liftoff of the contact is detected. Similarly, in some embodiments, the application-switcher-interface-navigation criteria include a requirement that the input have a speed no greater than a minimum x-velocity and y-velocity, e.g., criterion 80x6 in Fig. 8 is met when none of criteria 80x2 through 80x5 is met immediately before liftoff of the contact is detected. Similarly, in some embodiments, the application-switcher-interface-navigation criteria include a requirement that the input have no downward velocity or not meet a third x-position threshold, e.g., criterion 80x8 in Fig. 8 is met when none of criteria 80x2 through 80x7 is met immediately before liftoff of the contact is detected. The replacement user interface (e.g., the first replacement user interface that replaces the display of the first application user interface when the first input starts within the first edge region of the display, or the second replacement user interface that replaces the display of the second application user interface when the first input starts within the second edge region of the display) is an application-switcher user interface that includes respective representations of applications for selectively activating one of a plurality of applications currently represented in the application-switcher user interface (e.g., recently active applications with retained user interface state (e.g., last active user interfaces)). In some embodiments, after the first contact is first detected, and before it is determined that the first input meets the second criteria, the display of the first user interface is replaced with a replacement user interface on the portion of the display on which the input was first detected (e.g., a transitional user interface that allows the user to navigate to multiple different user interfaces on that portion of the display, such as an application-switcher user interface or a previous/next application user interface; or a transitional user interface that allows the user to navigate to multiple different user interfaces on the whole display, such as a full-screen application-switcher user interface or the home screen). Displaying an application-switcher user interface in the first portion of the display in response to an upward swipe that starts from the edge region of the first portion of the display (e.g., when the device is in a split-screen display mode), while maintaining display of an application user interface in the second portion of the display (or vice versa), enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by reducing user mistakes when operating or interacting with the device; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device (e.g., by helping the user use the device more quickly and efficiently).
In some embodiments, while the application-switcher user interface is displayed in the first portion of the display or in the second portion of the display, the device detects (724) selection of a first representation among the respective representations of applications for selectively activating one of the plurality of applications currently represented in the application-switcher user interface (e.g., a thumbnail of the last active user interface of the respective application) (e.g., selection of representation 4406 by contact 4416 in Fig. 5B8). In response to detecting the selection of the first representation: in accordance with a determination that the application-switcher user interface was displayed in the first portion of the display when the selection of the first representation was detected, the device displays (726), in the first portion of the display, a user interface of the application associated with the first representation (e.g., the last active user interface of the respective application) (e.g., replacing the application-switcher user interface in the first portion of the display), while maintaining display of the second application user interface in the second portion of the display (e.g., after selection of representation 4406 by contact 4416 in Fig. 5B8, the device displays the web-browsing user interface in the left portion of the display while maintaining display of the email user interface in the right portion of the display). In response to detecting the selection of the first representation: in accordance with a determination that the application-switcher user interface was displayed in the second portion of the display when the selection of the first representation was detected, the device displays (726), in the second portion of the display, a user interface of the application associated with the first representation (e.g., replacing the application-switcher user interface in the second portion of the display), while maintaining display of the first application user interface in the first portion of the display (e.g., selection of representation 4414 in Fig. 5B12 would cause the device to display the associated interactive map user interface in the right portion of the display while maintaining display of the web-browsing user interface in the left portion of the display). Displaying an application user interface in the first portion of the display after selection of the respective representation in the application-switcher user interface displayed in the first portion of the display (e.g., on one side of a display operating in a split-screen mode), while maintaining, in the second portion of the display (e.g., on the opposite side of the display operating in the split-screen mode), display of the application user interface that was displayed concurrently with the application-switcher user interface, enhances the operability of the device and makes the user-device interface more efficient (e.g., by providing easy access to the navigation functions of the device; by reducing user mistakes when operating or interacting with the device; by helping the user achieve an intended result with fewer required inputs; and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and extends the battery life of the device (e.g., by helping the user use the device more quickly and efficiently).
In some embodiments, while the user interface of the application associated with the first representation is displayed in the first portion of the display and the second application user interface is displayed in the second portion of the display (for example, after selection of the first representation in the application-switcher user interface displayed in the first portion of the display), the device detects (732) a second input performed by a second contact in a second edge region of the display that corresponds to the second application user interface (for example, within a predetermined distance of the bottom edge of the display, as defined by the current display orientation) (for example, after navigating to the web browsing user interface on the left side of the display in Figs. 5B1-5B9, contact 4420 is detected on the bottom edge of the right portion of the display in Fig. 5B10). In response to detecting the second input, in accordance with a determination that the second input meets application-switcher-interface navigation criteria, the device replaces (734) display of the second application user interface with an application-switcher user interface in the second portion of the display (for example, the application-switcher user interface is displayed in the second portion of the display rather than in the first portion of the display), while maintaining display of the user interface of the application associated with the first representation in the first portion of the display (for example, in response to a swipe gesture in Figs. 5B10-5B11, including the upward movement 4420 performed by the contact 4418, the device displays the application-switcher user interface on the right side of the display in Fig. 5B12). The application-switcher user interface in the second portion of the display includes a representation of the first application associated with the first application user interface previously displayed in the first portion of the display (for example, the representations of applications in the application-switcher user interface represent user interfaces previously displayed in either the first portion or the second portion of the display (for example, the first portion and the second portion of the display share one common set of previously displayed application user interfaces)) (for example, representation 4414 in Fig. 5B12 is associated with the interactive map user interface previously displayed on the right side of the display in Fig. 5B1).
In some embodiments, each portion of the split-screen display mode has its own separate set of previously displayed application user interfaces, such that when an application user interface was navigated away from in one portion of the display, a representation of that user interface is available in the application-switcher user interface when the application-switcher user interface is opened in that same portion of the display, but not when it is opened in the other portion of the display. Displaying, in response to an upward swipe that starts from the edge region of the second portion of the display, an application-switcher user interface in the second portion of the display that includes a representation of an application user interface previously displayed in the first portion of the display (for example, when the device is in a split-screen display mode), enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
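The following sketch (hypothetical types; one of the variants described above, not the authoritative design) illustrates the "separate recents per pane" alternative: each half of the split screen keeps its own stack of previously shown application user interfaces.

```swift
// Hypothetical sketch: per-pane stacks of previously displayed app UIs.
struct RecentsStacks {
    enum SplitSide: Hashable { case left, right }

    private var stacks: [SplitSide: [String]] = [.left: [], .right: []]

    // Record that `app` was navigated away from on `side`.
    mutating func push(_ app: String, on side: SplitSide) {
        stacks[side, default: []].removeAll { $0 == app }
        stacks[side, default: []].append(app)
    }

    // Cards offered by the switcher opened on `side`, most recent first.
    func switcherCards(for side: SplitSide) -> [String] {
        return Array(stacks[side, default: []].reversed())
    }
}
```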
In some embodiments, the second criteria include (736) last-application-interface navigation criteria, wherein the last-application-interface navigation criteria require that the first input include movement of the first contact with a magnitude of a movement parameter (for example, distance and/or rate) in a direction substantially parallel to the respective edge region of the display (for example, the first edge region or the second edge region) in which the first input started (for example, an arc swipe, including the movements 4430, 4434, 4438, 4442, and 4446 performed by the contacts 4428, 4432, 4436, 4440, and 4444, respectively, in Figs. 5B22, 5B25, 5B28, 5B31, and 5B34). In some embodiments, the next/previous-application-interface navigation criteria require that liftoff of the contact be detected while the assigned current target state of the transitional user interface is the next/previous application user interface (for example, as determined with reference to Fig. 8). For example, in some embodiments, the next/previous-application-interface navigation criteria are met when the input meets a first x-velocity threshold, has a projected downward position or meets a first y-position threshold, and does not include a directional shift after a threshold amount of movement, for example, when criterion 80x4 in Fig. 8 is met because criteria 80x2 and 80x3 were not met immediately before liftoff of the contact was detected. Similarly, in some embodiments, the next/previous-application-interface navigation criteria are met when the input meets a second x-position threshold while having less than a minimum amount of y-translation, for example, when criterion 80x5 in Fig. 8 is met because none of criteria 80x2 through 80x4 were met immediately before liftoff of the contact was detected. Similarly, in some embodiments, the next/previous-application-interface navigation criteria are met when the input has a downward y-velocity or meets a third x-position threshold and is not the first swipe in a compound gesture, for example, when criterion 80x8 in Fig. 8 is met because none of criteria 80x2 through 80x7 were met immediately before liftoff of the contact was detected. Similarly, in some embodiments, the next/previous-application-interface navigation criteria are met when the input has a downward y-velocity or meets the third x-position threshold, is a first swipe, and meets an x-position threshold, for example, when criterion 80x8 in Fig. 8 is met because none of criteria 80x2 through 80x7 were met before liftoff of the contact was detected.
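To make the flavor of these checks concrete, here is a hedged Swift sketch; the names, thresholds, and decision order are invented for illustration and are not the criteria 80x2-80x11 of Fig. 8.

```swift
// Hypothetical sketch: an "arc swipe" qualifies for next/previous-app
// navigation only when motion parallel to the starting edge dominates.
struct SwipeSample {
    var dx: Double        // movement parallel to the starting edge (points)
    var dy: Double        // movement away from the starting edge (points)
    var vx: Double        // horizontal velocity (points/s)
    var vy: Double        // vertical velocity (points/s, negative = downward)
    var isFirstSwipeOfCompoundGesture: Bool
}

func meetsLastAppNavigationCriteria(_ s: SwipeSample) -> Bool {
    let xVelocityThreshold = 300.0     // assumed values, not from the patent
    let xDistanceThreshold = 120.0
    let minYTranslation    = 40.0

    // Fast sideways motion.
    if abs(s.vx) > xVelocityThreshold { return true }
    // Enough sideways distance with little vertical travel.
    if abs(s.dx) > xDistanceThreshold && abs(s.dy) < minYTranslation { return true }
    // Downward vertical velocity on a non-initial swipe of a compound gesture.
    if s.vy < 0 && !s.isFirstSwipeOfCompoundGesture && abs(s.dx) > xDistanceThreshold {
        return true
    }
    return false
}
```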
The replacement user interface (for example, the first replacement user interface that replaces display of the first application user interface when the first input starts in the first edge region of the display, or the second replacement user interface that replaces display of the second application user interface when the first input starts in the second edge region of the display) is a first previously displayed application user interface that is different from the respective application user interface being replaced (for example, the first user interface or the second user interface). Displaying a previously displayed user interface in the first portion of the display in response to a sideways swipe that starts from the edge region of the first portion of the display (for example, when the device is in a split-screen display mode), while maintaining display of an application user interface in the second portion of the display (or vice versa), enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
In some embodiments, after the first previously displayed application user interface has, as the replacement user interface, replaced display of the first application user interface, and within a first time threshold of liftoff of the first contact (for example, a time threshold for detecting consecutive sideways swipes), the device detects (738) a second input performed by a second contact that starts in the first edge region, including movement of the second contact with a magnitude of a movement parameter (for example, distance and/or rate), in a direction substantially parallel to the edge of the display, that meets the last-application-interface navigation criteria (for example, an arc swipe, including the movement 4442 or 4446 performed by the contact 4440 or 4444 in Fig. 5B31 or Fig. 5B34). In response to detecting the second input, in accordance with a determination that a second previously displayed application user interface can be navigated to, the device replaces (740) display of the first previously displayed application user interface with the second previously displayed application user interface (for example, the device displays the messages user interface in Fig. 5B33 because, when the device detected the arc swipe, including the movement 4442 performed by the contact 4440 in Figs. 5B31-5B32, a representation of the messages user interface was available in the card stack). In response to detecting the second input, in accordance with a determination that a second previously displayed application user interface cannot be navigated to (for example, the first previously displayed application user interface is the last application user interface in a stack of recently opened applications with retained user interface states), the device displays (740) the second user interface in a full-screen display mode (for example, terminating the split-screen display mode by expanding display of the second user interface from the second portion of the display to both the first portion and the second portion of the display) (for example, in Fig. 5B36, the device expands display of the interactive map user interface from the split screen to full screen because, when the device detected the arc swipe, including the movement 4446 performed by the contact 4444 in Figs. 5B34-5B35, no further user interface representations were available in the card stack).
Displaying, in response to a sideways swipe that starts from the edge region of the first portion of the display (when the device is in a split-screen display mode), either a second previously displayed user interface in the first portion of the display while maintaining display of an application user interface in the second portion of the display (or vice versa), or the application user interface that was displayed in the second portion of the display in a full-screen display mode, depending on whether a second previously displayed user interface is available, enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
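A short sketch of that branch, under the assumption of a simple per-pane list of previously shown user interfaces (hypothetical names, not the disclosed implementation):

```swift
// Hypothetical sketch: either show the next card in the pane's stack of
// previously displayed user interfaces, or, if none remains, end the
// split-screen mode by expanding the other pane to full screen.
enum NavigationOutcome {
    case showPrevious(app: String)      // split screen is kept
    case expandOtherPaneToFullScreen    // split screen ends
}

func outcomeForSidewaysSwipe(previouslyShown: inout [String]) -> NavigationOutcome {
    if let nextApp = previouslyShown.popLast() {
        return .showPrevious(app: nextApp)
    }
    return .expandOtherPaneToFullScreen
}
```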
In some embodiments, in response to detecting the first input, in accordance with a determination that the first input meets third criteria, wherein the third criteria require that the first input include less than a first threshold amount of movement in a first direction but more than a second threshold amount of movement (for example, more than a threshold distance and/or rate) in the first direction in order for the third criteria to be met, the device displays (744) a full-screen application-switcher user interface (for example, the split view displayed before the first input is shown as one selectable option in a set of selectable applications) (for example, replacing display of the first user interface and the second user interface with the full-screen application-switcher user interface). For example, the movement 4426 performed by the contact 4424 from position 4424-a in Fig. 5B13 to position 4425-d in Fig. 5B16 meets the third movement criteria but does not meet the first movement criteria, because it includes less than the first threshold amount of movement away from the bottom edge of the display but more than the second threshold amount of movement away from the bottom edge of the display (for example, associated with navigating to the split-screen application-switcher user interface as illustrated in Figs. 5B1-5B4 and Figs. 5B10-5B12), such that, after liftoff of the contact 4424 in Fig. 5B16, display of the full-screen application-switcher user interface in Fig. 5B17 replaces (for example, transitions from) display of the interactive map user interface in the left portion of the display and display of the email user interface in the right portion of the display. In some embodiments, the third criteria further include a requirement for a predetermined pause in the movement of the input (for example, before liftoff of the contact). In some embodiments, after the first contact is initially detected, and before it is determined that the first input meets the third criteria, the replacement user interface replaces display of the first user interface in the portion of the display on which the input was initially detected (for example, a transitional user interface that allows the user to navigate to multiple different user interfaces on that portion of the display, for example, an application-switcher user interface or a previous/next application user interface; or a transitional user interface that allows the user to navigate to multiple different user interfaces on the whole display, for example, a full-screen application-switcher user interface or a home screen). Displaying the home screen in a full-screen display mode when first criteria (for example, a first distance and/or rate threshold) are met; displaying a replacement application user interface in the first portion of the display while maintaining display of an application user interface in the second portion of the display (or vice versa), depending on where the invoking input started, when second criteria (for example, a second distance and/or rate threshold) are met; and displaying a full-screen application-switcher user interface when third criteria (for example, a third distance and/or rate threshold, for example, intermediate between the first threshold and the second threshold) are met, enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
In some embodiments, while a first application user interface is displayed in the first portion of the display (for example, the left portion of the display) concurrently with display of a second application user interface in the second portion of the display, and before the first input is detected, the device displays (704) a first affordance over a portion of the first application user interface, wherein a position of the first affordance indicates a reactive region (for example, a bottom edge region of the display in the first portion of the display) for starting a predefined gesture input on the first portion of the display (for example, an edge swipe gesture for entering a transitional user interface mode or for displaying an application-switcher user interface) (for example, the home affordance 4400-1 in the left portion of the display in Fig. 5B1), and the device displays (740) a second affordance over a portion of the second application user interface, wherein a position of the second affordance indicates a reactive region (for example, a bottom edge region of the display in the second portion of the display) for starting the predefined gesture input on the second portion of the display (for example, an edge swipe gesture for entering the transitional user interface mode or for displaying the application-switcher user interface) (for example, the home affordance 4400-2 in the right portion of the display in Fig. 5B1). Displaying, when operating in a split-screen display mode, a first affordance and a second affordance over portions of the first user interface and the second user interface, respectively, to indicate the reactive regions for starting a navigation gesture input on each portion of the split-screen display, enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
In some embodiments, a size of the first affordance is proportional to a size of the first portion of the display (for example, one third of the bottom width of the first portion of the display), a size of the second affordance is proportional to a size of the second portion of the display (for example, one third of the bottom width of the second portion of the display), and, while the first affordance is displayed over the portion of the first application user interface and the second affordance is displayed over the portion of the second application user interface, the device detects (706) a user input that meets split-screen resizing criteria (for example, a gesture that selects and drags a resizing handle on the screen divider between the first portion and the second portion of the display). In response to detecting the user input that meets the split-screen resizing criteria, the device resizes (708) the first portion of the display from a first size to a second size, including proportionally adjusting a display size of the first application user interface and a display size of the first affordance in accordance with the second size of the first portion of the display, and the device resizes (708) the second portion of the display from a third size to a fourth size, including proportionally adjusting a display size of the second application user interface and a display size of the second affordance in accordance with the fourth size of the second portion of the display. Adjusting, when the sizes of the portions of the display used in the split-screen display mode (for example, the first portion and the second portion) are adjusted, the display sizes of the affordances that indicate the reactive regions for starting a navigation gesture input (for example, the first affordance displayed in the first portion of the split-screen display and the second affordance displayed in the second portion of the split-screen display), enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
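A minimal sketch of the proportional sizing just described, assuming the one-third-of-pane-width example above (all names are hypothetical):

```swift
// Hypothetical sketch: each pane's home affordance is a fixed fraction of
// that pane's bottom width, so dragging the divider resizes both panes and
// both affordances together.
struct SplitLayout {
    var totalWidth: Double
    var dividerX: Double                      // x-position of the screen divider

    var firstPaneWidth: Double { dividerX }
    var secondPaneWidth: Double { totalWidth - dividerX }

    // Affordance widths, proportional to their pane widths (one third here).
    var firstAffordanceWidth: Double { firstPaneWidth / 3 }
    var secondAffordanceWidth: Double { secondPaneWidth / 3 }

    mutating func dragDivider(to newX: Double) {
        dividerX = min(max(newX, 0), totalWidth)
        // The computed affordance widths change automatically with the panes.
    }
}
```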
In some embodiments, while a third application user interface is displayed in a full-screen display mode (for example, over the whole display, rather than in a split-screen display mode), the device displays (768) a third affordance over a portion of the third application user interface (for example, a bottom edge region of the display), wherein a position of the third affordance indicates a reactive region for starting a predefined gesture input on the display (for example, an edge swipe gesture for entering a full-screen transitional user interface mode or for displaying a full-screen application-switcher user interface) (for example, the home affordance 4400-3 on the full-screen display of the interactive map user interface in Fig. 5B36). Displaying a single affordance over a portion of a user interface displayed in a full-screen display mode, to indicate the reactive region for starting a navigation gesture input on the full-screen display, enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
In some embodiments, the first criteria and the second criteria both require liftoff of the first input (for example, detecting liftoff of the first contact). In response to detecting movement of the first input on the display (for example, movement of the first contact in a first direction), and before liftoff of the first input is detected, in accordance with a determination that the first input started in the first edge region of the display that corresponds to the first application user interface, the device replaces (746) display of the first application user interface with a transitional user interface that includes a first application view corresponding to the first application user interface (for example, a reduced-scale image of the first application user interface) (for example, a transitional user interface that allows the user to navigate to multiple different user interfaces on that portion of the display, for example, an application-switcher user interface or a previous/next application user interface; or a transitional user interface that allows the user to navigate to multiple different user interfaces on the whole display, for example, a full-screen application-switcher user interface or a home screen, depending on an evaluation of the first input against different navigation criteria corresponding to the different user interfaces, for example, comparing a set of one or more properties of the first input against corresponding sets of thresholds corresponding to the different user interfaces), while maintaining display of the second application user interface in the second portion of the display, wherein a size of the first application view dynamically changes as the first input moves across the display. For example, after the contact 4402 moves upward from the bottom edge of the display in Fig. 5B1 to activate the user-interface selection process, the device enters a transitional navigation state in the left portion of the display, that is, the interactive map user interface is replaced in Fig. 5B2 with the application view 4014 representing the interactive map user interface, while display of the email user interface in the right portion of the display is maintained. In accordance with a determination that the first input started in the second edge region of the display that corresponds to the second application user interface, the device replaces (746) display of the second application user interface with a transitional user interface that includes a second application view corresponding to the second application user interface (for example, a reduced-scale image of the second application user interface), while maintaining display of the first application user interface in the first portion of the display, wherein a size of the second application view dynamically changes as the first input moves across the display. For example, after the contact 4418 moves upward from the bottom edge of the display in Fig. 5B10 to activate the user-interface selection process, the device enters a transitional navigation state in the right portion of the display, that is, the email user interface is replaced in Fig. 5B11 with the application view 4022 representing the email user interface, while display of the interactive map user interface in the left portion of the display is maintained.
Displaying, before navigation criteria that require liftoff of the contact are met, a transitional user interface in the first portion of a display operating in a split-screen display mode (for example, a transitional user interface that allows the user to navigate to different user interfaces (for example, one or more of: (i) navigating to a home screen, (ii) navigating to an application that was displayed on the screen immediately before the user interface that was displayed when the swipe gesture started, (iii) navigating to a control panel user interface, (iv) navigating to an application-switcher user interface, or (v) navigating back to the user interface that was displayed when the swipe gesture started)), while maintaining display of an application user interface in the second portion of the display (or vice versa), depending on where the invoking input started, enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
In some embodiments, while the transitional user interface is displayed, the device monitors (748) a position and a speed of the first contact and provides (748) corresponding visual feedback (for example, by moving, shrinking, or enlarging the application view that replaced the user interface when the input started) indicating how the device would navigate (for example, which user interface would be displayed and activated) if liftoff of the contact were detected at the current time. For example, after the contact 4424 moves upward from the bottom edge of the display (movement 4426 from position 4424-a in Fig. 5B13 to position 4424-b in Fig. 5B14) to activate the user-interface selection process, the device enters a transitional navigation state in the left portion of the display, that is, the interactive map user interface is replaced with the application view 4014 representing the interactive map user interface, and the application view 4406 representing the web browser user interface, as in Fig. 5B2, is displayed toward the upper left portion of the display, indicating that, based on the current properties of the gesture, the device would navigate to a split-screen application-switcher user interface upon liftoff of the contact. In response to the contact 4424 continuing to move upward (movement 4426 from position 4424-b in Fig. 5B14 to position 4424-c in Fig. 5B15), the device replaces display of the email user interface in the right portion of the display with the application view 4015 representing the email user interface, while maintaining display of the application views 4406 and 4014 in a full-screen transitional navigation user interface, indicating that, based on the current properties of the gesture, the device would navigate to a full-screen application-switcher user interface upon liftoff of the contact. Providing visual feedback indicating how the device will navigate upon liftoff (for example, which user interface will be displayed after the navigation-invoking gesture ends) enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing/mitigating user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
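A hedged sketch of this feedback idea (hypothetical enum and function names): the transitional user interface is redrawn from the currently predicted destination, so the cards on screen themselves tell the user what liftoff would do.

```swift
// Hypothetical sketch: which cards the transitional UI shows for a given
// predicted destination.
enum PredictedDestination {
    case currentApp, previousApp, appSwitcher, homeScreen
}

func transitionalCards(for destination: PredictedDestination,
                       draggedApp: String,
                       recentApps: [String]) -> [String] {
    switch destination {
    case .homeScreen:
        return [draggedApp]                       // single shrinking card -> Home
    case .appSwitcher:
        return [draggedApp] + recentApps          // several cards -> App Switcher
    case .previousApp:
        return [recentApps.first ?? draggedApp]   // neighbouring card slides in
    case .currentApp:
        return [draggedApp]                       // card grows back to full size
    }
}
```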
In some embodiments, while the transitional user interface is displayed in the first portion of the display or in the second portion of the display, displaying two or more application views in the transitional user interface indicates (750) that, upon liftoff of the first contact, the device will, in accordance with a determination that the first input started in the first edge region, display in the first portion of the display an application-switcher user interface that includes a plurality of representations of applications for selectively activating one of a plurality of applications currently represented in the application-switcher user interface, while maintaining display of the second application user interface in the second portion of the display, and will, in accordance with a determination that the first input started in the second edge region, display in the second portion of the display an application-switcher user interface that includes a plurality of representations of applications for selectively activating one of the plurality of applications currently represented in the application-switcher user interface, while maintaining display of the first application user interface in the first portion of the display (for example, display of the multiple application views 4406 and 4014 in the left portion of the display in Fig. 5B2 indicates that, based on the current properties of the gesture, the device will navigate to a split-screen application-switcher user interface in the left portion of the display upon liftoff of the contact 4402, as shown in Figs. 5B3-5B4). Displaying two or more application views in the transitional user interface (which is displayed in one portion of a display operating in a split-screen display mode), to indicate that the device will navigate to an application-switcher user interface in that portion of the display upon liftoff (for example, in some embodiments, when operating in the split-screen display mode, the two or more application views are displayed in the portion of the display in which the gesture was initiated, and the two or more application views indicate that the application-switcher user interface will be displayed in the portion of the display in which the two or more application views are displayed), enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing/mitigating user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
In some embodiments, while the transitional user interface is displayed in the first portion of the display or in the second portion of the display, the device detects (752) a first property of the first input (for example, a speed and/or position of the first contact) that will meet the first criteria upon liftoff of the first contact, and, in response to detecting the first property of the first contact, in accordance with a determination that the first input started in the first edge region, the device ceases (754) to display the second application user interface in the second portion of the display and expands (754) display of the transitional user interface from the first portion of the display to the whole display (for example, switching from a split-screen display mode in which the transitional user interface is displayed only in the first portion of the split screen to a full-screen display mode in which the transitional user interface is displayed on the whole display, for example, as shown in Fig. 5B19), and, in accordance with a determination that the first input started in the second edge region, the device ceases (754) to display the first application user interface in the first portion of the display and expands (754) display of the transitional user interface from the second portion of the display to the whole display (for example, switching from a split-screen display mode in which the transitional user interface is displayed only in the second portion of the split screen to a full-screen display mode in which the transitional user interface is displayed on the whole display). In some embodiments, when the first input started in the first edge region, the second application user interface is replaced by an application view of the second user interface, for example, an application view that merges with the application view of the first application user interface that previously replaced the first application user interface displayed in the first portion of the display before the transitional user interface was displayed. In some embodiments, when the first input started in the second edge region, the first application user interface is replaced by an application view of the first user interface, for example, an application view that merges with the application view of the second application user interface that previously replaced the second application user interface displayed in the second portion of the display before the transitional user interface was displayed. Expanding, in response to detecting a property of the contact that will meet the first criteria (for example, full-screen home screen display criteria) upon liftoff of the contact, display of the transitional user interface from one portion of a display operating in a split-screen display mode to the whole display operating in a full-screen display mode, to indicate that the device will navigate to a full-screen home screen upon liftoff of the contact, enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing/mitigating user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
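A minimal sketch of that expansion step, assuming hypothetical types and a pre-computed prediction of whether the home criteria will be met at liftoff:

```swift
// Hypothetical sketch: once liftoff would satisfy the home-screen criteria,
// the transitional UI grows from its pane to the whole screen and the other
// pane's live UI collapses into an app view.
struct TransitionalLayout {
    var coversWholeScreen = false
    var otherPaneShowsLiveApp = true
}

func updateForPredictedHome(willMeetHomeCriteriaAtLiftoff: Bool,
                            layout: inout TransitionalLayout) {
    guard willMeetHomeCriteriaAtLiftoff else { return }
    layout.otherPaneShowsLiveApp = false   // other pane's UI becomes an app view
    layout.coversWholeScreen = true        // transitional UI expands edge to edge
}
```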
In some embodiments, ceasing to display the first application user interface or the second application user interface includes (756), in accordance with a determination that the first input started in the first edge region, replacing display of the first application user interface with an application view of the first application user interface, wherein a display property of the application view of the first application user interface dynamically changes in accordance with movement of the first input, and, in accordance with a determination that the first input started in the second edge region, replacing display of the second application user interface with an application view of the second application user interface, wherein a display property of the application view of the second application user interface dynamically changes in accordance with movement of the first input. For example, after the contact 4424 moves upward from the bottom edge of the display (movement 4426 from position 4424-a in Fig. 5B13 to position 4424-b in Fig. 5B14) to activate the user-interface selection process, the device enters a transitional navigation state in the left portion of the display, that is, the interactive map user interface is replaced with the application view 4014, at a first size, representing the interactive map user interface in Fig. 5B14. The contact 4424 continuing to move upward (movement 4426 from position 4424-b in Fig. 5B14 to position 4424-c in Fig. 5B15) causes the application view 4014 to shrink from the first size in Fig. 5B14 to a second, smaller size in Fig. 5B15. Replacing display of an application user interface with an application view of the application user interface in response to detecting a property of the contact that will meet the first criteria (for example, full-screen home screen display criteria) upon liftoff of the contact, to indicate that the device will navigate to a full-screen home screen upon liftoff of the contact, enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing/mitigating user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
In some embodiments, while the full-screen transitional user interface is displayed (for example, the transitional user interface expanded from the first portion or the second portion of the display to the whole display), displaying two or more application views in the transitional user interface indicates (758) that, upon liftoff of the first contact, the device will display an application-switcher user interface that includes a plurality of representations of applications, including a plurality of representations of applications for selectively activating one of a plurality of applications represented in the full-screen application-switcher user interface. For example, display of the application views 4406 and 4017 in the transitional navigation user interface shown in Fig. 5B16 indicates that, based on the current properties of the gesture, the device will navigate to a full-screen application-switcher user interface upon liftoff of the contact 4424, as shown in Fig. 5B17. Displaying two or more application views in the transitional user interface displayed in a full-screen display mode, to indicate that the device will navigate to a full-screen application-switcher user interface upon liftoff of the contact, enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing/mitigating user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
In some embodiments, while the full-screen transitional user interface is displayed, displaying only one application view in the transitional user interface indicates (760) that, upon liftoff of the first contact, the device will display a full-screen home screen. For example, display of the single application view 4017 in the transitional navigation user interface shown in Fig. 5B20 indicates that, based on the current properties of the gesture, the device will navigate to the home screen upon liftoff of the contact 4425, as shown in Fig. 5B21. Displaying only one application view in the transitional user interface displayed in a full-screen display mode, to indicate that the device will navigate to a full-screen home screen upon liftoff of the contact (for example, as opposed to displaying two or more application views, which indicates that the device will navigate to a full-screen application-switcher user interface upon liftoff of the contact), enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing/mitigating user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
In some embodiments, while application views of the first application user interface and the second application user interface are displayed in the full-screen transitional user interface (for example, individual application views of the first application user interface and the second application user interface, or a single application view representing both the first application user interface and the second application user interface), the device detects (762) a gesture that includes movement of the first contact in a second direction toward the first edge region or the second edge region of the display (for example, more than a threshold amount of movement in the second direction). In response to detecting the gesture that includes the movement of the first contact in the second direction, in accordance with a determination that the first input started in the first edge region, the device restores (764) display of the second application user interface in the second portion of the display, and, in accordance with a determination that the first input started in the second edge region, the device restores (764) display of the first application user interface in the first portion of the display. For example, if the contact 4424 moves downward from position 4424-d toward the bottom edge of the display in Fig. 5B15, the device will restore display of the email user interface in the right portion of the display, as previously shown in Fig. 5B14. Restoring, in response to detecting downward movement of the contact while the full-screen transitional user interface is displayed, display of the application user interface that was previously displayed in one portion of a display operating in a split-screen display mode (for example, restoring the split-screen display mode), enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing/mitigating user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
In some embodiments, while the full-screen application-switcher user interface is displayed (for example, in a full-screen display mode), the plurality of representations of applications for selectively activating one of the plurality of applications represented in the application-switcher user interface includes (766) a first representation (for example, a representation associated with the first application previously displayed in the first portion of the display and with the second application previously displayed in the second portion of the display) that is associated with at least two applications that are activated simultaneously when the first representation is selected (for example, a representation of a split-screen mode of the display) (for example, selecting the representation 4015 shown in the full-screen application-switcher user interface in Fig. 5B17 causes the device to navigate to a split-screen display mode in which the interactive map user interface is displayed in the left portion of the display and the email user interface is displayed in the right portion of the display, as previously shown in Fig. 5B13). When the application-switcher user interface is displayed in the first portion of the display or in the second portion of the display (for example, in a split-screen display mode), the plurality of representations of applications for selectively activating one of the plurality of applications represented in the application-switcher user interface does not include a representation associated with at least two applications that are activated simultaneously when selected. Displaying a representation associated with at least two applications when the full-screen application-switcher user interface is displayed, and displaying only representations associated with single applications when the application-switcher user interface is displayed in one portion of a display operating in a split-screen display mode, enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by reducing/mitigating user mistakes when operating/interacting with the device, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently).
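For illustration, a hedged sketch of one way such representations could be modeled (hypothetical types; not the disclosed data model): a card either stands for a single application or for a saved split-screen pair, and pair cards are offered only by the full-screen switcher.

```swift
// Hypothetical sketch: switcher card model with an optional split-pair card.
enum SwitcherCard {
    case app(String)
    case splitPair(left: String, right: String)   // only in the full-screen switcher
}

func cardsForSwitcher(fullScreen: Bool, recents: [SwitcherCard]) -> [SwitcherCard] {
    if fullScreen { return recents }
    // In a split-screen pane, keep single-app cards only.
    return recents.filter { card in
        if case .app = card { return true }
        return false
    }
}
```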
In some embodiments, while the first application user interface is displayed in the first portion of the display concurrently with display of the second application user interface in the second portion of the display, and before the first input performed by the first contact is detected, the device detects (710) a first touch input (for example, a long press) on a first edge of the display that meets taskbar-display criteria (for example, long-press criteria). In response to detecting the first touch input on the first edge of the display, and while the first touch input continues to be detected on the first edge of the display, in accordance with a determination that the first touch input is detected on a first portion of the first edge of the display, the device displays (712) a taskbar with a plurality of application icons at a first position along the first edge of the display, and, in accordance with a determination that the first touch input is detected on a second portion of the first edge of the display, the device displays (712) the taskbar at a second position along the first edge of the display (the second position being selected to include the second portion of the first edge of the display) (for example, the taskbar is centered on the position of the first touch), wherein the second position is different from the first position. For example, in response to continuously detecting the contact 4202 at a position on the left side of the bottom edge of the display for a period of time that meets long-press input criteria (for example, meeting a time threshold TT1), the device displays the taskbar 4204 along the left side of the bottom edge of the display, below the contact 4202, in Fig. 5A2. In contrast, in response to continuously detecting the contact 4206 at a position on the right side of the bottom edge of the display for a period of time that meets the long-press input criteria (for example, meeting the time threshold TT1), the device displays the taskbar 4204 along the right side of the bottom edge of the display, below the contact 4206, in Fig. 5A5, at a position different from the position at which the taskbar 4204 is displayed in Fig. 5A2. In some embodiments, the first position is selected to include the first portion of the first edge of the display (for example, the taskbar is centered on the position of the first touch). In some embodiments, the first position is a predetermined position (for example, when the first touch is detected in a middle portion of the first edge, the taskbar is displayed at a default position centered on the display, regardless of whether the contact is at the center of the display). Displaying the taskbar at a first position when first criteria (for example, first position criteria) are met, and displaying the taskbar at a second position when second criteria (for example, second position criteria) are met, enhances the operability of the device and makes the user-device interface more efficient (for example, by providing easy access to the navigation functions of the device, by allowing the user to perform navigation functions regardless of where the user's hand is positioned relative to the display, by helping the user achieve an intended outcome with fewer required inputs, and by providing additional control options without cluttering the user interface with additional displayed controls), which further reduces power usage and improves the battery life of the device by enabling the user to use the device more quickly and efficiently.
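A minimal sketch of the positioning rule described above (assumed numeric values and names; the center-region width is invented for illustration): the taskbar is centered under the long press, clamped to the display, with touches near the middle falling back to the default centered position.

```swift
// Hypothetical sketch: where to center the taskbar after a bottom-edge long press.
func taskbarCenterX(touchX: Double,
                    displayWidth: Double,
                    taskbarWidth: Double,
                    middleRegionWidth: Double = 200) -> Double {
    let center = displayWidth / 2
    // Long press in the middle region: use the default, centered position.
    if abs(touchX - center) < middleRegionWidth / 2 {
        return center
    }
    // Otherwise center the taskbar on the touch, clamped to stay on screen.
    let halfBar = taskbarWidth / 2
    return min(max(touchX, halfBar), displayWidth - halfBar)
}
```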
It should be understood that the particular order in which the operations in Figs. 7A-7I have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (for example, methods 600, 800, 900, 1000, and 1100) are also applicable in an analogous manner to method 700 described above with respect to Figs. 7A-7I. For example, the contacts, gestures, user interface objects, tactile outputs, intensity thresholds, focus selectors, and animations described above with reference to method 700 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, tactile outputs, intensity thresholds, focus selectors, and animations described herein with reference to other methods described herein (for example, methods 600, 800, 900, 1000, and 1100). For brevity, these details are not repeated here.
The operations described above with reference to Figs. 7A-7I are, optionally, implemented by components depicted in Figs. 1A-1B. For example, display operations 702, 704, 712, 718, 720, 726, 734, 740, 742, 744, 746, 764, and 768, detection operations 706, 710, 714, 724, 732, 738, 752, and 762, resizing operation 708, monitoring operation 748, and display expansion operation 754 are, optionally, implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive display 112, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information against respective event definitions 186 and determines whether a first contact at a first location on the touch-sensitive surface (or whether rotation of the device) corresponds to a predefined event or sub-event, such as selection of an object on a user interface, or rotation of the device from one orientation to another. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally uses or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in Figs. 1A-1B.
Fig. 8 is a flow chart illustrating a method 800 of navigating between user interfaces, in accordance with some embodiments. Method 800 is performed at an electronic device (for example, device 300, Fig. 3, or portable multifunction device 100, Fig. 1A) with a display and a touch-sensitive surface. In some embodiments, the electronic device includes one or more sensors for detecting intensities of contacts with the touch-sensitive surface. In some embodiments, the touch-sensitive surface and the display are integrated into a touch-sensitive display. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 800 are, optionally, combined, and/or the order of some operations is, optionally, changed.
Method 800 relates to navigating between user interfaces in response to a swipe gesture that meets different movement conditions. Allowing the user to navigate to (i) a home screen, (ii) an application displayed on the screen before (for example, immediately before) the user interface that was displayed when the swipe gesture started (for example, a "next or previous application"), (iii) an application-switcher user interface (sometimes called a "multitasking" user interface), or (iv) back to the user interface that was displayed when the swipe gesture started (the "current application"), depending on whether certain preset movement conditions (for example, speed and position threshold criteria) are met, enhances the operability of the device and makes the user-device interface more efficient (for example, by reducing the number of steps needed to achieve an intended outcome when operating the device), which in turn reduces power usage and improves the battery life of the device (for example, by helping the user to use the device more quickly and efficiently). In some embodiments, a taskbar is displayed over the currently displayed user interface in response to an initial portion of the input that meets corresponding taskbar-display movement conditions.
Method 800 is performed at a device with a display and a touch-sensitive surface (in some embodiments, the display is a touch-sensitive display) that displays (for example, on a touch-screen display) a user interface (for example, an application user interface or a home screen user interface). The device detects (802) a contact at the bottom edge of the touch-screen display (for example, contacts 4222, 4402, 4418, 4424, 4425, 4428, 4432, 4436, 4440, and 4444 in Figs. 5A28, 5B1, 5B10, 5B13, 5B18, 5B22, 5B25, 5B28, 5B31, and 5B34, respectively) and enters a transitional user interface that allows the user to navigate to different user interfaces (for example, back to the current application, to a different (for example, next/previous) application user interface, to the home screen user interface, or to an application-switcher user interface). In some embodiments, the device replaces the application user interface with a corresponding application view in the transitional user interface (for example, application views 4014, 4022, 4017, 4406, and 4408 in Figs. 5A29, 5B2, 5B11, 5B14, 5B19, 5B23, 5B26, 5B29, 5B32, and 5B35).
The device monitors (804) the position and speed of the contact and provides visual feedback (e.g., by moving, shrinking, or enlarging the application view that replaced the user interface when the input began) indicating to the user how the device will navigate (e.g., which user interface will be displayed and activated) when the contact lifts off. In some embodiments, the position and speed of the contact control the display of the application view that provides the feedback to the user. For example, as shown in Figure 5B20, device 100 monitors the position and speed of application view 4017. Because the instantaneous velocity of application view 4017 meets home-display criteria, the device displays application view 4017 without displaying application views of any other recently open applications, which indicates that the device will navigate to the home screen user interface when the contact lifts off. In contrast, as shown in Figure 5B16, because application view 4017 has paused at a position that meets application-switcher-display criteria rather than home-display criteria, the device additionally displays a portion of application view 4406, corresponding to a recently open application, which indicates that the device will navigate to the application-switcher user interface when the contact lifts off. In some embodiments, the control panel user interface is not accessible from the transitional user interface, and therefore, when the device provides visual feedback indicating that the target state of the device is the application-switcher user interface, a representation of the control panel user interface is not displayed.
The device then assigns (80x1) a current target state (e.g., the user interface that would be navigated to if the input were lifted off at that moment) based on the current properties of the input (e.g., predicting the user interface the user intends to navigate to upon lift-off). As shown in Figure 8, the device selects the target state by making one or more (e.g., a series of) decisions (80x2 to 80x11) based on comparing values of current characteristics of the input against one or more thresholds (e.g., by comparing input characteristics against various velocity and position thresholds). In some embodiments, additional target states are created to correspond to additional navigation states available in a split-screen display mode. For example, in some embodiments, a split-screen application-switcher user interface corresponds to a different target state, with a different set of criteria, than the full-screen application-switcher user interface. The respective criteria for transitioning to the full-screen application-switcher user interface and to the home screen differ depending on whether, in accordance with some embodiments, the input started from a user interface displayed in split-screen mode or in full-screen mode. Similarly, the full-screen application-switcher user interface is optionally displayed in two configurations (e.g., all applications shown as individually selectable cards, or at least two applications combined in a split-screen card), depending on which of different sets of criteria are met by the navigation gesture in accordance with some embodiments.
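By way of illustration only, the following sketch (in Swift, with hypothetical names and placeholder threshold values that do not come from this disclosure) shows the general shape of such an ordered series of decisions: each test compares current input features against a velocity or position threshold, and the first test that is satisfied determines the assigned target state.

```swift
// Hypothetical sketch of an ordered decision series; names and values are illustrative only.
enum TargetState {
    case home, appSwitcher, nextApp, previousApp, currentApp
}

struct InputFeatures {
    var yTranslation: Double       // upward travel of the contact, in points
    var verticalVelocity: Double   // points per second, positive = upward
    var horizontalVelocity: Double // points per second, positive = rightward
}

struct Thresholds {
    var quickSwipeVelocity = 800.0 // placeholder values, not taken from the patent
    var homeTravel = 250.0
    var sideSwipeVelocity = 500.0
    var pauseVelocity = 30.0
}

func assignTargetState(_ f: InputFeatures, _ t: Thresholds) -> TargetState {
    // Decisions are evaluated in order; the first satisfied criterion wins.
    if f.verticalVelocity > t.quickSwipeVelocity { return .home }   // quick swipe up
    if f.yTranslation > t.homeTravel { return .home }               // dragged far enough up
    if abs(f.horizontalVelocity) > t.sideSwipeVelocity {            // sideways swipe
        // Rightward travel maps to the previously displayed application in this sketch.
        return f.horizontalVelocity > 0 ? .previousApp : .nextApp
    }
    if abs(f.verticalVelocity) < t.pauseVelocity,
       abs(f.horizontalVelocity) < t.pauseVelocity {
        return .appSwitcher                                         // pause
    }
    return .currentApp                                              // fall through / cancel
}
```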
Examples of the criteria for each of these decisions are described in greater detail in U.S. Application Serial No. 15/879,111, filed January 24, 2018, the content of which is expressly incorporated herein by reference. One or more of the decisions in assignment operation 80x1 are optionally excluded or rearranged. In some embodiments, additional decisions are optionally added to the set of decisions in assignment operation 80x1. Additionally, decisions that result in the display of other user interfaces (e.g., a control panel user interface or a notification user interface) are optionally added to the set of decisions in assignment operation 80x1.
The device then determines (836) whether lift-off of the contact has been detected. If lift-off is detected, the device navigates to (838) the currently assigned target state (e.g., the target state assigned by assignment operation 80x1), that is, displays the user interface of the currently assigned target state. For example, because contact 4424 paused at position 4424-d in Figure 5B16 before lift-off was detected, the device assigns the application switcher as the target state in accordance with decision 80x6 (e.g., "pause for application switcher"), so that the device navigates to the application-switcher user interface in Figure 5B17, because that is the currently assigned target state when lift-off is detected in Figure 5B16.
It should be understood that the particular order in which the operations in Figure 8 have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (e.g., methods 600, 700, 900, 1000, and 1100) are also applicable in an analogous manner to method 800 described above with respect to Figure 8. For example, the contacts, gestures, user interface objects, tactile outputs, intensity thresholds, focus selectors, and animations described above with reference to method 800 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, tactile outputs, intensity thresholds, focus selectors, and animations described herein with reference to other methods described herein (e.g., methods 600, 700, 900, 1000, and 1100). For brevity, these details are not repeated here.
Figures 10A-10D are flow charts illustrating a method 1000 of navigating between user interfaces, in accordance with some embodiments. Method 1000 is performed at an electronic device (e.g., device 300, Figure 3; or portable multifunction device 100, Figure 1A) with a display and a touch-sensitive surface. In some embodiments, the electronic device includes one or more sensors for detecting intensity of contacts with the touch-sensitive surface. In some embodiments, the touch-sensitive surface and the display are integrated into a touch-sensitive display. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on, or integrated with, the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 1000 are, optionally, combined, and/or the order of some operations is, optionally, changed.
Method 1000 relates to navigating between user interfaces in response to a multi-contact gesture (e.g., a gesture that includes three, four, five, or more contacts), where the gesture accounts for both translation of the contacts as a group and movement of the contacts relative to one another (e.g., "pinch" and "spread" movements), each of which can satisfy different movement conditions. Allowing the user to navigate to (i) the home screen, (ii) the application displayed on the screen immediately before the user interface that was displayed when the gesture began (e.g., a "next or previous application"), (iii) an application-switcher user interface (sometimes called a "multitasking" user interface), or (iv) back to the user interface that was displayed when the gesture began (the "current application"), depending on whether certain movement conditions (e.g., translation and/or pinch velocity and position/simulated-position threshold criteria) are met, enhances the operability of the device and makes the user-device interface more efficient (e.g., by reducing the number of steps needed to achieve an intended result when operating the device), which, in turn, reduces power usage and improves the battery life of the device (e.g., by helping the user to use the device more quickly and efficiently). Method 1000 also relates to improving the accuracy of navigating between user interfaces by dynamically adjusting thresholds based on the predicted final user-interface state, and by reducing the effects of unintended inputs and of artifacts associated with the lack of motion sensing outside the display area.
Method 1000 is performed at a device with a display and a touch-sensitive surface (in some embodiments, the display is a touch-sensitive display) while a user interface (e.g., an application user interface or a home screen user interface) is displayed (e.g., on the touch-screen display). The device detects (1002) multiple contacts on the touch-screen display (e.g., the groups of contacts shown in Figures 5C10, 5C13, 5C17, 5C21, 5C27, 5C30, 5C33, 5C37, and 5C43) and enters a transitional user interface that allows the user to navigate to different user interfaces (e.g., back to the current application user interface, to a different (e.g., next/previous) application user interface, to the home screen user interface, or to an application-switcher user interface). In some embodiments, the device replaces the application user interface with a corresponding application view in the transitional user interface (e.g., replacing an interactive map user interface with application view 4526, and replacing an email user interface with application view 4528, as shown in Figures 5C11, 5C14, 5C18, 5C22, 5C28, 5C31, 5C34, 5C38, and 5C44).
The device monitors (1004) the position and speed of the contacts and provides visual feedback (e.g., by moving, shrinking, or enlarging the application view that replaced the user interface when the input began) indicating to the user how the device will navigate (e.g., which user interface will be displayed and activated) when the contacts lift off. In some embodiments, the device tracks the position and speed of the displayed application view (which is manipulated by the movement of the contacts) and determines the target state (e.g., the application user interface that would be navigated to if the gesture ended at that moment) based on characteristics (e.g., size, position, and/or velocity) of the application view that is providing feedback to the user. For example, as shown in Figures 5C13-5C15, device 100 monitors the position and speed of email application view 4528, which is controlled by the movement of contacts 4532, 4536, 4540, and 4544. In Figure 5C14, the instantaneous properties of email application view 4528 meet application-switcher navigation criteria, and the device displays email application view 4528 and interactive map application view 4526 in the same plane, with taskbar 4006 displayed in the background, indicating that the device will navigate to the application-switcher user interface when the contacts lift off. In contrast, as shown in Figure 5C15, when the instantaneous properties of email application view 4528 meet home-screen navigation criteria, interactive map application view 4526 ceases to be displayed, and email application view 4528 is displayed over the home screen user interface, which begins to come into focus in the background.
The device then assigns (100x1) a current target state (e.g., the user interface that would be navigated to if the input were lifted off at that moment) based on the current properties of the input (e.g., predicting the user interface the user intends to navigate to upon lift-off). As shown in Figure 10A, the device selects the target state by making one or more (e.g., a series of) decisions (100x2 to 100x11) based on comparing current characteristics of the input (e.g., changes in properties of the contacts in the multi-contact gesture) against values of one or more thresholds. For example, the input is compared against various metrics, including a first metric determined based on the magnitude of the y translation and/or the contraction of the contacts (e.g., a y-magnitude metric), a second metric determined based on the magnitude of the x translation of the contacts (e.g., an x-magnitude metric), and/or a third metric determined based on the rate of change of the translation of the contacts and/or the rate of contraction of the contacts (e.g., a rate-of-change metric); the third metric is, optionally, the rate at which the first metric and/or the second metric changes over time.
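The following illustrative sketch (hypothetical names and an assumed weighting factor, not taken from this disclosure) shows one way the three metrics described above could be derived from successive samples of the contacts' centroid and pinch scale.

```swift
// Assumption-based sketch of the three metrics; the 400.0 weighting is a placeholder.
struct GestureSample {
    var centroidX: Double
    var centroidY: Double
    var pinchScale: Double     // 1.0 = initial finger spread, < 1.0 = pinched together
    var timestamp: Double      // seconds
}

struct Metrics {
    var yMagnitude: Double     // "first metric"
    var xMagnitude: Double     // "second metric"
    var rateOfChange: Double   // "third metric"
}

func updateMetrics(start: GestureSample,
                   previous: GestureSample,
                   current: GestureSample,
                   previousMetrics: Metrics?) -> Metrics {
    // First metric: grows with upward travel of the centroid and with pinch contraction.
    let upwardTravel = start.centroidY - current.centroidY       // screen y increases downward
    let contraction = (1.0 - current.pinchScale) * 400.0         // illustrative weighting
    let yMagnitude = upwardTravel + contraction

    // Second metric: lateral travel of the centroid only; the pinch does not contribute.
    let xMagnitude = current.centroidX - start.centroidX

    // Third metric: rate at which the first metric changes over time.
    let dt = max(current.timestamp - previous.timestamp, 0.001)
    let rateOfChange = (yMagnitude - (previousMetrics?.yMagnitude ?? 0.0)) / dt

    return Metrics(yMagnitude: yMagnitude, xMagnitude: xMagnitude, rateOfChange: rateOfChange)
}
```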
Each of these decisions is illustrated in greater detail in corresponding Figures 10B-10D and is described in more detail below. One or more of the decisions in assignment operation 100x1 are optionally excluded or rearranged. In some embodiments, additional decisions are optionally added to the set of decisions in assignment operation 100x1. Additionally, decisions that result in the display of other user interfaces (e.g., a control panel user interface or a notification user interface) are optionally added to the set of decisions in assignment operation 100x1.
In some embodiments, the current target state (e.g., the user interface that will be navigated to when the navigation gesture ends) is determined based on the first metric (e.g., a vertical-magnitude metric), the second metric (e.g., a horizontal-magnitude metric), and/or the third metric (e.g., a rate-of-change metric) of the application view that replaces the user interface when the user-interface selection process is invoked, for example as that view is manipulated by multi-contact translation and pinch movements. In some embodiments, the first metric, second metric, and/or third metric of the application view differ from the actual display properties of the application view; for example, the simulated y translation of the application view corresponding to the first metric may include a centroid located at a first y position (e.g., on a virtual display), while the application view displayed on the device has a centroid located at a second y position on the actual display, the second position being different from the first position on the virtual display.
In some embodiments, the first metric, second metric, and/or third metric are based on a combination of observable properties of the input derived from the contacts. For example, in some embodiments, the first metric (e.g., the y-magnitude metric) of the application view increases with increases in a first observable property (e.g., the y position of the contacts of the navigation gesture on the display) and also increases with increases in a second observable property (e.g., the pinch movement of the contacts of the navigation gesture). For example, the first metric of email application view 4528 in Figures 5C13-5C15 increases as contacts 4532, 4536, 4540, and 4544 move upward, while the y position of the displayed email application view 4528 also increases on the display. Likewise, the first metric of interactive map application view 4526 in Figures 5C37-5C39 also increases as contacts 4670, 4674, 4678, 4682, and 4686 contract (e.g., pinch), while the y position of the displayed interactive map application view 4526 does not increase on the display (e.g., interactive map application view 4526 appears to shrink into the virtual palm of the gesture, rather than travel upward on the display).
In some embodiments, the first metric (e.g., the y-magnitude metric) of the application view is based on a combination of the y translational movement of the contacts in the multi-contact navigation gesture (e.g., an upward swipe movement of the contacts) and the contraction movement of the contacts (e.g., a pinch movement of the contacts toward one another). For example, in Figures 5C44-5C46, the first metric of interactive map application view 4526 increases with the upward movement of contacts 4690, 4694, 4698, and 4702 (Figures 5C44-5C45) and with the contraction movement of contacts 4690, 4694, 4698, 4702, and 4706 (Figures 5C45-5C46), even though interactive map application view 4526 actually moves downward in Figure 5C46. The increase of the first metric is indicated on the display by the shrinking of interactive map application view 4526 in Figures 5C45 and 5C46 and by other visual cues (e.g., the disappearance of email application view 4528 in Figure 5C46 and the appearance of the home screen user interface in the background in Figure 5C46).
In some embodiments, the first metric (e.g., the y-magnitude metric) of the application view is determined based on the sum of a characteristic y component of the movement of the contacts in the multi-contact navigation gesture (e.g., the y component of the movement of the centroid of the contacts) and a characteristic component of the contraction movement of the contacts in the multi-contact gesture (e.g., based on the change in simulated height of a virtual window that shrinks in accordance with the contraction movement of the contacts). In some embodiments, the first metric is determined based on the y component of the movement of the centroid of the contacts during the multi-contact gesture, plus one half of the change in height of the virtual window caused by the contraction movement (e.g., a multi-finger pinch) and/or the y component of the movement of the virtual window.
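A minimal sketch of that combination, assuming the height of the virtual window is tracked separately; the function name and parameters are hypothetical.

```swift
// Hypothetical helper: one per-sample increment of the first metric.
func firstMetricIncrement(centroidDeltaYUp: Double,       // upward centroid movement since last sample
                          previousWindowHeight: Double,
                          currentWindowHeight: Double) -> Double {
    // Upward centroid travel plus half of the height the virtual window lost to the pinch.
    let heightLoss = previousWindowHeight - currentWindowHeight
    return centroidDeltaYUp + 0.5 * heightLoss
}

// Accumulated across samples, e.g.:
// firstMetric += firstMetricIncrement(centroidDeltaYUp: dy,
//                                     previousWindowHeight: h0, currentWindowHeight: h1)
```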
In some embodiments, the position at which the application view is displayed is determined by calculating, within a virtual window, the component attributable to the contraction movement (e.g., the multi-contact pinch gesture), and the size of the virtual window is adjusted in accordance with properties of the multi-contact pinch gesture; for example, the window shrinks or expands in accordance with pinch or spread movement of the contacts. In some embodiments, the scaling of the virtual window is calculated based on the measured translation (e.g., the measured y translation) of the centroid of the contacts in the multi-contact gesture between successive measurements. In some embodiments, the scale of the virtual window is based on the percentage that the y translation of a characteristic position (e.g., the centroid) of the contacts represents of a characteristic measure of the display size (e.g., half of the screen height, plus or minus an offset), and is optionally limited by a minimum size (e.g., an asymptote in a nonlinear function of the size adjustment of the application view).
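As a rough illustration of such a nonlinear size adjustment with an asymptotic minimum (the exponential falloff, the omitted offset, and the numeric values are assumptions, not taken from this disclosure):

```swift
import Foundation // for exp()

// Illustrative scale function: full size at no translation, approaching a minimum asymptotically.
func virtualWindowScale(upwardTranslation: Double,
                        screenHeight: Double,
                        minimumScale: Double = 0.3) -> Double {
    let reference = screenHeight / 2.0               // characteristic measure: ~half screen height
    let progress = max(upwardTranslation, 0.0) / reference
    // Exponential falloff approaches the minimum scale instead of collapsing to zero.
    return minimumScale + (1.0 - minimumScale) * exp(-progress)
}
```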
In some embodiments, the scaling of the virtual window is also proportional to a characteristic measure of the amount of contraction (e.g., the scale of the virtual window is the product of the translation of the centroid of the contacts and the characteristic measure of the contraction). In some embodiments, the characteristic measure of the contraction is based on the percentage change, between successive measurements, of the perimeter between the contacts (e.g., the perimeter of a closed shape that surrounds the contacts, such as a circle or ellipse that surrounds or passes through some or all of the contacts, or a polygon or convex polygon that uses the contacts as vertices). Using the incremental change in perimeter between successive measurements enables the device to account for fingers being added to or removed from the gesture (e.g., if a contact is added to 4 existing contacts, as shown in Figures 5C44-5C45, the previous change in window size is based on the change in perimeter between the 4 contacts, and the next change in window size is based on the change in perimeter between the 5 contacts).
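An illustrative sketch of the perimeter-based measure, simplified to a closed polygon in touch order rather than the enclosing circle, ellipse, or convex polygon described above; accumulating per-sample increments is what allows fingers to be added or removed mid-gesture.

```swift
struct TouchPoint { var x: Double; var y: Double }

// Perimeter of the closed polygon formed by the contacts in touch order (simplified geometry).
func perimeter(of points: [TouchPoint]) -> Double {
    guard points.count > 1 else { return 0.0 }
    var total = 0.0
    for i in 0..<points.count {
        let a = points[i]
        let b = points[(i + 1) % points.count]
        let dx = a.x - b.x, dy = a.y - b.y
        total += (dx * dx + dy * dy).squareRoot()
    }
    return total
}

/// Fractional perimeter change between two consecutive samples of the *same* contact set
/// (e.g., -0.1 for a 10% contraction). Callers accumulate these increments, so adding or
/// removing a finger only affects subsequent increments, not the running value.
func perimeterChange(previous: [TouchPoint], current: [TouchPoint]) -> Double {
    let p0 = perimeter(of: previous)
    guard p0 > 0 else { return 0.0 }
    return (perimeter(of: current) - p0) / p0
}
```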
In some embodiments, during the contraction movement, the display of the application view is maintained at a characteristic position within the virtual display window (e.g., centered on the centroid of the contacts, e.g., in the virtual palm of the contacts), and the window size is adjusted in accordance with properties of the contraction movement. However, in some embodiments, there is an exception when the contraction movement is performed near an edge of the display (e.g., the bottom edge of the display): when the application view approaches the edge of the screen, the movement of the application view is slowed or stopped.
In some embodiments, the second metric (e.g., the x-magnitude metric) of the application view is determined based on a characteristic x component of the movement of the contacts in the multi-contact navigation gesture (e.g., the x component of the movement of the centroid of the contacts). In some embodiments, the second metric of the application view is independent of any characteristic measure of the contraction movement of the contacts (e.g., independent of any shrinking or expansion of the virtual window caused by pinch or spread movement of the contacts). Thus, in some embodiments, where the virtual window is resized around a characteristic position of the contacts of the multi-contact gesture (e.g., the centroid of the contacts), the display of the application view moves toward the characteristic position (e.g., the centroid) of the contacts, but the second metric is not affected by that characteristic position. For example, a contraction movement performed near the right edge of the display causes the application view to move toward the right edge of the display, but the device does not select the previous application user interface as the current target state, because the second metric of the application view is unaffected.
In some embodiments, the third metric (e.g., the rate-of-change metric) of the application view is determined based on the rate of change of the translation of the contacts and/or the rate of contraction of the contacts, and is optionally the rate at which the first metric and/or the second metric changes over time.
The device then determines (1036) whether lift-off of the contacts has been detected. If lift-off is detected, the device navigates to (1038) the currently assigned target state (e.g., the target state assigned by assignment operation 100x1), that is, displays the user interface of the currently assigned target state. For example, when next/previous-application navigation criteria are met (e.g., the swipe-for-next/previous-application criteria of 100x5), lift-off of contacts 4510, 4514, 4518, and 4522 results in navigation to the previous application user interface, as shown in Figures 5C10-5C12; when home-screen navigation criteria are met (e.g., the resize/translate-to-go-home criteria of 100x2), lift-off of contacts 4530, 4534, 4538, and 4542 results in navigation to the home screen user interface, as shown in Figures 5C13-5C16; and when application-switcher navigation criteria are met (e.g., the short, slow movement to application switcher criteria of 100x8), lift-off of contacts 4548, 4552, 4556, and 4560 results in navigation to the application-switcher user interface, as shown in Figures 5C17-5C19.
If lift-off is not detected, the device optionally updates (1040) one or more dynamic thresholds that influence the selection of the current target user interface, for example in accordance with the sub-method shown in Figure 10D. In some embodiments, the dynamic thresholds are adjusted to favor the currently predicted final target user interface, so that incidental changes in the properties of the input while the contacts are lifting off do not affect the final determination. For example, to prevent the device from navigating home when the user accidentally moves their fingers quickly while lifting off, the device increases a dynamic velocity threshold (e.g., velocity threshold range 910 in Figure 9A) when the contacts pause, in anticipation of a lift-off event that would cause the device to navigate to the application-switcher user interface.
If lift-off is not detected, the device continues to monitor (1004) the properties of the input and provide visual feedback, to update (e.g., reassign) (100x1) the current target state, and, optionally, to update (1040) the dynamic thresholds, until lift-off is detected (1036).
In some embodiments, when assigning (100x1) the current target state, the device first determines (100x2) whether the input appears to be a sufficiently large and substantially vertical (e.g., more vertical than horizontal) "quick resize/translate to go home" gesture (e.g., an input that causes the application view to have a sufficient magnitude of the third metric (e.g., the rate-of-change metric)), indicating that the user's intent (as determined by the device) is to navigate to the home screen user interface. In some embodiments, the device determines whether the third metric of the application view (e.g., as controlled by the movement of the contacts) meets (1006) a first translation/resize velocity threshold (e.g., vertical translation and resize velocity (Vy,r) threshold 902, which defines sector I in Figure 9A), or meets (1008) a second translation/resize velocity threshold (e.g., a lower vertical translation and resize velocity (Vy,r) threshold, such as velocity threshold 910 in the y-axis direction in Figure 9A (e.g., distinguishing sector II from sector V)) while being directed substantially upward (e.g., within slope thresholds 904 and 906 in Figure 9A, which distinguish sector II, where the velocity is more vertical, from sectors III and IV, where the velocity of the contacts is more horizontal). If the properties of the contacts meet either of these criteria, the device assigns (1012) the home screen user interface as the current target state. In some embodiments, a "flick up to go home" gesture (e.g., an input that is sufficiently fast in the vertical direction and substantially vertical (e.g., more vertical than horizontal)) and/or a "quick pinch to go home" gesture (e.g., an input with a sufficiently fast contraction movement), whether performed as a substantially single-contact swipe gesture or as a multi-contact pinch gesture, meets (100x2) the thresholds for assigning the home screen user interface as the current target state, for example because it causes the application view to have a sufficient third metric, or because separate thresholds are used for a quick upward swipe and for a quick pinch movement.
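Purely as a hedged illustration of the sector-style classification suggested by the Figure 9A description (the slope and speed values are placeholders, and sectors I and II are collapsed here into a single "home" outcome):

```swift
// Assumption-based sketch: map a combined (x, translate/resize) velocity into a coarse outcome.
enum GestureOutcome {
    case home         // roughly sectors I/II: mostly vertical, upward or contracting
    case sideSwipe    // roughly sectors III/IV/VI/VII: mostly horizontal
    case appSwitcher  // sector V: near the origin (pause)
    case cancel       // sector VIII: mostly vertical, downward or expanding
}

func classify(vx: Double,
              vyResize: Double,            // combined upward-translate/contract rate; negative = downward
              pauseSpeed: Double = 60.0,   // stands in for dynamic threshold 910 (illustrative)
              verticalSlope: Double = 1.0) -> GestureOutcome {
    let speed = (vx * vx + vyResize * vyResize).squareRoot()
    if speed < pauseSpeed { return .appSwitcher }
    let mostlyVertical = abs(vyResize) > verticalSlope * abs(vx)
    if !mostlyVertical { return .sideSwipe }
    return vyResize > 0 ? .home : .cancel
}
```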
In some embodiments, the device then checks one or more exceptions (e.g., via decisions 100x9, 100x10, and 100x11, described in more detail below) and, in some circumstances, reassigns the current target state. The device then determines (1036) whether lift-off has been detected and, if lift-off is detected, navigates to (e.g., displays) (1038) the home screen user interface, provided the current target state has not been reassigned in accordance with an exception. For example, assuming that the movement of contacts 4532, 4536, 4540, and 4544 in Figure 5C14 causes the translation of application view 4528 to be faster than velocity threshold 902, or to fall within sector II in Figure 9A (e.g., meeting the "flick up to go home" criteria (1006) or (1008)), the device assigns the home screen user interface as the current target state, so that when the contacts lift off in Figure 5C15, the device navigates to (e.g., displays) the home screen user interface, because the home screen user interface is the current target state at lift-off. Likewise, assuming that the characteristic measure of the contraction of contacts 4602, 4606, 4610, 4614, and 4618 in Figure 5C28 causes application view 4526 to shrink faster than velocity threshold 902, or to fall within sector II in Figure 9A (e.g., meeting the "quick pinch to go home" criteria (1006 or 1008)), the device assigns the home screen user interface as the current target state, so that when the contacts lift off in Figure 5C29, the device navigates to (e.g., displays) the home screen user interface, because the home screen user interface is the current target state at lift-off.
In some embodiments, if the device determines that the input does not meet the "quick resize/translate to go home" criteria (100x2), the device then determines (100x3) whether the input appears to be a "substantially resize/translate to go home" gesture (e.g., an input that causes the application view to have a sufficiently large magnitude of the first metric (e.g., the y-magnitude metric, which accounts for both the vertical translation component of the movement of the application view and the amount by which the view has shrunk)), indicating that the user's intent (as determined by the device) is to navigate to the home screen user interface. In some embodiments, the device determines (1010) whether the first metric of the application view (e.g., the combined y-magnitude metric that accounts for the y translation of the application view and the amount by which the application view has shrunk) meets a first vertical position and resize threshold (Ty,r) (e.g., the first simulated y-position threshold in Figure 9B (e.g., threshold 916)). If the properties of the input (e.g., as they control the movement of the application view) meet this criterion, the device assigns (1012) the home screen user interface as the current target state. In some embodiments, a "drag up to go home" gesture (e.g., an input that travels far enough in the vertical direction, regardless of speed) and/or a "pinch down to go home" gesture (e.g., an input that contracts far enough), whether performed as a substantially single-contact swipe gesture or as a multi-contact pinch gesture, meets (100x3) the thresholds for assigning the home screen user interface as the current target state, for example because it causes the application view to have a sufficient first metric, or because separate thresholds are used for an upward drag and for a pinch movement.
In some embodiments, the device then checks the exceptions (e.g., via decisions 100x9, 100x10, and 100x11, described in more detail below) and, in some circumstances, reassigns the current target state. The device then determines (1036) whether lift-off has been detected and, if lift-off is detected, navigates to (e.g., displays) (1038) the home screen user interface, provided the current target state has not been reassigned in accordance with an exception. For example, assuming that the movement of contacts 4532, 4536, 4540, and 4544 in Figure 5C14 causes application view 4528 to translate far enough from the bottom of the display (e.g., beyond vertical position and resize threshold 916, as shown in Figure 9B) (e.g., meeting the "substantially resize/translate to go home" criteria (1010)), the device assigns the home screen user interface as the current target state, so that when the contacts lift off in Figure 5C15, the device navigates to (e.g., displays) the home screen user interface, because the home screen user interface is the current target state at lift-off. Likewise, assuming that the characteristic measure of the contraction of contacts 4602, 4606, 4610, 4614, and 4618 in Figure 5C28 causes application view 4526 to shrink to a sufficiently small size (e.g., meeting the "substantially resize/translate to go home" criteria (1010)), the device assigns the home screen user interface as the current target state, so that when the contacts lift off in Figure 5C29, the device navigates to (e.g., displays) the home screen user interface, because the home screen user interface is the current target state at lift-off.
In some embodiments, if the device determines that the input does not meet the "substantially resize/translate to go home" criteria (100x3), the device then determines (100x4) whether the input appears to be a "side swipe for next/previous application" gesture (e.g., a multi-contact swipe to the right or to the left with sufficient horizontal velocity, that is, moving horizontally or substantially horizontally (e.g., more horizontal than vertical) and not indicating a return from on top of the next/previous application), indicating that the user's intent (as determined by the device) is to navigate to a previously displayed application user interface (e.g., a different application in a stack of applications). In some embodiments, the device first determines (1014) whether the x-axis velocity of the input meets a first x-axis velocity threshold in the horizontal direction (e.g., when traveling to the left, the velocity threshold defined by the left boundary of the range of velocity threshold 910 together with slope thresholds 904 and 912, delimiting the union of sectors III and VI in Figure 9A; or, when traveling to the right, the velocity threshold defined by the right boundary of the range of velocity threshold 910 together with slope thresholds 906 and 914, delimiting the union of sectors IV and VII in Figure 9A). In some embodiments, the device determines whether the x component of the velocity of the application view (e.g., rather than of the contacts themselves, although its movement is caused by the x translational component of the movement of the contacts) meets the x velocity threshold in the horizontal direction.
In some embodiments, if the contacts/application view meet this criterion, the device determines whether the projected magnitude of the first metric of the input/application view, which corresponds to the user interface displayed when the input was first detected, is close to (1018) the original magnitude of the first metric of the input/application view (e.g., the y position and/or size of the application view immediately after the device activated the user-interface selection process (e.g., when the transitional navigation user interface was first displayed)), or whether the magnitude of the first metric is below (1020) a first threshold (e.g., requiring at least a threshold amount of pinching and/or upward movement of the contacts, corresponding to the probability that the input is not unintentional). If the input does not meet either of these criteria, the device assigns (1022) the application-switcher user interface as the current target state.
In some embodiments, if the input meets either the projected size/position criterion (1018) or the y-axis position criterion (1020), the device determines (1021) whether the input/application view is, after a threshold amount of movement, traveling in a direction opposite to its previous travel. If the input/application view does not meet this criterion, the device assigns (1024) the next/previous application user interface as the current target state. For example, in Figure 5C11, contacts 4510, 4514, 4518, and 4522 travel to the right (e.g., application view 4526 moves to the right) and have not previously traveled to the left, so the device assigns the previous application user interface (e.g., corresponding to representation 4528) as the current target state. In some embodiments, whether the next application or the previous application is selected as the current target state depends on the direction of movement of the input (e.g., the direction of the change in position of the input, or the direction of the velocity of the input); that direction of movement of the input/application view determines which next/previous application user interface is set as the current target state. In some embodiments, if the direction of the change in position is the determinative characteristic of the input/application view, the direction of the change in position of the input/application view is used to determine whether the next application or the previous application is selected as the current target state. In some embodiments, if the direction of the velocity is the determinative characteristic of the input/application view, the direction of the velocity of the input/application view is used to determine whether the next application or the previous application is selected as the current target state. For example, if movement of the input/application view to the left results in selecting a next/previous application as the current target state, the previous application is selected as the current target state, and if movement of the input/application view to the right results in selecting a next/previous application as the current target state, the next application (or the current application, if there is no next application) is selected as the current target state, or vice versa.
In some embodiments, if the input/application view is, after a threshold amount of movement, traveling in a direction opposite to its previous travel (e.g., meeting criterion (1021)), the device assigns (1030) the current application user interface as the current target state. This assignment avoids unintended navigation, for example when a user starts a swipe gesture to peek at a previously displayed application user interface without actually intending to navigate to it, and then reverses the direction of the input to return to the "current application." Without this rule, assignment logic 100x1 would assign the next application user interface (e.g., the application to the right of the "current" application) rather than the current application.
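A small sketch of the reversal rule, under assumed names and an assumed travel threshold:

```swift
enum SideSwipeTarget { case previousApp, nextApp, currentApp }

func sideSwipeTarget(totalTravelX: Double,              // signed lateral travel since the gesture began
                     latestDeltaX: Double,              // signed lateral movement over the latest samples
                     reversalTravelThreshold: Double = 40.0) -> SideSwipeTarget {
    // A reversal only matters once a threshold amount of movement has occurred.
    if abs(totalTravelX) >= reversalTravelThreshold {
        let reversed = (totalTravelX > 0 && latestDeltaX < 0) ||
                       (totalTravelX < 0 && latestDeltaX > 0)
        if reversed { return .currentApp }              // user backed out of the peek
    }
    // Rightward travel maps to the previously displayed application in the Figure 5C11 example.
    return totalTravelX > 0 ? .previousApp : .nextApp
}
```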
After assigning the application-switcher user interface (1022), the next/previous application user interface (1024), or the current application user interface (1030) as the current target state, in some embodiments, the device then checks the exceptions (e.g., via decisions 100x9, 100x10, and 100x11, described in more detail below) and, in some circumstances, reassigns the current target state. The device then determines (1036) whether lift-off has been detected and, if lift-off is detected, navigates to (e.g., displays) (1038) the currently assigned target-state user interface. For example, assuming that the velocity of contacts 4510, 4514, 4518, and 4522 and/or of application view 4526 in Figure 5C11 is sufficiently fast toward the right, and that the y position and size of application view 4526 are close enough to the original y position and size of the application view, e.g., meeting the "side swipe for next/previous application" criteria (100x4), the device assigns the previously displayed email user interface, corresponding to application view 4528 in Figure 5C11, as the current target state, so that upon lift-off in Figure 5C12, the device navigates to (e.g., displays) the email user interface, because it is the current target state at lift-off.
In some embodiments, if the device determines that the input does not meet the "side swipe for next/previous application" criteria (100x4), the device then determines (100x5) whether the input appears to be a "bottom-edge swipe for next/previous application" gesture (e.g., an input that travels to the left or to the right along the bottom edge of the display), indicating that the user's intent (as determined by the device) is to navigate to a previously displayed application user interface. In some embodiments, the device determines (1016) whether the magnitude of the second metric of the input/application view (e.g., the current x-axis position or the predicted x-axis position of the contacts/application view) meets a second x-axis position threshold (e.g., second x-axis position threshold 920 depicted in Figure 9B) in the right or left direction while the first metric has a minimal magnitude (e.g., minimal y-axis translation and shrinking of the application view, below minimum y-axis translation threshold 922 depicted in Figure 9B). If the properties of the input/application view meet this criterion, the device assigns (1024) the next/previous application user interface as the current target state.
In some embodiments, the device then checks the exceptions (e.g., via decisions 100x9, 100x10, and 100x11, described in more detail below) and, in some circumstances, reassigns the current target state. The device then determines (1036) whether lift-off has been detected and, if lift-off is detected, navigates to (e.g., displays) (1038) the next/previous user interface, provided the current target state has not been reassigned in accordance with an exception. For example, assuming that contacts 4510, 4514, 4518, and 4522 and/or application view 4526 in Figure 5C11 move far enough to the right (e.g., beyond x-axis position threshold 920-b depicted in Figure 9B) while staying close enough to the bottom edge of the display (e.g., below minimum y-axis translation threshold 922 depicted in Figure 9B), e.g., meeting the "bottom-edge swipe for next/previous application" criteria (100x5), the device assigns the previously displayed email user interface, corresponding to application view 4528 in Figure 5C11, as the current target state, so that upon lift-off in Figure 5C12, the device navigates to (e.g., displays) the email user interface, because the email user interface is the current target state at lift-off.
In some embodiments, if the device determines that the input does not meet the "bottom-edge swipe for next/previous application" criteria (100x5), the device then determines (100x6) whether the input appears to be a "pause for application switcher" gesture (e.g., a pause or near-pause in the velocity of the input/application view), indicating that the user's intent (as determined by the device) is to navigate to the application-switcher user interface. The device determines (1026) whether the x velocity of the contacts/application view and the third metric (e.g., the rate-of-change metric, which accounts for the rate of y translation and the rate of resizing of the application view) have minimal magnitudes (Vx) and (Vy,r) (e.g., the contacts/application view have a velocity corresponding to a point near the origin, within sector V delimited by dynamic resize/translation velocity threshold 910 of the velocity threshold scheme depicted in Figure 9A). If the properties of the contacts/application view meet this criterion, the device assigns (1022) the application-switcher user interface as the current target state.
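For illustration, the pause test reduces to two small velocity bounds (the names and numeric values are assumptions):

```swift
struct PauseThresholds {
    var maxHorizontalSpeed = 50.0      // Vx bound (illustrative)
    var maxTranslateResizeRate = 50.0  // Vy,r bound, i.e. the "third metric" (illustrative)
}

func isPausedForAppSwitcher(horizontalSpeed: Double,
                            translateResizeRate: Double,
                            thresholds: PauseThresholds = PauseThresholds()) -> Bool {
    // Both components must stay near the origin of the Figure 9A velocity scheme (sector V).
    return abs(horizontalSpeed) < thresholds.maxHorizontalSpeed
        && abs(translateResizeRate) < thresholds.maxTranslateResizeRate
}
```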
In some embodiments, the device then checks the exceptions (e.g., via decisions 100x9, 100x10, and 100x11, described in more detail below) and, in some circumstances, reassigns the current target state. The device then determines (1036) whether lift-off has been detected and, if lift-off is detected, navigates to (e.g., displays) (1038) the application-switcher user interface, provided the current target state has not been reassigned in accordance with an exception. For example, assuming that the x-axis velocity and the third metric (e.g., including the rate of resizing) of application view 4526 in Figure 5C28 are minimal (e.g., near the origin of the velocity threshold scheme depicted in Figure 9A), e.g., meeting the "pause for application switcher" criteria (100x6), the device assigns the application-switcher user interface as the current target state, so that upon lift-off in Figure 5C29, the device navigates to (e.g., displays) the application-switcher user interface, because it is the current target state at lift-off.
In some embodiments, if the device determines that the input does not meet the "pause for application switcher" criteria (100x6), the device then determines (100x7) whether the input appears to be a "resize/translate to cancel" gesture (e.g., the input/application view moves back toward the bottom of the screen with a sufficiently large y-axis velocity in a sufficiently vertical direction, and/or the input/application view expands (e.g., via a spread movement) back toward the original size of the input/application view when the user-interface selection process was invoked), indicating that the user's intent (as determined by the device) is to navigate back to the current application user interface (e.g., the user interface displayed when the input was first detected). In some embodiments, the device determines (1028) whether the velocity of the input is in a substantially downward direction (e.g., within slope thresholds 912 and 914 in Figure 9A, which distinguish sector VIII, where the velocity is more vertical, from sectors VI and VII, where the velocity of the contacts is closer to horizontal). This set of criteria requires that the velocity fall within sector VIII of the velocity threshold scheme depicted in Figure 9A, which in turn requires that a minimum y-axis velocity threshold, equal to the value of the bottom boundary of the range of velocity threshold 910 in Figure 9A (e.g., separating sector V from sector VIII), be met. However, because the device has already determined that the velocity of the contacts does not fall within sector V (e.g., the input is not a "pause for application switcher" (100x6) gesture), the device does not need to check the minimum y-axis velocity at this step. In some embodiments, where the "swipe down to cancel" decision 100x7 is made without including the "pause for application switcher" decision 100x6, or before the "pause for application switcher" decision 100x6, the decision determines whether the y-axis velocity of the contacts meets a minimum y-axis velocity threshold, such as the lower boundary of the range of velocity threshold 910 depicted in Figure 9A. If the properties of the contacts meet these criteria, the device assigns (1030) the current application user interface as the current target state.
In some embodiments, the device then checks the exceptions (e.g., via decisions 100x9, 100x10, and 100x11, described in more detail below) and, in some circumstances, reassigns the current target state. The device then determines (1036) whether lift-off has been detected and, if lift-off is detected, navigates to (e.g., displays) (1038) the current application user interface, provided the current target state has not been reassigned in accordance with an exception. For example, assuming that the velocity of contact 5070 in Figure 5A55 is substantially downward (e.g., falling within sector VIII shown in Figure 9A), e.g., meeting the "swipe down to cancel" criteria (1028), the device assigns the instant messaging user interface corresponding to representation 5014 (e.g., the user interface displayed when the device first detected contact 5070 in Figure 5A52) as the current target state, so that upon lift-off in Figure 5A56, the device navigates to (e.g., displays) the instant messaging application user interface, because it is the current target state at lift-off. In some embodiments, in addition to returning to the current application user interface, the device also removes the application taskbar that was displayed in response to the initial portion of the input. In some embodiments, the device does not remove the application taskbar displayed in response to the initial portion of the input, and the taskbar remains displayed over the current application user interface after the device exits the transitional user interface.
In some embodiments, if the device determines that the input does not meet the "resize/translate to cancel" criteria (100x7), the device then determines (100x8) whether the input appears to be a "short, slow movement to application switcher" gesture (e.g., an input that causes the application view to have a magnitude of the third metric (e.g., the rate-of-change metric, which accounts for the y translational component of the translation of the application view and the rate of resizing of the application view) corresponding, for example, to a swipe with a slow upward y velocity and/or a slow inward pinch contraction, without significant movement to the right or to the left), indicating that the user's intent (as determined by the device) is to navigate to the application-switcher user interface. In some embodiments, the device determines whether the magnitude of the third metric of the input/application view is negative (1032) (e.g., below the x-axis of the velocity threshold scheme depicted in Figure 9A), or whether the magnitude of the second metric of the input/application view (e.g., the current x-axis position of the contacts/application view or the predicted x-axis position of the application view) meets (1034) a third x-axis position threshold in the right or left direction (e.g., third x-axis position threshold 924 in Figure 9B). If the properties of the input/application view do not meet either of these criteria, the device assigns (1022) the application-switcher user interface as the current target state. For example, assuming that the velocity of the contraction movement of contacts 4670, 4674, 4678, 4682, and 4686 in Figure 5C38 and the rate at which application view 4526 shrinks are slow enough, and that application view 4526 has not translated sufficiently in the x direction, the device assigns the application-switcher user interface as the current target state, displaying previously displayed application view 4528 concurrently with application view 4526 and displaying the taskbar in the background.
In some embodiments, if the magnitude of the third metric is negative (1032), or the magnitude of the second metric (e.g., the current x-axis position of the contacts/application view or the predicted x-axis position of the application view) meets (1034) the third x-axis position threshold, the device then determines whether the input is a first swipe gesture (e.g., as opposed to a second swipe gesture in a series of swipe gestures navigating between application user interfaces, where the card stack has not yet been reorganized). For example, the swipe gesture in Figures 5C10-5C11 is a first swipe gesture, because there was no previous swipe gesture to the right or to the left in the series. In some embodiments, if the input is not a first swipe gesture, the device assigns (1024) the next/previous application user interface as the current target state, because there is an increased probability that the user intends to continue navigating between previously displayed user interfaces, given that the user has just performed such a swipe gesture.
In some embodiments, if the input is a first swipe gesture (1033), the device then determines (1035) whether an x-axis position threshold (e.g., corresponding to the magnitude of the second metric) is met (e.g., to distinguish purposeful navigation to a previously displayed application user interface from an accidental contact). If the x-axis position threshold is met, the device assigns (1024) the next/previous application user interface as the current target state. If the x-axis position threshold is not met, the device assigns the current application user interface as the target state, having found no substantial similarity between the contact and a deliberate navigation gesture.
In some embodiments, after assigning the application-switcher user interface (1022), the next/previous application user interface (1024), or the current application user interface (1030) as the current target state, the device then checks the exceptions (e.g., via decisions 100x9, 100x10, and 100x11, described in more detail below) and, in some circumstances, reassigns the current target state. The device then determines (1036) whether lift-off has been detected and, if lift-off is detected, navigates to (e.g., displays) (1038) the currently assigned target-state user interface.
In some embodiments, after each assignment of the current target state, the device checks whether the properties of the contacts meet any of the exceptions, each of which is designed to avoid a different kind of unintended navigation, as illustrated in Figure 10C. In some embodiments, the order and identity of the exceptions may vary (e.g., the order in which the exceptions are evaluated is changed, an exception is deleted or modified, or additional exceptions are added). First, if the device determines that the input is accidental (e.g., the distance traveled by the input from its initial position on the display is not far enough (1060), and the target state has been assigned as the home screen or the application switcher (1066)), the device replaces (100x9) the currently assigned target state with the current application.
In some embodiments, after the one or more determinations described above, if the previous target state was the application switcher (1061), the device replaces (100x10) an assignment of the next or previous application user interface with an assignment of the application switcher as the target state. For example, once the input has caused the device to display the application-switcher user interface, movement to the left or right is interpreted as swiping across the stack of cards, rather than as moving to the next or previous application user interface.
In some embodiments, if one or more of the contacts has entered a right-edge or left-edge region of the display, and the application-switcher user interface was the target state assigned before the contact entered the edge region, the device replaces (100x11) any assignment other than the next or previous application user interface with an assignment of the application-switcher user interface. This compensates for the reduced number of contact sensors at the edge regions. For example, when a contact moves off the side of the display, there are no sensors to detect continued lateral movement; however, as long as a portion of the contact remains on the display, the device still registers vertical movement. The device therefore optionally interprets a diagonal motion as a purely vertical motion.
In some embodiments, the device checks whether "ignore accidental input" criteria (100x9) are met (e.g., covering cases where the user touches the device without intending to navigate to a different user interface). The device determines (1060) whether the y-axis position of the input (e.g., the current y-axis position of the contact/user-interface representation, or the predicted y-axis position of the user-interface representation) meets a second y-axis position threshold (e.g., second y-axis position threshold 926 in Figure 9B, near the bottom edge of the display). If the input meets the second y-axis position threshold (e.g., the contact has moved far enough from its initial position on the display to rule out an accidental navigation touch), the device moves on to the next exception without updating the current target state (e.g., having determined that the input is not an accidental navigation touch).
If the input does not meet the second y-axis position threshold, the device determines (1066) whether the current target state is the home screen user interface or the application-switcher user interface. If so, the device assigns (1068) the current application user interface as the current target state (e.g., updating the current target state to ignore what may be an unintentional edge touch), and proceeds to the next exception. If the current target state is not the home screen user interface or the application-switcher user interface, the device moves on to the next exception without updating the current target state (e.g., having determined that the input is not an accidental edge touch). For example, a contact that moves significantly to the right or to the left without traveling away from the bottom edge of the display indicates a clear intent to navigate to a previously displayed application user interface (e.g., it meets the "side swipe for next/previous application" criteria (100x4)) and therefore should not be determined to be an accidental input.
In some embodiments, after determining whether to "ignore accidental input" (100x9) (e.g., by updating the current target state to the current application user interface), the device checks whether an "application switcher preference" criterion (100x10) has been met (e.g., where the target state was changed from the application switcher user interface to the next/previous application user interface). The device determines (1061) whether the current target state is the next/previous application and whether the immediately preceding target state was the application switcher (e.g., whether the device changed an assignment of the application switcher as the current target state into an assignment of the next/previous application as the current target state). If so, the device assigns (1072) the application switcher user interface as the current target state and proceeds to the next exception. If not, the device proceeds to the next exception without updating the current target state.
In some embodiments, after determining whether to apply the "application switcher preference" (100x10) (e.g., by updating the current target state from the next/previous application user interface to the application switcher user interface), the device checks whether an "edge error correction" criterion (100x11) has been met (e.g., where the contact is close enough to the right or left edge of the display, the most recent target state was the application switcher, and the current target state is not the next/previous application). The device determines (1062) whether the contact is within an x-axis edge region of the display (e.g., meets x-axis edge position threshold 928 in Figure 9B, within about 1 mm, 2 mm, 3 mm, 4 mm, or 5 mm of the right or left edge of the display), and if it is not within the x-axis edge region of the display, the device continues to determine (1036) whether lift-off has been detected (or proceeds to additional or reordered exceptions) without updating the current target state.
In some embodiments, if the contact is within the x-axis edge region of the display, the device determines (1070) whether a previous target state (e.g., a target state assigned within a time threshold of entering the x-axis edge region, for example within the previous 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, or 20 frame refreshes or target-state determinations) was the application switcher user interface, and whether the current target state is not the next/previous application user interface. If these criteria are met, the device replaces (1072) the current target state with the previous target state (e.g., the application switcher), and then continues to determine (1036) whether lift-off has been detected (or proceeds to additional or reordered exceptions). If these criteria are not met, the device continues to determine (1036) whether lift-off has been detected (or proceeds to additional or reordered exceptions) without updating the current target state.
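The "application switcher preference" and "edge error correction" exceptions can be sketched in the same illustrative style. The Swift sketch below is a simplified reading of determinations 1061, 1062, and 1070-1072; the ExceptionContext type, the edge-region width, and the single previousTarget field (standing in for the frame-refresh history window) are assumptions made for illustration.

```swift
// Sketch of the "application switcher preference" (100x10) and "edge error
// correction" (100x11) exceptions; names and thresholds are illustrative.
enum Target { case currentApp, nextApp, previousApp, appSwitcher, homeScreen }

struct ExceptionContext {
    var currentTarget: Target
    var previousTarget: Target?        // target assigned on a recent earlier evaluation
    var contactX: Double               // x position of the contact on the display
    var displayWidth: Double
    var edgeRegionWidth: Double = 3    // roughly 1 to 5 mm expressed in points (assumed)
}

func applySwitcherAndEdgeExceptions(_ ctx: ExceptionContext) -> Target {
    var target = ctx.currentTarget

    // 100x10: once the app switcher has been shown, sideways motion should page
    // through the card stack instead of jumping to the next/previous application.
    if (target == .nextApp || target == .previousApp),
       ctx.previousTarget == .appSwitcher {
        target = .appSwitcher
    }

    // 100x11: near the left/right edge there are fewer touch sensors, so a diagonal
    // motion may register as vertical only; restore the app switcher target unless
    // the user clearly asked for the next/previous application.
    let inEdgeRegion = ctx.contactX < ctx.edgeRegionWidth
        || ctx.contactX > ctx.displayWidth - ctx.edgeRegionWidth
    if inEdgeRegion,
       ctx.previousTarget == .appSwitcher,
       target != .nextApp, target != .previousApp {
        target = .appSwitcher
    }
    return target
}
```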
In some embodiments, after determining (1036) that lift-off of the contact has not been detected, the device determines (1040) whether a dynamic velocity threshold should be adjusted (e.g., dynamic size/translation velocity threshold 910, as shown in Figures 9A and 9C) (e.g., where the current target state is the application switcher user interface and the contact is nearly paused on the screen, the device increases the dynamic velocity threshold required to move from sector V to sector II in Figure 9A, which is associated with assignment of the home screen user interface, so that an unintentional increase in contact velocity as the user lifts the contact off the screen is not interpreted as an intention to change from navigating to the application switcher user interface to navigating to the home screen). This dynamic adjustment improves the prediction and accuracy of navigating to a particular target-state user interface (e.g., the application switcher user interface).
In some embodiments, the device determines (1042) whether the current target state is the application switcher user interface and whether the magnitude of the third metric of the contact/application view (VY,r) (e.g., a rate-of-change metric that takes into account the y-velocity of the application view and the resize speed of the application view) and the x-velocity of the contact/application view fail to meet a minimum velocity threshold (e.g., the velocity threshold of the smaller region within the range of velocity threshold 910 in Figure 9A, or the range of the velocity threshold that defines sector V of Figure 9A (e.g., the smaller region around the origin of the velocity-threshold scheme depicted in Figure 9A)).
In some embodiments, if these criteria are met (e.g., the contact paused or nearly paused while the current target state was the application switcher user interface), the device determines (1046) whether the dynamic velocity threshold is at its maximum range (e.g., whether dynamic size/translation velocity threshold range 910 is at maximum range 910-b, as shown in Figures 9A and 9B), and if it is at the maximum range, continues to monitor (1004) the position and velocity of the input/application view and to provide visual feedback without updating the dynamic threshold. If the dynamic threshold is not at the maximum range (e.g., dynamic size/translation velocity threshold range 910 is smaller than maximum range 910-b), the device increases (1048) the range of the dynamic velocity threshold (e.g., expands the "box" of threshold 910 toward maximum threshold range 910-b), and then continues to monitor (1004) the position and velocity of the input/application view and provide visual feedback.
In some embodiments, if these criteria are not met (e.g., the contact paused or nearly paused while the current target state was the application user interface rather than the application switcher), the device determines (1042) whether the dynamic velocity threshold is at its minimum range (e.g., whether dynamic size/translation velocity threshold range 910 is at minimum range 910-a), and if it is at the minimum range, continues to monitor (1004) the position and velocity of the input/application view and to provide visual feedback without updating the dynamic threshold. If the dynamic threshold is not at the minimum range (e.g., dynamic size/translation velocity threshold range 910 is larger than minimum range 910-a), the device decreases (1044) the range of the dynamic velocity threshold (e.g., collapses the "box" of threshold 910 toward minimum threshold range 910-a), and then continues to monitor (1004) the position and velocity of the input/application view and provide visual feedback. It should be appreciated that the process described in the flow chart is optionally applied to any of the methods described herein for determining whether to navigate to the application switcher user interface, the home screen, and/or the previous/next application, which methods are used for navigating between the user interfaces described with respect to the user interfaces shown in Figures 5C1 to 5C59.
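By way of illustration only, a minimal Swift sketch of the dynamic velocity threshold adjustment (1040 to 1048) follows, assuming a simple per-frame step. The struct and field names, the pause-speed test, and the step size are placeholders and do not correspond to specific values in Figures 9A and 9C.

```swift
// Sketch of the dynamic velocity threshold adjustment: while the app switcher is the
// current target and the contact is nearly paused, the sector-V velocity "box" grows
// toward its maximum so that a last-moment flick on lift-off is less likely to flip
// the target to the home screen. All names and numeric ranges are illustrative.
struct DynamicVelocityThreshold {
    var range: Double                 // current half-width of the sector-V box
    let minRange: Double              // analogue of range 910-a
    let maxRange: Double              // analogue of range 910-b
    let step: Double                  // per-frame growth/shrink increment (assumed)

    mutating func update(targetIsAppSwitcher: Bool,
                         speedMagnitude: Double,
                         pauseSpeed: Double) {
        let nearlyPaused = speedMagnitude < pauseSpeed
        if targetIsAppSwitcher && nearlyPaused {
            // Expand toward the maximum range (1048), but never past it (1046).
            range = min(range + step, maxRange)
        } else {
            // Shrink back toward the minimum range (1044), but never below it.
            range = max(range - step, minRange)
        }
    }
}
```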
Figures 9A to 9C illustrate example thresholds for navigating between different user interfaces, such as an application user interface, a previous application user interface, the home screen user interface, and the application switcher user interface. The thresholds shown in Figures 9A to 9C are examples of thresholds used together with methods 600, 700, 1000, and 1100 for navigating between user interfaces.
Figure 9A illustrates a series of example velocity thresholds for a metric of the input/application view that takes into account the translation rate of the input/application view and the rate at which it is resized or contracted (e.g., a rate-of-change metric), used with the navigation criteria described, for example, with respect to Figures 10A to 10D. The example velocity thresholds shown in Figure 9A include a horizontal translation velocity component on the display (Vx; e.g., the velocity component corresponding to the abscissa of the Cartesian coordinate system shown in Figure 9A, which accounts for the rate of horizontal translation of the input/application view) and a vertical translation/resize velocity component (VY,r; e.g., the velocity component corresponding to the ordinate of the Cartesian coordinate system shown in Figure 9A, which accounts for the rate of vertical translation and resizing of the input/application view, e.g., the third metric described above with respect to Figures 10A to 10D). The intersections of the boundaries define eight sectors (e.g., sectors I-VIII), each associated with a particular user-interface target state. That is, in the transitional user interface, when the user can navigate to any of multiple user interfaces (e.g., the application user interface, the next/previous application user interface, the home screen user interface, or the application switcher user interface), the device assigns a target-state user interface based at least on the velocity of the input/application view (e.g., Vx and VY,r). When the velocity of the input/application view falls within a particular sector defined in Figure 9A, the device assigns the user interface associated with that sector as the target state, as long as the input meets all other criteria required for selection of that target state (e.g., positional criteria). In some embodiments, these thresholds are used together with methods 600, 700, 1000, and 1100 for navigating between user interfaces.
In some embodiments, when the magnitude of the third metric of the input and/or application view is greater than threshold 902, the input falls within sector I, which is associated with selection of the home screen user interface as the target state. Similarly, an input with a velocity within sector II is associated with selection of the home screen user interface target state. Inputs with velocities within sectors III, IV, and V are associated with selection of the application switcher user interface target state. Inputs with velocities within sectors VI and VII are associated with selection of the next or previous application user interface target state. Finally, an input with a velocity within sector VIII is associated with selection of the current application user interface (e.g., the application user interface displayed before the device entered the transitional user interface) as the target state.
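The sector-to-target-state mapping of Figure 9A can be approximated with a simple classification of the velocity pair (Vx, VY,r), as in the Swift sketch below. The boundary geometry here is a deliberate simplification (the actual sectors are bounded by thresholds 902, 904, 906, and 910, some of which are dynamic), the sign convention for downward movement is an assumption, and all parameter names are placeholders.

```swift
// Illustrative mapping of a velocity pair to a target state.
enum SectorTarget { case homeScreen, appSwitcher, nextOrPreviousApp, currentApp }

func targetState(vx: Double, vyr: Double,
                 upFastThreshold: Double,   // stands in for threshold 902 (assumed)
                 switcherBox: Double,       // stands in for threshold range 910 (assumed)
                 lateralThreshold: Double) -> SectorTarget {
    if vyr > upFastThreshold {
        return .homeScreen                  // sectors I/II: fast upward movement
    }
    if abs(vx) <= switcherBox && abs(vyr) <= switcherBox {
        return .appSwitcher                 // sector V: slow or hovering input
    }
    if abs(vx) > lateralThreshold && abs(vx) > abs(vyr) {
        return .nextOrPreviousApp           // sectors VI/VII: predominantly lateral
    }
    if vyr < 0 {
        return .currentApp                  // sector VIII: downward movement (assumed sign)
    }
    return .appSwitcher                     // sectors III/IV: moderate upward movement
}
```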
Figure 9A also illustrates that the velocity thresholds are optionally dynamic. For example, the range of velocity threshold 910 (which bounds sector V, associated with the application switcher user interface target state) expands from minimum threshold range 910-a to maximum threshold range 910-b when the contact hovers within sector V at a minimal speed. Similarly, velocity thresholds 904 and 906, which provide the boundaries between selection of the next/previous application user interface and selection of the home-state user interface, optionally change dynamically, e.g., from boundary 904-c to 904-b, to allow an input with less vertical movement to be associated with selection of the home screen user interface as the target state, or to allow an input with more vertical movement to be associated with selection of the next/previous application user interface as the target state. Depending on the design of the particular system, any of the thresholds is optionally dynamic, e.g., by applying a method for dynamically adjusting the threshold (e.g., a method similar to method 1040).
Figure 9B illustrates a series of example position thresholds related to, e.g., a first metric on the corresponding display of the device (e.g., a y-magnitude metric that takes into account the y-translation component of the translation of the input/application view and the resize component of the input/application view) and a second metric (e.g., an x-magnitude metric that takes into account the x-translation component of the translation of the input/application view) (e.g., in some embodiments, the device determines a simulated y-translation of the input/application view based on the magnitude of the value of the first metric, determines a simulated x-translation of the input/application view based on the magnitude of the value of the second metric, and maps the simulated (x, y) translation to a corresponding position on the display of the device). In some embodiments, these thresholds are used together with methods 600, 700, 1000, and 1100 for navigating between user interfaces. In some embodiments, the position thresholds shown in Figure 9B operate together with the velocity thresholds shown in Figure 9A. In some embodiments, satisfaction of a particular position threshold optionally overrides satisfaction of a corresponding velocity threshold. For example, satisfaction of first y-position threshold 98 in Figure 9B overrides the corresponding velocity thresholds in Figure 9A and associates the input with selection of the home screen user interface target state.
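The override of a velocity-based selection by a position threshold can be sketched as follows; the names and the single home-screen override are illustrative assumptions based only on the example given above.

```swift
// Sketch of a position threshold overriding the velocity-based choice (Figure 9B):
// once the simulated y-translation passes the first y-position threshold, the home
// screen is selected regardless of the current velocity. Names are placeholders.
enum NavTarget { case homeScreen, appSwitcher, nextOrPreviousApp, currentApp }

func applyPositionOverride(velocityBasedTarget: NavTarget,
                           simulatedY: Double,
                           firstYPositionThreshold: Double) -> NavTarget {
    // Past the first y-position threshold, the home screen wins outright.
    simulatedY > firstYPositionThreshold ? .homeScreen : velocityBasedTarget
}
```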
Figure 9C illustrates an example implementation of a dynamic size/translation velocity threshold (e.g., velocity threshold 910, also shown in Figure 9A), which according to some embodiments corresponds to the magnitude of the third metric (e.g., the rate-of-change metric) of the input/application view. At time T-3, the magnitude 930 of the third metric of the contact/application view (e.g., which takes into account the combination of the translation velocity of the input/application view and the resize velocity of the input/application view) is greater than dynamic velocity threshold 910-D (which, in Figure 9A, divides selection of the home screen user interface from selection of the application switcher user interface), and the input is therefore associated with selection of the home screen (HS) user interface target state. When the magnitude 930 of the third metric decreases near time T, it drops below dynamic velocity threshold 910-D, thereby meeting the criteria for selecting the application switcher (AS) user interface target state. To favor selection of the application switcher user interface as the final user interface, dynamic velocity threshold 910-D increases over time as long as the magnitude 930 of the third metric remains below the threshold. Thus, for example, even though the magnitude of the third metric of the input/application view 930 at time T+5 is greater than the magnitude of the third metric of the input/application view at time T-3, the input still meets the selection criteria for the application switcher because dynamic velocity threshold 910-D has increased. However, when dynamic velocity threshold 910-D reaches threshold maximum 910-b, the device stops increasing the threshold even though the magnitude 930 of the third metric of the input/application view is still below the threshold. Once the magnitude 930 of the third metric of the input/application view exceeds dynamic velocity threshold 910-D at time T+6, the device begins to decrease dynamic velocity threshold 910-D, which no longer favors selection of the application switcher user interface as the final target state. Although the variable threshold discussed above is a velocity threshold, similar principles are optionally applied to other kinds of thresholds, such as position thresholds, pressure thresholds, and distance thresholds. Similarly, although the variable threshold is discussed above with reference to determining whether to select the home screen or the application switcher user interface, a variable threshold operated in the manner described above can be applied to a variety of user-interface interactions (e.g., determining whether to navigate back to a previous user interface or to remain on the current user interface in response to an edge swipe gesture, determining whether to delete an item in response to a swipe gesture, determining whether to display an extended preview of a content item based on whether an input has an intensity above a predetermined intensity threshold, determining whether to display a control panel user interface in response to an edge swipe gesture, etc.).
Figures 11A to 11F are flow charts illustrating a method 1100 of navigating between user interfaces or performing an operation within an application based on a multi-contact gesture, in accordance with some embodiments. Method 1100 is performed at an electronic device (e.g., device 300 in Figure 3, or portable multifunction device 100 in Figure 1A) with a display, a touch-sensitive surface, and one or more sensors for detecting intensity of contacts with the touch-sensitive surface. In some embodiments, the display is a touch-screen display, and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 1100 are optionally combined, and/or the order of some operations is optionally changed.
Method 1100 facilitates navigating from an application user interface to another user interface outside the application, such as to a different application or to a system user interface (e.g., the home screen), or performing an operation within the application, based on a gesture initiated from the application user interface (e.g., a gesture performed with multiple concurrently detected contacts). The result of the gesture is based on which of multiple different sets of criteria the gesture meets at its termination (e.g., criteria based on the type of gesture performed by the contacts, the total number of concurrently detected contacts, the positions, timing, and/or movement parameters of the contacts, and/or the displayed user-interface objects). The input gesture is continuously evaluated against the different sets of criteria when determining the destination state of the device (e.g., which operation to perform and/or which user interface to display). Dynamic visual feedback is continuously displayed to indicate the likely destination state of the device based on the input detected so far, giving the user an opportunity to adjust his/her input to modify the actual destination state reached by the device after the input terminates. Using different sets of criteria to determine the final destination state of the device (e.g., the operation performed and/or the user interface finally displayed) allows the user to achieve an intended result with a fluid gesture that can be changed mid-stream (e.g., because the user decides to change the result they want to achieve, or because the user realizes, based on the device's feedback, that he/she has been providing an incorrect input for the intended result). This helps avoid the need for the user to undo the effect of an unintended gesture (e.g., via another set of inputs) and then start the gesture again, making the user-device interface more efficient (e.g., by helping the user provide the inputs needed to achieve an intended result, and by reducing user mistakes when operating/interacting with the device); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device. In this method, the heuristic for determining whether to navigate outside the application user interface or to perform an operation within the application is based on the number of contacts included in the gesture (e.g., a two-finger gesture is used for operations within the application, while a four- or five-finger gesture is used to activate system-level operations outside the application, such as navigating to a different application or to the home screen). After determining that the gesture includes more than a threshold number of contacts, different criteria are used in an auxiliary heuristic to determine whether to navigate to a different application or to a system-level user interface (e.g., the home screen). Using the number of contacts to distinguish application-level inputs from system-level inputs enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide the inputs needed to achieve an intended result, and by reducing user mistakes when operating/interacting with the device); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and
extends the battery life of the device. Furthermore, allowing the user to choose, based on different criteria, between navigating to another application and navigating to a system user interface, in addition to choosing to perform an operation within the application, also enhances the operability of the device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to perform an operation); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
In method 1100, the device displays (1102) on the display a user interface of a first application of multiple applications installed on the device (e.g., the user interface of the map application in Figures 5C1, 5C4, 5C7, 5C10, 5C20, 5C23, 5C27, 5C30, 5C33, 5C37, 5C43, 5C48, and 5C55, or the user interface of the email application in Figures 5C13 and 5C17). The device detects (1104) a gesture on the touch-sensitive surface, where detecting the gesture includes detecting an initial portion of the gesture while the user interface of the first application is displayed on the display, and detecting the gesture includes detecting multiple concurrent contacts on the touch-sensitive surface (e.g., as illustrated by detection of the contacts shown in Figures 5C1, 5C4, 5C7, 5C10, 5C13, 5C17, 5C20, 5C23, 5C27, 5C30, 5C33, 5C37, 5C43, 5C48, and 5C55) and detecting movement of the multiple contacts (e.g., including movement of at least one of the multiple contacts on the touch-sensitive surface toward (or away from) at least one other of the multiple contacts that remains substantially stationary on the touch-sensitive surface (e.g., as in a pinch or de-pinch gesture), simultaneous and synchronized movement of all of the multiple contacts in substantially the same direction (e.g., as in a multi-finger swipe gesture), simultaneous movement of several of the multiple contacts on the touch-sensitive surface toward (or away from) substantially the same location (e.g., as in a pinch or de-pinch gesture), and/or a combination of a swipe performed by the multiple contacts with pinch/de-pinch movement). In some embodiments, detecting the gesture includes detecting lift-off of the multiple contacts after detecting the movement of the contacts.
In response to detecting the gesture on the touch-sensitive surface (1106), in accordance with a determination that the gesture includes (e.g., includes exactly) two concurrently detected contacts (e.g., as in a two-finger gesture), the device performs (1108) an operation within the first application based on the movement of the two concurrently detected contacts during the gesture (e.g., simultaneous movement of the contacts on the touch-sensitive surface and/or movement of one contact relative to the other) (e.g., the gesture input is passed to the first application, and the first application determines, in accordance with the gesture input, which application-specific operation to perform). This is illustrated in Figures 5C1 to 5C9, where the device rescales or scrolls the map in the user interface of the map application.
In response to detecting the gesture on the touch-sensitive surface (1106), in accordance with a determination that the gesture includes more than a predetermined number of concurrently detected contacts, the predetermined number being greater than two (e.g., the predetermined number is three) (e.g., as in a four- or five-finger swipe gesture, a four- or five-finger pinch gesture, or a combination of a four- or five-finger swipe and pinch gesture), and that the movement of the concurrently detected contacts during the gesture meets first criteria (e.g., previous-application criteria, where the previous-application criteria require that the gesture include synchronized movement of the predetermined number of concurrently detected contacts on the touch-sensitive surface in a first direction (e.g., horizontally, leftward or rightward) in order to meet criteria for recognizing a multi-finger swipe input in the first direction), the device switches (1108) from displaying the user interface of the first application to displaying the user interface of a second application, different from the first application, of the multiple applications (e.g., the second application is the application that was last displayed before the first application was displayed, in accordance with an application stack that lists the applications based on how recently they were last used (e.g., displayed) on the device). This is illustrated in Figures 5C10 to 5C12, 5C33 to 5C36, and 5C37 to 5C42, where the device switches from displaying the user interface of the map application to the user interface of the email application in accordance with a determination that the gesture performed by the multiple contacts (e.g., more than two) meets the previous-application criteria (e.g., the criteria for a side swipe to go to the previous/next application 100x4, as described with respect to Figures 9A to 9C and 10A to 10D).
In response to detecting the gesture on the touch-sensitive surface (1106), in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts (e.g., the predetermined number is greater than two, such as three) (e.g., as in a four- or five-finger swipe gesture, a four- or five-finger pinch gesture, or a combination of a four- or five-finger swipe and pinch gesture), and that the movement of the concurrently detected contacts during the gesture meets second criteria, different from the first criteria (e.g., the previous-application criteria) (e.g., home-navigation criteria, where the home-navigation criteria require that the gesture include synchronized movement of the predetermined number of concurrently detected contacts on the touch-sensitive surface in a second direction (e.g., vertically, upward or downward) in order to meet criteria for recognizing a multi-finger swipe input in the second direction, or that the gesture include simultaneous movement of the predetermined number of concurrently detected contacts on the touch-sensitive surface toward a common locus (e.g., static or moving) in order to meet criteria for recognizing a multi-finger pinch gesture, in order to meet the home-navigation criteria), the device switches (1112) from displaying the user interface of the first application to displaying a user interface that includes application icons for opening corresponding applications of the multiple applications installed on the device (e.g., a system user interface, such as a home screen user interface or an application launcher user interface) (e.g., on the home screen user interface or application-selection user interface, the applications are displayed in a predetermined arrangement without regard to how recently they were used on the device). This is illustrated in Figures 5C12 to 5C16, 5C27 to 5C29, and 5C43 to 5C47, where the device switches from displaying the user interface of the map application to the home screen user interface in accordance with a determination that the gesture performed by the multiple contacts (e.g., more than two) meets the home-navigation criteria (e.g., the criteria for navigating to the home screen 100x2, 100x3, as described with respect to Figures 9A to 9C and 10A to 10D).
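By way of illustration only, the contact-count dispatch of method 1100 can be summarized in the following Swift sketch. The GestureOutcome and MultiContactGesture types are placeholders, the handling of contact counts between two and the predetermined number is an assumption not spelled out above, and the time-threshold branch anticipates the determination (1130) described further below.

```swift
// Simplified dispatch for a multi-contact gesture: two-finger gestures stay inside
// the application, while gestures with more than the predetermined number of
// contacts are routed to system-level navigation. Criteria checks are stubbed out;
// their actual definitions are the first/second/third/fourth criteria of method 1100.
enum GestureOutcome {
    case appOperation            // pass the gesture to the application
    case previousApp             // first criteria met: switch applications
    case homeScreen              // second criteria met: go to the home screen
    case appSwitcher             // third criteria met: show recently used apps
    case keepCurrentApp          // fourth criteria met: ignore or cancel
}

struct MultiContactGesture {
    var contactCount: Int
    var movementBeganAfterTimeThreshold: Bool  // late movement stays in-app (see 1130)
    var meetsFirstCriteria: Bool               // e.g., horizontal multi-finger swipe
    var meetsSecondCriteria: Bool              // e.g., long upward swipe or large pinch
    var meetsThirdCriteria: Bool               // e.g., shorter upward swipe or smaller pinch
}

func resolve(_ g: MultiContactGesture, predeterminedCount: Int = 3) -> GestureOutcome {
    // Exactly two contacts: the gesture belongs to the application.
    if g.contactCount == 2 { return .appOperation }
    // Counts at or below the system-gesture threshold are also left to the
    // application in this sketch (the text does not spell this case out).
    guard g.contactCount > predeterminedCount else { return .appOperation }
    // Movement that begins only after the time threshold is passed to the application.
    if g.movementBeganAfterTimeThreshold { return .appOperation }
    if g.meetsFirstCriteria  { return .previousApp }
    if g.meetsSecondCriteria { return .homeScreen }
    if g.meetsThirdCriteria  { return .appSwitcher }
    return .keepCurrentApp
}
```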
In some embodiments, the first criteria (e.g., the previous-application criteria, e.g., the criteria for navigating to the previous or next application 100x4 in Figures 9A to 9C and 10A to 10D) require (1114) that the gesture include more than a first threshold amount of movement in the first direction (e.g., on the touch-sensitive surface, in the direction corresponding to movement toward the right edge of the display) (e.g., a movement parameter (e.g., speed and/or distance, etc.) of the movement performed by the concurrently detected contacts exceeds a first threshold set for that movement parameter, e.g., as described with respect to Figures 9A to 9C and 10A to 10D) in order for the first criteria to be met (e.g., a horizontal four- or five-finger swipe on the touch screen or touch-sensitive surface that exceeds a threshold distance or exceeds a threshold speed meets the first criteria). For example, this is illustrated in Figures 5C10 to 5C12, 5C33 to 5C36, and 5C37 to 5C42. Requiring that the gesture include more than a threshold amount of movement in the respective direction in order to meet the first criteria (e.g., the criteria for navigating to another application) enhances the operability of the device and makes the user-device interface more efficient (e.g., by performing an operation when a set of conditions has been met without requiring further user input, reducing the number of inputs needed to perform an operation, and providing functionality without cluttering the user interface with additional controls); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
In some embodiments, the second criteria (e.g., swipe-based home-navigation criteria, e.g., the criteria for navigating to the home screen 100x2 or 100x3 in Figures 9A to 9C and 10A to 10D) require (1116) that the gesture include more than a second threshold amount of movement in a second direction (e.g., the second direction is perpendicular to the first direction) (e.g., on the touch-sensitive surface, in the direction corresponding to movement toward the top edge of the display) (e.g., a movement parameter (e.g., speed and/or distance, etc.) of the movement performed by the concurrently detected contacts exceeds a second threshold set for that movement parameter, e.g., as described with respect to Figures 9A to 9C and 10A to 10D) in order for the second criteria (e.g., the swipe-based home-navigation criteria) to be met (e.g., a vertical (e.g., upward) four- or five-finger swipe on the touch screen or touch-sensitive surface that exceeds a predetermined threshold distance (e.g., a threshold larger than the threshold used for the swipe-based multitasking-navigation criteria) or exceeds a predetermined threshold speed (e.g., a threshold larger than the threshold used for the swipe-based multitasking-navigation criteria) meets a first version of the second criteria (e.g., the swipe-based home-navigation criteria)). For example, this is illustrated in Figures 5C13 to 5C16 and 5C43 to 5C47. Requiring that the gesture include more than a threshold amount of movement in the respective direction (e.g., a direction different from the direction for navigating to another application) in order to meet the second criteria (e.g., the criteria for navigating to the home screen) enhances the operability of the device and makes the user-device interface more efficient (e.g., by performing an operation when a set of conditions has been met without requiring further user input, reducing the number of inputs needed to perform an operation, and providing functionality without cluttering the user interface with additional controls); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
In some embodiments, the second criteria (e.g., pinch-based home-navigation criteria (e.g., as an alternative to or in addition to the swipe-based home-navigation criteria), e.g., the criteria for navigating to the home screen 100x2 or 100x3 in Figures 9A to 9C and 10A to 10D) require (1118) that the gesture include more than a third threshold amount of movement of the concurrently detected contacts toward one another (e.g., a movement parameter (e.g., speed and/or distance, etc.) of the movement of the concurrently detected contacts toward one another (e.g., as represented by a common static or moving locus) exceeds a third threshold set for that movement parameter, e.g., as described with respect to Figures 9A to 9C and 10A to 10D) in order for the second criteria (e.g., the pinch-based home-navigation criteria) to be met (e.g., a four- or five-finger pinch movement that exceeds a predetermined threshold distance (e.g., a threshold larger than the threshold used for the pinch-based multitasking-navigation criteria) or exceeds a predetermined threshold speed (e.g., a threshold larger than the threshold used for the pinch-based multitasking-navigation criteria) meets a second version of the second criteria (e.g., the pinch-based home-navigation criteria)). For example, this is illustrated in Figures 5C27 to 5C29 and 5C43 to 5C47. Requiring that the gesture include more than a threshold amount of movement of the contacts toward a common locus (e.g., as an alternative to or in addition to the swipe-based home-navigation criteria) in order to meet the second criteria (e.g., the criteria for navigating to the home screen) enhances the operability of the device and makes the user-device interface more efficient (e.g., by performing an operation when a set of conditions has been met without requiring further user input, reducing the number of inputs needed to perform an operation, and providing functionality without cluttering the user interface with additional controls); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
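Taken together, 1116 and 1118 say that the second criteria can be met by either a long upward multi-finger swipe or a large multi-finger pinch. The following sketch expresses that disjunction; the threshold names are placeholders.

```swift
// Sketch of the two alternative ways the home-navigation (second) criteria can be met.
struct HomeNavigationThresholds {
    var secondThreshold: Double   // upward travel required for a home swipe (1116)
    var thirdThreshold: Double    // pinch travel required for a home pinch (1118)
}

func meetsHomeNavigationCriteria(upwardMovement: Double,
                                 pinchMovement: Double,
                                 t: HomeNavigationThresholds) -> Bool {
    // Either version of the second criteria is sufficient on its own.
    return upwardMovement > t.secondThreshold || pinchMovement > t.thirdThreshold
}
```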
In some embodiments, in method 1100, in response to detecting the gesture on the touch-sensitive surface (1106), in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts (e.g., the predetermined number is greater than two, such as three) (e.g., as in a four- or five-finger swipe gesture, a four- or five-finger pinch gesture, or a combination of a four- or five-finger swipe and pinch gesture), and that the movement of the concurrently detected contacts during the gesture meets third criteria (e.g., multitasking-navigation criteria, where, for a substantially identical gesture type (e.g., a multi-finger directional swipe gesture or a multi-finger pinch gesture), the multitasking-navigation criteria have different thresholds for the characteristic movement parameters of the contacts than the home-navigation criteria) (e.g., the third criteria are different from the first criteria (e.g., the previous-application criteria) and the second criteria (e.g., the home-navigation criteria) (e.g., the criteria for navigating to the application switcher 100x6 or 100x8 in Figures 9A to 9C and 10A to 10D)), the device switches (1120) from displaying the user interface of the first application to displaying a user interface that includes respective representations of multiple recently active applications on the device (e.g., a multitasking user interface in which representations of applications are displayed based on how recently the applications were actively used (e.g., displayed in the foreground), and which, when selected, cause the device to display the corresponding application). For example, this is illustrated in Figures 5C17 to 5C19 and 5C30 to 5C32. Using the number of contacts to distinguish application-level inputs from system-level inputs enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide the inputs needed to achieve an intended result, and by reducing user mistakes when operating/interacting with the device); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device. Furthermore, allowing the user to choose, based on different criteria, between navigating to another application, to the home screen user interface, or to the multitasking user interface, in addition to choosing to perform an operation within the application, also enhances the operability of the device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to perform an operation); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
In some embodiments, the third criteria (e.g., swipe-based multitasking-navigation criteria, e.g., the criteria for navigating to the application switcher user interface 100x6 or 100x8 in Figures 9A to 9C and 10A to 10D) require (1122) that the input include more than a fourth threshold amount of movement (e.g., the threshold used for activating the user-interface navigation process) and less than a fifth threshold amount of movement (e.g., the threshold used for the swipe-based home-navigation criteria) in the second direction (e.g., on the touch-sensitive surface, in the direction corresponding to movement toward the top edge of the display) (e.g., the same movement direction required by the first version of the home-navigation criteria (e.g., the swipe-based home-navigation criteria) for navigating to the home screen user interface) (e.g., the criteria and thresholds described with respect to Figures 9A to 9C and 10A to 10D) in order for the third criteria (e.g., the swipe-based multi-finger multitasking-navigation criteria) to be met. In some embodiments, the fourth and fifth threshold amounts of movement are based on a movement parameter (e.g., speed and/or distance, etc.) of the movement performed by the concurrently detected contacts and define a predefined threshold range set for that movement parameter for the swipe-based multitasking-navigation criteria (e.g., as described with respect to Figures 9A to 9C and 10A to 10D). For example, this is illustrated in Figures 5C13 to 5C16 (for going to the home screen) and 5C17 to 5C19 (for going to the application switcher), where the vertical movement of the contacts needed to go to the application switcher is smaller than the vertical movement of the contacts needed to go to the home screen. Requiring that the gesture include movement bounded within a threshold range (e.g., more than the fourth threshold amount of movement and less than the fifth threshold amount of movement) in the respective direction (e.g., a direction different from the direction for navigating to another application and the same as the direction for navigating to the home screen) in order to meet the third criteria (e.g., the criteria for navigating to the application switcher) enhances the operability of the device and makes the user-device interface more efficient (e.g., by performing an operation when a set of conditions has been met without requiring further user input, reducing the number of inputs needed to perform an operation, and providing functionality without cluttering the user interface with additional controls); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
In some embodiments, the third criteria (e.g., pinch-based multitasking-navigation criteria (e.g., as an alternative to or in addition to the swipe-based multitasking-navigation criteria), e.g., the criteria for navigating to the application switcher user interface 100x6 or 100x8 in Figures 9A to 9C and 10A to 10D) require (1124) that the input include less than a sixth threshold amount of movement of the concurrently detected contacts toward one another (e.g., the same threshold amount of movement required by the pinch-based home-navigation criteria) (e.g., the criteria and thresholds described with respect to Figures 9A to 9C and 10A to 10D) in order for the third criteria (e.g., the pinch-based multitasking-navigation criteria) to be met. In some embodiments, the sixth threshold amount of movement is based on a movement parameter (e.g., speed and/or distance, etc.) of the movement of the concurrently detected contacts toward one another (e.g., as represented by a common static or moving locus), and is the same as the corresponding threshold set for that movement parameter by the pinch-based home-navigation criteria. For example, if a multi-finger pinch exceeds the threshold amount of pinch movement, the device displays the home screen user interface; if a multi-finger pinch does not exceed the threshold amount of pinch movement (but exceeds the threshold amount of movement set for activating the user-interface navigation process), the device displays the multitasking user interface. For example, this is illustrated in Figures 5C27 to 5C29 (for going to the home screen) and 5C30 to 5C32 (for going to the application switcher), where the movement of the contacts toward one another needed to go to the application switcher is smaller than the movement of the contacts toward one another needed to go to the home screen. Requiring that the gesture include less than a threshold amount of movement of the contacts toward one another (e.g., as opposed to requiring more than the threshold amount of movement in order to go to the home screen) in order to meet the third criteria (e.g., the criteria for navigating to the application switcher user interface) enhances the operability of the device and makes the user-device interface more efficient (e.g., by performing an operation when a set of conditions has been met without requiring further user input, reducing the number of inputs needed to perform an operation, and providing functionality without cluttering the user interface with additional controls); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
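Similarly, 1122 and 1124 bound the third criteria between an activation amount and the home-navigation thresholds. The sketch below assumes a single activation lower bound for the pinch branch, which the text does not name explicitly; all parameter names are placeholders.

```swift
// Sketch of the application switcher (third) criteria: an upward swipe whose travel
// falls between the fourth and fifth thresholds, or a pinch whose travel exceeds the
// activation amount but stays under the sixth threshold.
func meetsAppSwitcherCriteria(upwardMovement: Double,
                              pinchMovement: Double,
                              fourthThreshold: Double,      // lower bound for the swipe (1122)
                              fifthThreshold: Double,       // upper bound: home-swipe threshold
                              activationThreshold: Double,  // lower bound for the pinch (assumed name)
                              sixthThreshold: Double) -> Bool {
    let swipeInBand = upwardMovement > fourthThreshold && upwardMovement < fifthThreshold
    let pinchInBand = pinchMovement > activationThreshold && pinchMovement < sixthThreshold
    return swipeInBand || pinchInBand
}
```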
In some embodiments, in response to detecting the gesture on the touch-sensitive surface (1106), in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts (e.g., the predetermined number is greater than two, such as three) (e.g., as in a four- or five-finger swipe gesture, a four- or five-finger pinch gesture, or a combination of a four- or five-finger swipe and pinch gesture), and that the movement of the concurrently detected contacts during the gesture meets fourth criteria (e.g., current-application-display criteria (e.g., criteria for ignoring accidental inputs, or criteria for swiping downward or de-pinching to cancel)) (e.g., lift-off of the contacts is detected while the representation of the application is close to its initial size and/or while the representation of the application is growing and moving toward the bottom of the display) (e.g., the criteria for maintaining display of the current application and ignoring accidental inputs 100x7 or 100x9 in Figures 9A to 9C and 10A to 10D), the device maintains (1126) display of the first application on the display. For example, the device displays some visual feedback (e.g., the currently displayed user interface shrinks slightly), giving the user an indication that continuing the gesture would trigger the user-interface navigation process, but if the gesture does not continue, the device restores the currently displayed user interface. For example, this is illustrated in Figures 5C20 to 5C22, where the map user interface is maintained after a small sideways swipe gesture performed by four concurrent contacts ends. Allowing the device to cancel the effect of a navigation gesture and restore the currently displayed application user interface based on the gesture meeting the fourth criteria (e.g., criteria for ignoring accidental inputs or cancellation inputs) enhances the operability of the device and makes the user-device interface more efficient (e.g., by performing an operation when a set of conditions has been met without requiring further user input, reducing the number of inputs needed to perform an operation, and providing functionality without cluttering the user interface with additional controls); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
In some embodiments, the fourth criteria (e.g., the current-application-display criteria (e.g., criteria for ignoring accidental inputs, or criteria for swiping downward or pinching to cancel)) (e.g., lift-off of the contacts is detected while the representation of the application is close to its initial size and/or while the representation of the application is growing and moving toward the bottom of the display) require (1128) that the input include less than a seventh threshold amount of movement performed by the concurrently detected contacts (e.g., a small amount of net movement whose start and end points are very close to each other) (e.g., the seventh threshold amount of movement is the same as the threshold amount of movement that triggers navigation to the multitasking user interface (e.g., the same threshold as the lower limit of the range set for the swipe-based or pinch-based multitasking-navigation criteria)) (e.g., the criteria and thresholds described with respect to 100x7 or 100x9 in Figures 9A to 9C and 10A to 10D) (e.g., movement of the contacts toward one another within a threshold amount of time of initially detecting the contacts, and/or synchronized movement in the first direction (e.g., toward the top edge of the display)) in order for the fourth criteria to be met. For example, when the gesture includes less than the threshold amount of pinch movement performed by the multiple contacts and less than the threshold amount of swipe movement in the first direction at the time the gesture terminates, the gesture meets the fourth criteria, and after the gesture terminates the device does not navigate from the currently displayed user interface to another user interface. Allowing the device to cancel the effect of a navigation gesture and restore the currently displayed application user interface when the input includes less than a threshold amount of movement enhances the operability of the device and makes the user-device interface more efficient (e.g., by performing an operation when a set of conditions has been met without requiring further user input, reducing the number of inputs needed to perform an operation, and providing functionality without cluttering the user interface with additional controls); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
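A minimal sketch of the fourth criteria's movement test follows; the Euclidean net-movement measure and the threshold name are assumptions made for illustration.

```swift
// Sketch of the fourth criteria (1126/1128): if the multi-contact gesture ends with
// very little net movement (below the seventh threshold), the navigation gesture is
// treated as cancelled and the current application stays on screen.
func shouldKeepCurrentApp(netDx: Double, netDy: Double,
                          seventhThreshold: Double) -> Bool {
    // Net displacement between the gesture's start and end points (assumed measure).
    let netMovement = (netDx * netDx + netDy * netDy).squareRoot()
    return netMovement < seventhThreshold
}
```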
In some embodiments, in response to detecting the gesture on the touch-sensitive surface (1106), in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts and that the movement of the concurrently detected contacts on the touch-sensitive surface occurs after at least a threshold amount of time has elapsed since initial detection of the multiple contacts on the touch-sensitive surface, the device performs (1130) an operation within the first application in accordance with the gesture (e.g., rather than navigating to another system-level user interface (e.g., outside the first application), the device performs an application-specific operation within the application (e.g., translating or scaling the user interface of the application, deleting an item in a list, etc.)). For example, this is illustrated in Figures 5C23 to 5C26, where a multi-finger swipe gesture causes scrolling of the map in the user interface of the map application when the movement of the contacts starts after time threshold TT1. Using a time threshold to allow a gesture with more than the predetermined number of contacts to be passed to the first application and used to perform an operation within the application (e.g., as opposed to triggering navigation to a user interface outside the application) enhances the operability of the device and makes the user-device interface more efficient (e.g., by performing an operation when a set of conditions has been met without requiring further user input); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
In some embodiments, in method 1100, the device detects (1132) relative movement of the concurrently detected contacts toward one another on the touch-sensitive surface during the gesture (e.g., as in a multi-finger pinch gesture); and, in accordance with the relative movement of the concurrently detected contacts toward one another (e.g., as in a multi-finger pinch gesture), the device adjusts the size of a representation of the user interface of the first application (e.g., reduces its size) (e.g., dynamically adjusts the size of a screenshot of the user interface of the first application in accordance with the relative movement of the concurrently detected contacts toward one another). For example, this is illustrated in Figures 5C27 to 5C28, 5C30 to 5C31, 5C33 to 5C34, 5C37 to 5C40, and 5C55 to 5C56. For example, criteria for providing dynamic visual feedback, e.g., as reflected in the size of the representation of the user interface of the first application, are described with respect to Figures 9A to 9C and 10A to 10D. Providing visual feedback in accordance with the relative movement of the concurrently detected contacts toward one another (e.g., adjusting the size of the representation of the user interface of the first application) enhances the operability of the device and makes the user-device interface more efficient (e.g., by conveying the internal state of the device, helping the user provide the inputs needed to achieve an intended result, and reducing user mistakes when operating/interacting with the device); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
In some embodiments, in method 1100, the device detects (1134) movement of the concurrently detected contacts in a respective direction on the touch-sensitive surface (e.g., simultaneous and synchronized movement in substantially the same direction at substantially the same speed), the respective direction corresponding to movement on the display toward a predefined edge of the display (e.g., the top edge) (e.g., as in a multi-finger directional swipe gesture); and, in accordance with the movement of the concurrently detected contacts in the respective direction (e.g., as in a multi-finger swipe gesture), the device adjusts the size of the representation of the user interface of the first application (e.g., reduces its size) (e.g., dynamically adjusts the size of a screenshot of the user interface of the first application in accordance with the movement of the concurrently detected contacts toward the top edge of the display). In some embodiments, the size of the representation of the user interface of the first application is adjusted based on the movement of the concurrently detected contacts in the respective direction (e.g., upward) and the movement of the contacts toward one another. For example, this is illustrated in Figures 5C13 to 5C15, 5C17 to 5C18, 5C39 to 5C40, 5C44 to 5C45, and 5C56 to 5C57. For example, criteria for providing dynamic visual feedback, e.g., as reflected in the size of the representation of the user interface of the first application, are described with respect to Figures 9A to 9C and 10A to 10D. Providing visual feedback in accordance with the movement of the concurrently detected contacts in the respective direction toward the respective edge of the display (e.g., adjusting the size of the representation of the user interface of the first application) enhances the operability of the device and makes the user-device interface more efficient (e.g., by conveying the internal state of the device, helping the user provide the inputs needed to achieve an intended result, and reducing user mistakes when operating/interacting with the device); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
In some embodiments, in method 1100, the device concurrently detects (1136) a first movement of the concurrently detected contacts in a respective direction on the touch-sensitive surface and a second movement of the concurrently detected contacts toward one another; in accordance with the first movement of the concurrently detected contacts in the respective direction (e.g., the swipe component of the gesture), the device moves the representation of the user interface of the first application on the display; and, in accordance with the second movement of the concurrently detected contacts toward one another (e.g., the pinch component of the gesture), the device adjusts (e.g., reduces) the size of the representation of the user interface of the first application on the display. For example, this is illustrated in Figures 5C33 to 5C35 and 5C37 to 5C41. For example, criteria for providing dynamic visual feedback, e.g., as reflected in the size and position of the representation of the user interface of the first application, are described with respect to Figures 9A to 9C and 10A to 10D. Providing visual feedback in accordance with the movement of the concurrently detected contacts (e.g., moving the representation of the user interface of the first application in accordance with the movement of the contacts in the respective direction, and adjusting the size of the representation of the user interface of the first application in accordance with the movement of the contacts toward one another) enhances the operability of the device and makes the user-device interface more efficient (e.g., by conveying the internal state of the device, helping the user provide the inputs needed to achieve an intended result, and reducing user mistakes when operating/interacting with the device); in addition, by enabling the user to use the device more quickly and efficiently, it reduces power usage and extends the battery life of the device.
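The dynamic visual feedback of 1132 to 1136 can be sketched as a card whose scale and position track the pinch and swipe components of the gesture. The particular blending of the two components and the numeric floors below are assumptions, not values taken from the figures.

```swift
// Illustrative sketch of the feedback card representing the first application's user
// interface: it shrinks as the contacts pinch together and/or move toward the top
// edge, and translates with the swipe component.
struct FeedbackCard {
    var scale: Double = 1.0          // 1.0 == full-screen representation
    var offsetY: Double = 0.0        // upward displacement of the card, in points

    mutating func update(pinchFraction: Double,          // 0 = no pinch, 1 = fully pinched
                         upwardTravelFraction: Double) { // 0...1 of the screen height
        let pinchShrink = 0.5 * min(max(pinchFraction, 0), 1)
        let swipeShrink = 0.5 * min(max(upwardTravelFraction, 0), 1)
        scale = max(0.3, 1.0 - pinchShrink - swipeShrink)   // never shrink below a floor
        offsetY = -upwardTravelFraction * 200                // move the card with the swipe
    }
}
```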
In some embodiments, in method 1100, the sum for the contact that equipment detection (1138) is detected simultaneously by exists Change (for example, increasing or decreasing) during gesture is (for example, due to the lift of one or more of the contact that is detected simultaneously by From, and/or the touch-down due to one or more supplementary contacts on touch sensitive surface), wherein the first standard or the second standard The sum for the contact being detected simultaneously by is not asked to be remained unchanged during gesture to meet the first standard or the second standard.For example, This shows in Fig. 5 C33 to Fig. 5 C36, is lifted away from during gesture wherein contacting, and equipment navigates to not in response to gesture Same application program.It is maintained at the sum of the contact on touch sensitive surface during allowing user to change navigation gesture, enhances equipment Operability, and make user-equipment interface it is more effective (for example, by using family be easier to provide with required input come it is real Existing expected results, and user's mistake when reducing operation equipment/interact with equipment), in addition, can be more rapidly by using family And equipment is efficiently used, reduce electricity usage and extends the battery life of equipment.
In some embodiments, in method 1100, change of the equipment in the sum for detecting the contact being detected simultaneously by After change, remaining contacts the additional movement on touch sensitive surface for detection (1140), wherein in the additional shifting for detecting remaining contact Meet the first standard or the second standard (or third standard or the 4th standard) after dynamic.For example, this is in Fig. 5 C33 to Fig. 5 C36 It shows, wherein two contacts detect the additional movement of three contacts during gesture after being lifted away from, and equipment is in response to this Gesture and navigate to different application programs.User is allowed to continue navigation gesture and still after one or more contact is lifted away from Meet for the respective standard in the application program external navigation, enhances the operability of equipment, and make user-equipment circle Face it is more effective (for example, be easier to provide with required input by using family to realize expected results, and reduce operation equipment/ User's mistake when being interacted with equipment), in addition, can faster and more effectively use equipment by using family, reduce electric power Using and extend the battery life of equipment.
In some embodiments, detection gesture includes (1142): the first part of detection gesture and detection gesture Second part after gesture first part, wherein the first part of gesture include at least predetermined quantity while detect connect The synchronizing moving (for example, such as gently sweeping in input in mostly finger) of touching in the corresponding direction, the second part of gesture includes at least predetermined The movement (for example, such as more referring in kneading gesture) of the contact detected while quantity toward each other, and it is in one's hands detecting After the first part of gesture and second part, meet at least one of the first standard and the second standard.For example, this is in Fig. 5 C43 It is shown into Fig. 5 C47, gently sweeps input wherein detecting more and referring to more before referring to kneading input, and meet for showing home The standard of screen.User is allowed to initiate navigation gesture in the first way (for example, passing through the contact being detected simultaneously by respective direction On movement) and continue navigation gesture (for example, contact movement toward each other by being detected simultaneously by) in different ways simultaneously Still meet for the respective standard (for example, the first standard or second standard) in the application program external navigation, enhances equipment Operability, and make user-equipment interface it is more effective (for example, by using family be easier to provide with required input come it is real Existing expected results, and user's mistake when reducing operation equipment/interact with equipment), in addition, can be more rapidly by using family And equipment is efficiently used, reduce electricity usage and extends the battery life of equipment.
In some embodiments, detection gesture includes (1144): the Part III of detection gesture and detection gesture Part IV after gesture Part III, wherein the Part III of gesture include at least predetermined quantity while detect connect The movement (for example, such as in more referring to kneading input) of touching toward each other, the Part IV of gesture includes the same of at least predetermined quantity When detect contact in the corresponding direction synchronizing moving (for example, such as mostly finger gently sweep in gesture), and detection it is in one's hands After the Part III and Part IV of gesture, meet at least one of the first standard and the second standard.For example, this is in Fig. 5 C33 It shows, is inputted wherein being detected before mostly finger gently sweeps input and referring to mediate more, and full into Fig. 5 C36, Fig. 5 C37 to Fig. 5 C42 Foot is for showing the standard of an application program.User is allowed to initiate navigation gesture in the first way (for example, pass through while examining The movement of the contact measured toward each other) and continue navigation gesture in different ways (for example, existing by the contact being detected simultaneously by Movement in respective direction) and still meet for the respective standard in the application program external navigation, enhance grasping for equipment The property made, and keep user-equipment interface more effective (for example, being easier to provide with required input by using family to realize expection As a result, and user's mistake when reducing operation equipment/interact with equipment), in addition, can more rapidly and effectively by using family Ground uses equipment, reduces electricity usage and extends the battery life of equipment.
In some embodiments, it is detected in the central part at any edge of the separate touch sensitive surface of touch sensitive surface (1146) initial part of gesture.For example, the gesture is not that gesture is gently swept at edge.In some embodiments, by coming from bottom The edge that the single contact at edge carries out gently sweeps gesture and recalls taskbar, and the single light continuation for sweeping gesture of contact can be with base In triggering user interface navigation process as described herein for different groups of standards of multi-finger gesture, lead to multitask user circle Face or the application program previously shown or home on-screen user interface.For example, this is in Fig. 5 C1, Fig. 5 C4, Fig. 5 C7, figure 5C10, Fig. 5 C13, Fig. 5 C17, Fig. 5 C20, Fig. 5 C23, Fig. 5 C25, Fig. 5 C27, Fig. 5 C30, Fig. 5 C33, Fig. 5 C37, Fig. 5 C43, figure It is shown in 5C48, Fig. 5 C51 and Fig. 5 C55.Allow user in the central part at any edge of the separate touch sensitive surface of touch sensitive surface It initiates gesture (for example, referring to navigation gesture) in point more, enhances the operability of equipment, and have user-equipment interface more Effect (for example, being easier to provide with required input by using family to realize expected results, and reduces operation equipment/and equipment User's mistake when interaction), in addition, can faster and more effectively use equipment by using family, reduce electricity usage simultaneously And extend the battery life of equipment.
In some embodiments, corresponding one in the first standard and the second standard does not require (1148) to detect multiple Contact be lifted away from meet in the first standard and the second standard this is one corresponding (for example, before detecting being lifted away from of contact Identify the gesture).For example, in some embodiments, gesture makes equipment in the lift for detecting contact in screen intermediate suspension From showing multi-task user interface before.In some embodiments, if detecting being lifted away from for contact in current time, The end-state of the UI feedback indicative user interface shown during gesture.Do not require being lifted away to meet for answering for detection contact With the standard of program external navigation, the operability of equipment is enhanced, and keeps user-equipment interface more effective (for example, passing through So that user is easier to provide with required input to realize expected results, and shorten realize expected results needed for the time), this Outside, equipment can faster and more effectively be used by using family, reduce electricity usage and extends the battery longevity of equipment Life.
It should be appreciated that the particular order that the operation in Figure 11 A to Figure 11 F is described is only exemplary, not It is intended to indicate that the sequence is the unique order that can execute these operations.Those skilled in the art will recognize that a variety of sides Formula resequences to operations described herein.Additionally, it should be noted that herein in regard to other methods (example as described herein Such as, the details of method 600,700,800,900 and other processes 1000) described is equally suitable for closing above in a similar way In the method 1100 of Figure 11 A to Figure 11 F description.For example, contact, gesture, user interface pair above with reference to the description of method 1100 As, tactile output, intensity threshold, focus selector and animation optionally have herein with reference to other methods (example as described herein Such as, method 600,700,800,900 with 1000) description contact, gesture, user interface object, tactile output, intensity threshold, One or more of focus selector, feature of animation.For brevity, these details are not repeated herein.
Operation above with reference to Figure 11 A to Figure 11 F description is optionally implemented by component that Figure 1A describes into Figure 1B.Example Such as, display operation 1102, detection operation 1104,1132,1134,1136 and 1138, execution operation 1108 and 1130, handover operation 1110,1112 and 1120, and keep just doing 1126 optionally by event classifier 170, event recognizer 180 and event handling Program 190 is realized.Event monitor 171 in event classifier 170 detects the contact on touch-sensitive display 112, and Event information is transmitted to application program 136-1 by event dispatcher module 174.The corresponding event identifier of application program 136-1 Event information is defined 186 with corresponding event by 180 to be compared, and determines the first contact on touch sensitive surface at first position Whether whether (or the rotation of the equipment) corresponds to predefined event or subevent, such as choosing to the object in user interface It selects or the equipment is orientated the rotation to another orientation from one.When detecting corresponding predefined event or subevent, Event recognizer 180 activates button.onrelease 190 associated with the detection to the event or subevent.Button.onrelease 190 optionally using or call data renovator 176 or object renovator 177 to carry out more new application internal state 192.One In a little embodiments, button.onrelease 190 accesses corresponding GUI renovator 178 and carrys out content shown by more new application.It is similar Ground, those skilled in the art can know clearly based in Figure 1A, into Figure 1B, how discribed component can realize other mistakes Journey.
For illustrative purposes, the description of front is described by reference to specific embodiment.However, example above The property shown discussion is not intended to exhausted or limits the invention to disclosed precise forms.According to above teaching content, very More modifications and variations are all possible.Selection and description embodiment are to most preferably illustrate the principle of the present invention And its practical application, so as to so that others skilled in the art most preferably can be suitable for being conceived using having The described embodiment of the invention and various of the various modifications of special-purpose.

Claims (86)

1. a kind of method, comprising:
At the equipment with touch-sensitive display:
The first user interface is shown on the display, wherein first user interface is different from home screen user circle Face, the home on-screen user interface include and the different application phase in installation multiple application programs on said device Corresponding multiple application icons;
When showing first user interface on the display, detects and connect in the first edge of the display by first The first input that touching carries out;And
In response to detecting first input on the edge of the display, and described in the display When continuing to test the described first contact in first edge:
According to determination it is described first input be detected in the first part of the first edge of the display and First input meets taskbar and shows standard, shows at the first position along the first edge of the display With multiple application program image target taskbars;And
According to the institute different from the first edge that determination first input is in the first edge of the display It states first input detect on the second part of first part and described and meets the taskbar display standard, along institute State the first edge of display be selected as include the first edge of the display the second part The second place shows the taskbar, wherein the second position is different from the first position.
2. according to the method described in claim 1, wherein along the first position of the first edge of the display It does not include the second part of the first edge of the display.
3. method according to claim 1 or 2, wherein the second of the first edge along the display Set do not include the display the first edge the first part.
4. according to the method in any one of claims 1 to 3, further includes: when showing described first on the display When user interface is without showing the taskbar:
The second input carried out in the second edge of the display by the second contact is detected, the second edge is different from institute State the first edge of display;And
Display has the multiple application program image target at the third place along the second edge of the display The taskbar.
5. according to the method described in claim 4, further include: when show on the display first user interface without When showing the taskbar:
It detects and is inputted on the third edge of the display by the third that third contact carries out, the third edge is different from institute State the first edge of display and the second edge of the display;And
Display has the multiple application program image target at the 4th position along the third edge of the display The taskbar.
6. the method according to any one of claims 1 to 5, further includes: when along described the first of the display The taskbar is shown at the first position at edge, while continuing to detect first contact on the display When:
Detect the first contact being lifted away from from the display;And
In response to detecting being lifted away from for first contact, according to determining when showing the taskbar, first contact is moved It is dynamic to be less than threshold quantity, after being lifted away from described in first contact, keep the taskbar on the display described Display above first user interface.
7. according to the method described in claim 6, further include:
In response to detecting being lifted away from for first contact, according to the determination when showing the taskbar, described first is connect Touching is mobile to be less than the threshold quantity, after being lifted away from described in first contact, expands above first user interface The size of the taskbar of display.
8. method according to claim 6 or 7, further includes:
In response to detecting being lifted away from for first contact, according to the determination when showing the taskbar, described first is connect Touching is mobile to be less than the threshold quantity, by the display of the taskbar from described the of the first edge along the display One position is moved into the third predetermined position of the first edge of display.
9. method according to any one of claim 1 to 8, further includes:
When showing the taskbar at the first position of the first edge along the display:
First contact is detected along the first movement of the taskbar;And
In response to detecting the first movement of first contact, according to the current location selection of first contact Corresponding application programs icon in taskbar;And
After detecting first movement of first contact along the first edge, detects described first and contact from described Display is lifted away from;And
In response to detect it is described first contact described in be lifted away from, according to determine detect it is described first contact described in be lifted away from When, currently select the first application icon in the taskbar:
Open the first application program of the first application program image target corresponded in the taskbar;And
The display of first user interface is replaced with the display at the second user interface for first application program.
10. method according to any one of claim 1 to 9, further includes: when along described the first of the display When showing the taskbar at the first position at edge:
Detect the movement of first contact on the display;And
In response to detecting that it is aobvious with the first application program image target in the taskbar that the contact is located on the display Show corresponding position, selects first application icon.
11. according to the method described in claim 10, further include: when selecting first application icon:
Detect the described first movement for contacting the first edge far from the display on the display;And
In response to detecting first contact on the display described in the first edge far from the display It is mobile, it according to determination first contact is detected at the position for the display for not corresponding to the taskbar, In At the position of the position of the display for not corresponding to the taskbar for corresponding to first contact on the display Show first application icon or its expression.
12. according to the method for claim 11, further includes: on the display correspond to it is described first contact not First application icon or its described expression are shown at the position of the position of the display corresponding to the taskbar When:
Detect being lifted away from for first contact;And
In response to the display for not corresponding to the taskbar for corresponding to first contact on the display When showing first application icon at the position of position, being lifted away from for first contact is detected:
Institute is replaced with the display at the second user interface for corresponding to application program associated with first application icon State display of first user interface in the first part of the display;And
Display of first user interface in the second part of the display is kept, the second part is not shown with described Show the first part overlapping of device.
13. method according to any one of claim 1 to 12, further includes:
When showing the taskbar at the first position of the first edge along the display:
Detect the movement of the first edge of first contact towards the display;And
In response to detecting the movement of the first edge of first contact towards the display, according to determining institute The movement for stating the first contact towards the first edge of the display meets taskbar and removes standard, stops display institute State taskbar.
14. method according to any one of claim 1 to 13, wherein the first edge of the display is described First part is in the first predefined subrange of the first edge of the display, and the first position is described The first predetermined position in described first predefined subrange of first edge.
15. according to claim 1 to method described in any one of 14, wherein the second of the first edge of the display Part is located in the second predefined subrange of the first edge of the display, and
When it is described first contact away from the first edge closer to it is described first contact the first neighboring edge at least threshold value away from From when, the taskbar that the second place is shown be centered at it is described first contact the position;And
When first neighboring edge of first contact away from the first edge is less than the threshold distance, described the The taskbar shown at two positions is shown as abutting first neighboring edge of the first edge.
16. according to claim 1 to method described in 15, wherein described in when the taskbar is shown at the first position The size of taskbar is greater than when the size of taskbar taskbar when the second place is shown.
17. according to claim 1 to method described in any one of 16, further includes: in response on the side of the display First input is detected in first input on edge, and continues to examine in the first edge of the display When measuring the described first contact:
Meet navigation gesture standard according to determination first input, wherein the navigation gesture standard includes detecting described the One contacts the amount of threshold shift of the first edge far from the display on the display to meet the navigation The requirement of gesture standard:
Into interim subscriber interface model, in the interim subscriber interface model, multiple and different user interface states is can With based on it is described first input one group of one or more attribute compared with corresponding one group of one or more threshold value come It is selected.
18. a kind of electronic equipment, comprising:
Touch-sensitive display;
One or more processors;With
Memory, the memory stores one or more programs, wherein one or more of programs are configured as by described One or more processors execute, and one or more of programs include the instruction for performing the following operation:
The first user interface is shown on the display, wherein first user interface is different from home screen user circle Face, the home on-screen user interface include and the different application phase in installation multiple application programs on said device Corresponding multiple application icons;
When showing first user interface on the display, detects and connect in the first edge of the display by first The first input that touching carries out;And
In response to detecting first input on the edge of the display, and described in the display When continuing to test the described first contact in first edge:
According to determination it is described first input be detected in the first part of the first edge of the display and First input meets taskbar and shows standard, shows at the first position along the first edge of the display With multiple application program image target taskbars;And
According to the institute different from the first edge that determination first input is in the first edge of the display It states first input detect on the second part of first part and described and meets the taskbar display standard, along institute State the first edge of display be selected as include the first edge of the display the second part The second place shows the taskbar, wherein the second position is different from the first position.
19. a kind of computer readable storage medium for storing one or more programs, one or more of programs include instruction, Described instruction makes the equipment when being executed by the electronic equipment with touch-sensitive display:
The first user interface is shown on the display, wherein first user interface is different from home screen user circle Face, the home on-screen user interface include and the different application phase in installation multiple application programs on said device Corresponding multiple application icons;
When showing first user interface on the display, detects and connect in the first edge of the display by first The first input that touching carries out;And
In response to detecting first input on the edge of the display, and described in the display When continuing to test the described first contact in first edge:
According to determination it is described first input be detected in the first part of the first edge of the display and First input meets taskbar and shows standard, shows at the first position along the first edge of the display With multiple application program image target taskbars;And
According to the institute different from the first edge that determination first input is in the first edge of the display It states first input detect on the second part of first part and described and meets the taskbar display standard, along institute State the first edge of display be selected as include the first edge of the display the second part The second place shows the taskbar, wherein the second position is different from the first position.
20. the graphic user interface in a kind of electronic equipment, the electronic equipment has touch-sensitive display, memory and is used for Execute the one or more processors for the one or more programs being stored in the memory, the graphic user interface packet It includes:
Rendering from the first application program, wherein the rendering is different from the rendering from home screen, the home screen Including multiple application icons corresponding with the different application in installation multiple application programs on said device;
Wherein:
It detects when showing first user interface on the display and leads in the first edge of the display Cross the first input that the first contact carries out;And
In response to detecting first input on the edge of the display, and described in the display When continuing to test the described first contact in first edge:
According to determination it is described first input be detected in the first part of the first edge of the display and First input meets taskbar and shows standard, shows at the first position along the first edge of the display With multiple application program image target taskbars;And
According to the institute different from the first edge that determination first input is in the first edge of the display It states first input detect on the second part of first part and described and meets the taskbar display standard, along institute State the first edge of display be selected as include the first edge of the display the second part The second place shows the taskbar, wherein the second position is different from the first position.
21. a kind of electronic equipment, comprising:
Touch-sensitive display;
For showing the device of the first user interface on the display, wherein first user interface is different from home screen Curtain user interface, the home on-screen user interface include answering with the difference installed in multiple application programs on said device With the corresponding multiple application icons of program;
Device for following operation: when showing first user interface on the display, the display is detected The first input carried out in first edge by the first contact;With
Device for following operation: in response to detecting first input on the edge of the display, and When continuing to test the described first contact in the first edge in the display:
According to determination it is described first input be detected in the first part of the first edge of the display and First input meets taskbar and shows standard, shows at the first position along the first edge of the display With multiple application program image target taskbars;And
According to the institute different from the first edge that determination first input is in the first edge of the display It states first input detect on the second part of first part and described and meets the taskbar display standard, along institute State the first edge of display be selected as include the first edge of the display the second part The second place shows the taskbar, wherein the second position is different from the first position.
22. a kind of information processing unit used in the electronic equipment with touch-sensitive display, comprising:
For showing the device of the first user interface on the display, wherein first user interface is different from home screen Curtain user interface, the home on-screen user interface include answering with the difference installed in multiple application programs on said device With the corresponding multiple application icons of program;
Device for following operation: when showing first user interface on the display, the display is detected The first input carried out in first edge by the first contact;With
Device for following operation: in response to detecting first input on the edge of the display, and When continuing to test the described first contact in the first edge in the display:
According to determination it is described first input be detected in the first part of the first edge of the display and First input meets taskbar and shows standard, shows at the first position along the first edge of the display With multiple application program image target taskbars;And
According to the institute different from the first edge that determination first input is in the first edge of the display It states first input detect on the second part of first part and described and meets the taskbar display standard, along institute State the first edge of display be selected as include the first edge of the display the second part The second place shows the taskbar, wherein the second position is different from the first position.
23. a kind of electronic equipment, comprising:
Touch-sensitive display;
One or more processors;With
The memory for storing one or more programs, wherein one or more of programs are configured as by one or more of Processor executes, and one or more of programs include for executing any according to claim 1 into method described in 17 The instruction of method.
24. a kind of computer readable storage medium for storing one or more programs, one or more of programs include instruction, Described instruction by the electronic equipment with touch-sensitive display when being executed, so that the equipment executes according to claim 1 to 17 Any method in the method.
25. the graphic user interface in a kind of electronic equipment, the electronic equipment has touch-sensitive display, memory and is used for The one or more processors for the one or more programs being stored in the memory are executed, the graphic user interface includes The user interface that any method into method described in 17 is shown according to claim 1.
26. a kind of electronic equipment, comprising:
Touch-sensitive display;With
For executing the device of the method either into method described in 17 according to claim 1.
27. a kind of information processing unit used in the electronic equipment with touch-sensitive display, comprising:
For executing the device of the method either into method described in 17 according to claim 1.
28. a kind of method, comprising:
At the equipment with touch sensitive surface and display:
In the difference of the display while showing the first application program user interface in the first part of the display In showing the second application program user interface on the second part of the first part;
Described aobvious while showing first application program user interface in the first part in the display Show that detection is by including the shifting on first direction when showing second application program user interface on the second part of device The first input that the first dynamic contact carries out;And
In response to detecting first input:
Meet the first standard according to determination first input, wherein it includes described that first standard, which includes first input, The requirement for meeting first standard on first direction more than the movement of first threshold amount, replaces institute with full frame home screen State the display of the first user interface and the second user interface;And
Meet the second standard according to determination first input, wherein it includes described that second standard, which includes first input, Meet the requirement of second standard on first direction less than the movement of the first threshold amount, and determines described first Input starts in the first edge region corresponding to first application program user interface of the display, is replaced with first It changes user interface and replaces the display of first application program user interface, while keeping second application program user interface Display in the second part of the display;And
Meet second standard according to determination first input, and determines that first input is corresponding to described second Start in the second edge region of application program user interface, replaces second application program with the second replacement user interface and use The display at family interface, while keeping first application program user interface aobvious in the first part of the display Show.
29. according to the method for claim 28, in which:
Second standard includes application program switch interface navigation standard, wherein the application program switch interface navigation First input described in standard requirements includes the movement of first contact, and the movement has far from described in the display The magnitude of moving parameter on the direction in the respective edges region that the first input starts, to meet application program switch circle Face navigation standard;And
The replacement user interface is the application program switch user interface accordingly indicated for including application program, the application The corresponding of program indicates for selectively activating currently indicate in the application program switch user interface multiple to answer With one in program.
30. according to the method for claim 29, comprising: the first part or the display in the display When showing the application program switch user interface in the second part:
It detects to for selectively activating currently indicate in the application program switch user interface the multiple to answer With one application program in program accordingly indicate in the first selection indicated;And
In response to detecting the selection indicated described first:
When detect to described first indicate selection when in the first part of the display show the application When program switch user interface, answer associated with first expression is shown in the first part of the display With the user interface of program, while keeping second application program user interface in the second part of the display Display;And
When detect to described first indicate selection when in the second part of the display show the application When program switch user interface, institute associated with first expression is shown in the second part of the display The user interface of application program is stated, while keeping first application program user interface described the of the display Display in a part.
31. according to the method for claim 30, wherein the application program switch user interface is shown in the display In the first part of device, which comprises
It is shown described in the application program associated with first expression in the first part of the display User interface and when showing second application program user interface in the second part of the display, described in detection It is carried out in the second edge region corresponding to second application program user interface of display by the second contact Second input;And
In response to detecting second input, application program switch interface is met according to determination second input and is led Navigation mark is quasi-, is answered with the application program switch user interface replacement described second in the second part of the display With the display of program user interface, while keeping the user interface of the application program associated with first expression Display in the first part of the display,
Wherein the application program switch user interface in the second part of the display include with previously in institute First application program user interface associated described first shown in the first part of display is stated using journey The expression of sequence.
32. the method according to any one of claim 28 to 31, in which:
Second standard includes last Application Program Interface navigation standard, the standard wherein the last Application Program Interface navigates It is required that first input includes the movement of first contact, the movement, which has, is being arranged essentially parallel to the display The magnitude of moving parameter on the direction in the respective edges region that first input starts;And
The replacement user interface the first of corresponding application programs user interface previously shown is answered different from what is be replaced Use program user interface.
33. according to the method for claim 32, comprising: replacing first application program with the first replacement user interface After the display of user interface, and in the first time threshold from being lifted away from first contact, wherein described first replaces Changing user interface is previously shown application program user interface:
The second input carried out by the second contact started in the first edge region is detected, second input includes It is described second contact movement, the movement on the direction in first edge region for being arranged essentially parallel to the display The magnitude of moving parameter meets the last Application Program Interface navigation standard;And
In response to detecting second input:
According to determining second, previously shown application program user interface can be used for being navigated to, and previously shown with described second The application program user interface shown replaces the display of the described first previously shown application program user interface;And
According to determining second, previously shown application program user interface is not useable for being navigated to, aobvious with full-screen display mode Show the second user interface.
34. the method according to any one of claim 28 to 33, comprising:
In response to detecting first input, third standard is met according to determination first input, wherein the third mark First input described in alignment request includes in said first direction less than the movement of the first threshold amount but in the first party Full frame application program switch user interface is shown to meet the third standard greater than the movement of second threshold amount upwards.
35. the method according to any one of claim 28 to 34, comprising: when the first part in the display When showing second application program user interface while upper display first application program user interface, and detecting To before first input:
Display first is shown and can be indicated above a part of first application program user interface, wherein described first shows energy table The position instruction shown in the first part of the display for starting the conversion zone of Pre-defined gesture input;And
Display second is shown and can be indicated above a part of second application program user interface, wherein described second shows energy table The position instruction shown is used to start on the second part of the display conversion zone of the Pre-defined gesture input.
36. according to the method for claim 35, in which:
Described first shows that the size of the first part for the size and display that can be indicated is directly proportional;
Described second shows that the size of the second part for the size and display that can be indicated is directly proportional;And
The described method includes: when the upper in first application program user interface shows that described first shows energy table Show and the upper of second application program user interface show described second show can indicate when:
Detection meets user's input of split screen size adjusting standard;And
Meet the user input of the split screen size adjusting standard in response to detecting;
The size of the first part of the display is adjusted to the second size from first size, including with the display Second size of the first part proportionally adjust the display of first application program user interface and described First shows the size for the display that can be indicated;And
The size of the second part of the display is resized to the 4th size from third, including with the display The 4th size of the second part proportionally adjust the display of second application program user interface and described Second shows the size for the display that can be indicated.
37. the method according to any one of claim 28 to 36, comprising: when with full-screen display mode display third application When program uses interface, showing that third is shown above a part of the third application program user interface can be indicated, wherein institute Stating third shows the position instruction that can be indicated for starting the conversion zone of Pre-defined gesture input on the display.
38. the method according to any one of claim 28 to 37, wherein first standard and second standard are each From require it is described first input be lifted away from, which comprises
In response to detecting the movement on the first direction of first input on the display, and examining Measure it is described first input be lifted away from before:
According to determination first input the display correspond to first application program user interface described the Start in one fringe region, with include corresponding to first application program user interface the first application view transition User interface replaces the display of first application program user interface, while second application program user interface being kept to exist Display in the second part of the display, wherein the size of first application view is defeated with described first Enter the movement on the display and dynamically changes;And
According to determination first input the display correspond to second application program user interface described the Start in two fringe regions, with include corresponding to second application program user interface the second application view transition User interface replaces the display of second application program user interface, while first application program user interface being kept to exist Display in the first part of the display, wherein the size of second application view is defeated with described first Enter the movement on the display and dynamically changes.
39. according to the method for claim 38, comprising: when showing the interim subscriber interface, monitoring described first is connect The position and speed of touching simultaneously provides corresponding visual feedback, if instruction will detect the described first lift contacted at current time From how the equipment will navigate.
40. according to the method for claim 39, wherein when in the display the first part or the display The second part on when showing the interim subscriber interface, two or more application views are in the interim subscriber When first contact is lifted away from, the equipment is incited somebody to action for display instruction in interface:
Started in the first edge region according to determination first input, in the first part of the display Display includes the application program switch user interface of the expression of multiple application programs, and the expression of the multiple application program is used for One in the multiple application programs indicated in the application program switch user interface is selectively activated, is kept simultaneously Display of second application program user interface in the second part of the display;And
Started in the second edge region according to determination first input, in the second part of the display Display includes the application program switch user interface of the expression of multiple application programs, and the expression of the multiple application program is used for One in the multiple application programs indicated in the application program switch user interface is selectively activated, is kept simultaneously Display of first application program user interface in the first part of the display.
41. the method according to claim 39 or 40, comprising: the first part or the display in the display When showing the interim subscriber interface on the second part of device:
Detection will meet the first attribute of first input of first standard when first contact is lifted away from;And
In response to detecting first attribute of first contact:
Started in the first edge region according to determination first input, stops at described second of the display Second application program user interface is shown in point and by the display at the interim subscriber interface from described in the display First part expands to whole display;And
Started in the second edge region according to determination first input, stops at described first of the display First application program user interface is shown in point and by the display at the interim subscriber interface from described in the display Second part expands to whole display.
42. according to the method for claim 41, wherein stopping showing first application program user interface or described the Two application program user interfaces include:
Started in the first edge region according to determination first input, with first application program user interface The display of first application program user interface is replaced in the display of application view, wherein first application user The display properties of the application view at interface is moved and is dynamically changed according to first input;And
Started in the second edge region according to determination first input, with second application program user interface The display of second application program user interface is replaced in the display of application view, wherein second application user The display properties of the application view at interface is moved and is dynamically changed according to first input.
43. the method according to claim 41 or 42, wherein when showing the full frame interim subscriber interface, two or more When first contact is lifted away from, the equipment will for display instruction of multiple application views in the interim subscriber interface Display includes the application program switch user interface of the expression of multiple application programs, and the expression of the multiple application program is used for Selectively activate one in the multiple application programs indicated in the full frame application program switch user interface.
44. the method according to any one of claim 41 to 43, wherein when showing the full frame interim subscriber interface, Display instruction of the only one application view in the interim subscriber interface is when first contact is lifted away from, the equipment It will show full frame home screen.
45. the method according to any one of claim 42 to 44, comprising: when aobvious in the full frame interim subscriber interface When showing the application view of first application program user interface and second application program user interface:
Detection includes first contact in the first edge region or the second edge region towards the display Second direction on movement gesture;And
In response to detecting the gesture of the movement including first contact in this second direction:
Started in the first edge region according to determination first input, restores second application program user interface Display in the second part of the display;And
Started in the second edge region according to determination first input, restores first application program user interface Display in the first part of the display.
46. the method according to any one of claim 41 to 45, in which:
When showing the full frame application program switch user interface, switch for selectively activating in the application program Multiple expressions of the application program of one in multiple application programs indicated in device user interface include the first expression, institute It is associated at least two application programs being activated simultaneously when the described first expression of selection to state the first expression;And
When showing the application program in the first part of the display or the second part of the display When switch user interface, for selectively activating the multiple applications indicated in the application program switch user interface At least two applications that multiple expressions of one application program in program do not include and are activated simultaneously in selection The associated expression of program.
47. the method according to any one of claim 28 to 46, comprising: when the first part in the display Described second is shown on the second part of the display while upper display first application program user interface When application program user interface, and before detecting first input carried out by first contact:
Detection meets the first touch input that taskbar shows standard in the first edge of the display;And
In response to detecting first touch input in the first edge of the display, and when in the display When continuing to test first touch input in the first edge of device:
It according to determination first touch input is detected in the first part of the first edge of the display, Display has multiple application program image target taskbars at the first position along the first edge of the display;With And
It according to determination first input is detected on the second part of the first edge of the display, on edge The first edge of the display be selected as including described second of the first edge of the display The second place divided shows the taskbar, wherein the second position is different from the first position.
48. a kind of electronic equipment, comprising:
Touch-sensitive display;
One or more processors;With
Memory, the memory stores one or more programs, wherein one or more of programs are configured as by described One or more processors execute, and one or more of programs include the instruction for performing the following operation:
The first application program user interface is shown in the first part of the display, while being different from the display The second application program user interface is shown on the second part of the first part;
Described aobvious while showing first application program user interface in the first part in the display Show that detection is by including the shifting on first direction when showing second application program user interface on the second part of device The first input that the first dynamic contact carries out;And
In response to detecting first input:
Meet the first standard according to determination first input, wherein it includes described that first standard, which includes first input, The requirement for meeting first standard on first direction more than the movement of first threshold amount, replaces institute with full frame home screen State the display of the first user interface and the second user interface;And
Meet the second standard according to determination first input, wherein it includes described that second standard, which includes first input, Meet the requirement of second standard on first direction less than the movement of the first threshold amount, and determines described first Input starts in the first edge region corresponding to first application program user interface of the display, is replaced with first It changes user interface and replaces the display of first application program user interface, while keeping second application program user interface Display in the second part of the display;And
Meet second standard according to determination first input, and determines that first input is corresponding to described second Start in the second edge region of application program user interface, replaces second application program with the second replacement user interface and use The display at family interface, while keeping first application program user interface aobvious in the first part of the display Show.
49. a kind of computer readable storage medium for storing one or more programs, one or more of programs include instruction, Described instruction makes the equipment when being executed by the electronic equipment with touch-sensitive display:
In the difference of the display while showing the first application program user interface in the first part of the display In showing the second application program user interface on the second part of the first part;
Described aobvious while showing first application program user interface in the first part in the display Show that detection is by including the shifting on first direction when showing second application program user interface on the second part of device The first input that the first dynamic contact carries out;And
In response to detecting first input:
Meet the first standard according to determination first input, wherein it includes described that first standard, which includes first input, The requirement for meeting first standard on first direction more than the movement of first threshold amount, replaces institute with full frame home screen State the display of the first user interface and the second user interface;And
Meet the second standard according to determination first input, wherein it includes described that second standard, which includes first input, Meet the requirement of second standard on first direction less than the movement of the first threshold amount, and determines described first Input starts in the first edge region corresponding to first application program user interface of the display, is replaced with first It changes user interface and replaces the display of first application program user interface, while keeping second application program user interface Display in the second part of the display;And
Meet second standard according to determination first input, and determines that first input is corresponding to described second Start in the second edge region of application program user interface, replaces second application program with the second replacement user interface and use The display at family interface, while keeping first application program user interface aobvious in the first part of the display Show.
50. A graphical user interface on an electronic device with a touch-sensitive display, a memory, and one or more processors to execute one or more programs stored in the memory, the graphical user interface comprising:
concurrent representations of a first application in a first portion of the display and a second application in a second portion of the display that is different from the first portion;
wherein:
while the first application user interface is displayed in the first portion of the display and the second application user interface is displayed in the second portion of the display, a first input made by a first contact that includes movement in a first direction is detected; and
in response to detecting the first input:
in accordance with a determination that the first input meets first criteria, wherein the first criteria include a requirement that the first input includes more than a first threshold amount of movement in the first direction in order for the first criteria to be met, display of the first user interface and the second user interface is replaced with a full-screen home screen; and
in accordance with a determination that the first input meets second criteria, wherein the second criteria include a requirement that the first input includes less than the first threshold amount of movement in the first direction in order for the second criteria to be met, and a determination that the first input started in a first edge region of the display that corresponds to the first application user interface, display of the first application user interface is replaced with a first replacement user interface while display of the second application user interface is maintained in the second portion of the display; and
in accordance with a determination that the first input meets the second criteria, and a determination that the first input started in a second edge region that corresponds to the second application user interface, display of the second application user interface is replaced with a second replacement user interface while display of the first application user interface is maintained in the first portion of the display.
51. An electronic device, comprising:
a touch-sensitive display;
means for displaying a first application user interface in a first portion of the display while displaying a second application user interface in a second portion of the display that is different from the first portion;
means for, while displaying the first application user interface in the first portion of the display and displaying the second application user interface in the second portion of the display, detecting a first input made by a first contact that includes movement in a first direction; and
means for, in response to detecting the first input:
in accordance with a determination that the first input meets first criteria, wherein the first criteria include a requirement that the first input includes more than a first threshold amount of movement in the first direction in order for the first criteria to be met, replacing display of the first user interface and the second user interface with a full-screen home screen; and
in accordance with a determination that the first input meets second criteria, wherein the second criteria include a requirement that the first input includes less than the first threshold amount of movement in the first direction in order for the second criteria to be met, and a determination that the first input started in a first edge region of the display that corresponds to the first application user interface, replacing display of the first application user interface with a first replacement user interface while maintaining display of the second application user interface in the second portion of the display; and
in accordance with a determination that the first input meets the second criteria, and a determination that the first input started in a second edge region that corresponds to the second application user interface, replacing display of the second application user interface with a second replacement user interface while maintaining display of the first application user interface in the first portion of the display.
52. An information processing apparatus for use in an electronic device with a touch-sensitive display, comprising:
means for displaying a first application user interface in a first portion of the display while displaying a second application user interface in a second portion of the display that is different from the first portion;
means for, while displaying the first application user interface in the first portion of the display and displaying the second application user interface in the second portion of the display, detecting a first input made by a first contact that includes movement in a first direction; and
means for, in response to detecting the first input:
in accordance with a determination that the first input meets first criteria, wherein the first criteria include a requirement that the first input includes more than a first threshold amount of movement in the first direction in order for the first criteria to be met, replacing display of the first user interface and the second user interface with a full-screen home screen; and
in accordance with a determination that the first input meets second criteria, wherein the second criteria include a requirement that the first input includes less than the first threshold amount of movement in the first direction in order for the second criteria to be met, and a determination that the first input started in a first edge region of the display that corresponds to the first application user interface, replacing display of the first application user interface with a first replacement user interface while maintaining display of the second application user interface in the second portion of the display; and
in accordance with a determination that the first input meets the second criteria, and a determination that the first input started in a second edge region that corresponds to the second application user interface, replacing display of the second application user interface with a second replacement user interface while maintaining display of the first application user interface in the first portion of the display.
53. An electronic device, comprising:
a touch-sensitive display;
one or more processors; and
memory storing one or more programs, wherein the one or more programs are configured to be executed by the one or more processors, the one or more programs including instructions for performing any of the methods of claims 28 to 47.
54. A computer readable storage medium storing one or more programs, the one or more programs comprising instructions that, when executed by an electronic device with a touch-sensitive display, cause the device to perform any of the methods of claims 28 to 47.
55. A graphical user interface on an electronic device with a touch-sensitive display, a memory, and one or more processors to execute one or more programs stored in the memory, the graphical user interface comprising user interfaces displayed in accordance with any of the methods of claims 28 to 47.
56. An electronic device, comprising:
a touch-sensitive display; and
means for performing any of the methods of claims 28 to 47.
57. An information processing apparatus for use in an electronic device with a touch-sensitive display, comprising:
means for performing any of the methods of claims 28 to 47.
58. A method, comprising:
at an electronic device with a display and a touch-sensitive surface:
displaying, on the display, a user interface of a first application of a plurality of applications installed on the device;
detecting a gesture on the touch-sensitive surface, wherein detecting the gesture includes detecting an initial portion of the gesture while the user interface of the first application is displayed on the display, and detecting the gesture includes concurrently detecting multiple contacts on the touch-sensitive surface and detecting movement of the multiple contacts; and
in response to detecting the gesture on the touch-sensitive surface:
in accordance with a determination that the gesture includes two concurrently detected contacts, performing an operation in the first application based on the movement of the two concurrently detected contacts during the gesture;
in accordance with a determination that the gesture includes more than a predetermined number of concurrently detected contacts, the predetermined number being greater than two, and that the movement of the concurrently detected contacts during the gesture meets first criteria, switching from displaying the user interface of the first application to displaying a user interface of a second application of the plurality of applications that is different from the first application; and
in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts and that the movement of the concurrently detected contacts during the gesture meets second criteria that are different from the first criteria, switching from displaying the user interface of the first application to displaying a user interface that includes respective application icons for opening the plurality of applications installed on the device.
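Claim 58 distinguishes a two-contact gesture, which is handed to the foreground application, from a gesture with more than a predetermined number of contacts (the number itself being greater than two), which drives system navigation. A minimal Swift sketch of that dispatch follows; the default predetermined number, the concrete thresholds, and all names are assumptions made for illustration, not the claimed implementation.

```swift
// Sketch of the contact-count dispatch in claim 58.
// The predetermined number, thresholds, and names are illustrative assumptions.

enum GestureOutcome {
    case performInAppOperation      // two contacts: handled by the first application
    case switchToSecondApplication  // first criteria met
    case showHomeScreenWithAppIcons // second criteria met
    case none
}

struct MultiContactGesture {
    let contactCount: Int
    let firstDirectionMovement: Double   // movement in the first direction
    let secondDirectionMovement: Double  // movement in the second direction
    let pinchAmount: Double              // movement of the contacts toward each other
}

func dispatch(_ gesture: MultiContactGesture,
              predeterminedNumber: Int = 3) -> GestureOutcome {  // greater than two, per the claim
    if gesture.contactCount == 2 {
        return .performInAppOperation
    }
    guard gesture.contactCount > predeterminedNumber else { return .none }

    // First criteria (see claim 59): enough movement in the first direction.
    if gesture.firstDirectionMovement > 100 {
        return .switchToSecondApplication
    }
    // Second criteria (see claims 60-61): enough movement in the second direction,
    // or enough movement of the contacts toward each other.
    if gesture.secondDirectionMovement > 150 || gesture.pinchAmount > 80 {
        return .showHomeScreenWithAppIcons
    }
    return .none
}

// Example: a four-contact swipe in the first direction switches applications.
print(dispatch(MultiContactGesture(contactCount: 4,
                                   firstDirectionMovement: 140,
                                   secondDirectionMovement: 10,
                                   pinchAmount: 0)))
```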
59. The method of claim 58, wherein the first criteria require that the gesture includes more than a first threshold amount of movement in a first direction in order for the first criteria to be met.
60. The method of any of claims 58 to 59, wherein the second criteria require that the gesture includes more than a second threshold amount of movement in a second direction in order for the second criteria to be met.
61. The method of any of claims 58 to 60, wherein the second criteria require that the gesture includes more than a third threshold amount of movement of the concurrently detected contacts toward each other in order for the second criteria to be met.
62. The method of any of claims 58 to 61, including:
in response to detecting the gesture on the touch-sensitive surface:
in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts and that the movement of the concurrently detected contacts during the gesture meets third criteria, switching from displaying the user interface of the first application to displaying a user interface that includes respective representations of a plurality of recently active applications.
63. The method of claim 62, wherein the third criteria require that the input includes more than a fourth threshold amount and less than a fifth threshold amount of movement in the second direction in order for the third criteria to be met.
64. The method of any of claims 62 to 63, wherein the third criteria require that the input includes less than a sixth threshold amount of movement of the concurrently detected contacts toward each other in order for the third criteria to be met.
65. The method of any of claims 58 to 64, including:
in response to detecting the gesture on the touch-sensitive surface:
in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts and that the movement of the concurrently detected contacts during the gesture meets fourth criteria, maintaining display of the first application on the display.
66. The method of claim 65, wherein the fourth criteria require that the input includes less than a seventh threshold amount of movement of the concurrently detected contacts in order for the fourth criteria to be met.
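Claims 62-66 add two more outcomes to the multi-contact gesture: a medium amount of movement in the second direction (between the fourth and fifth thresholds) with little movement of the contacts toward each other opens a view of recently active applications, while a gesture with almost no movement (below the seventh threshold) leaves the first application in place. The sketch below shows how those threshold bands might be ordered; the concrete values and all names are assumptions for illustration only.

```swift
// Sketch of the third and fourth criteria (claims 62-66) layered on top of the
// first and second criteria. Threshold values are illustrative assumptions.

enum SystemGestureTarget {
    case stayInFirstApplication  // fourth criteria: movement below the seventh threshold
    case recentApplications      // third criteria: medium movement, little pinching
    case homeScreen              // otherwise: fall through to the earlier criteria
}

struct Thresholds {
    let fourth: Double = 40     // lower bound of movement in the second direction
    let fifth: Double = 150     // upper bound for the recents view
    let sixth: Double = 80      // maximum pinch amount for the recents view
    let seventh: Double = 10    // total movement below which nothing changes
}

func classifySystemGesture(secondDirectionMovement: Double,
                           pinchAmount: Double,
                           thresholds: Thresholds = Thresholds()) -> SystemGestureTarget {
    let totalMovement = abs(secondDirectionMovement) + pinchAmount
    if totalMovement < thresholds.seventh {
        return .stayInFirstApplication
    }
    if secondDirectionMovement > thresholds.fourth,
       secondDirectionMovement < thresholds.fifth,
       pinchAmount < thresholds.sixth {
        return .recentApplications
    }
    return .homeScreen
}

// Example: a medium multi-contact swipe in the second direction opens the recents view.
print(classifySystemGesture(secondDirectionMovement: 90, pinchAmount: 20))
```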
67. The method of any of claims 58 to 66, including:
in response to detecting the gesture on the touch-sensitive surface:
in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts and that the movement of the contacts on the touch-sensitive surface started after at least a threshold amount of time had elapsed since the multiple contacts were initially detected on the touch-sensitive surface, performing an operation in the first application in accordance with the gesture.
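Claim 67 adds a timing rule: if the contacts rest for at least a threshold amount of time before they start moving, the gesture is routed to the first application rather than to system navigation. A brief sketch of one way to express that check, with an assumed threshold value and assumed names:

```swift
// Sketch of the timing rule in claim 67: contacts that dwell before moving
// are routed to the first application. The threshold value is an assumption.

func routeGesture(movementStartTime: Double,
                  initialDetectionTime: Double,
                  dwellThreshold: Double = 0.5) -> String {
    let dwell = movementStartTime - initialDetectionTime
    if dwell >= dwellThreshold {
        // Movement began only after the contacts had rested long enough:
        // perform an operation in the first application instead of navigating.
        return "perform operation in first application"
    }
    return "evaluate system-navigation criteria"
}

print(routeGesture(movementStartTime: 1.2, initialDetectionTime: 0.5))
// Prints "perform operation in first application" (0.7 s dwell >= 0.5 s threshold).
```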
68. The method of any of claims 58 to 67, including:
detecting relative movement of the concurrently detected contacts toward each other on the touch-sensitive surface during the gesture; and
in accordance with the relative movement of the concurrently detected contacts toward each other, adjusting a size of a representation of the user interface of the first application.
69. The method of any of claims 58 to 68, including:
detecting movement of the concurrently detected contacts in a respective direction on the touch-sensitive surface that corresponds to movement on the display toward a predefined edge of the display; and
in accordance with the movement of the concurrently detected contacts in the respective direction, adjusting the size of the representation of the user interface of the first application.
70. The method of any of claims 58 to 69, including:
concurrently detecting a first movement of the concurrently detected contacts in a respective direction on the touch-sensitive surface and a second movement of the concurrently detected contacts toward each other;
in accordance with the first movement of the concurrently detected contacts in the respective direction, moving the representation of the user interface of the first application on the display; and
in accordance with the second movement of the concurrently detected contacts toward each other, adjusting the size of the representation of the user interface of the first application on the display.
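Claims 68-70 describe continuously updating a representation (a scaled-down card) of the first application's user interface during the gesture: directional movement translates the card, and movement of the contacts toward each other shrinks it. A rough Swift sketch of that mapping follows; the scaling function, the clamping constant, and all names are assumptions introduced for the example.

```swift
// Sketch of claims 68-70: the directional component of the multi-contact gesture
// moves the card representing the first application, and the pinch component
// resizes it. The mapping functions are illustrative assumptions.

struct Card {
    var centerX: Double
    var centerY: Double
    var scale: Double   // 1.0 corresponds to the full-screen user interface
}

func updateCard(_ card: Card,
                directionalDelta: (dx: Double, dy: Double),
                pinchDelta: Double) -> Card {
    var updated = card
    // Claims 69-70: movement in the respective direction translates the
    // representation on the display.
    updated.centerX += directionalDelta.dx
    updated.centerY += directionalDelta.dy
    // Claim 68: relative movement of the contacts toward each other reduces
    // the size of the representation; clamp so the card never disappears.
    updated.scale = max(0.2, card.scale - pinchDelta / 500.0)
    return updated
}

var card = Card(centerX: 512, centerY: 384, scale: 1.0)
card = updateCard(card, directionalDelta: (dx: 0, dy: -60), pinchDelta: 100)
print(card)   // Translated by 60 points and scaled down to 0.8.
```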
71. The method of any of claims 58 to 70, including:
detecting a change in the total number of concurrently detected contacts during the gesture, wherein the first criteria or the second criteria do not require that the total number of concurrently detected contacts remain unchanged during the gesture in order for the first criteria or the second criteria to be met.
72. The method of claim 71, including:
after detecting the change in the total number of concurrently detected contacts, detecting additional movement of the remaining contacts on the touch-sensitive surface, wherein the first criteria or the second criteria are met after detecting the additional movement of the remaining contacts.
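Claims 71-72 make the criteria tolerant of fingers lifting or landing mid-gesture: the total number of concurrently detected contacts may change, and further movement of the remaining contacts can still satisfy the first or second criteria. A small sketch of accumulating movement across such a change is given below; the accumulator type, the threshold, and the names are illustrative assumptions.

```swift
// Sketch of claims 71-72: accumulated movement keeps counting toward the
// first/second criteria even when the number of tracked contacts changes.
// Names and the threshold are illustrative assumptions.

struct GestureAccumulator {
    private(set) var accumulatedMovement: Double = 0
    private(set) var contactCount: Int = 0

    mutating func contactsChanged(to newCount: Int) {
        // Changing the contact count does not reset progress toward the criteria.
        contactCount = newCount
    }

    mutating func addMovement(_ delta: Double) {
        accumulatedMovement += delta
    }

    func meetsFirstCriteria(threshold: Double = 100) -> Bool {
        accumulatedMovement > threshold
    }
}

var gesture = GestureAccumulator()
gesture.contactsChanged(to: 4)
gesture.addMovement(60)
gesture.contactsChanged(to: 3)      // one finger lifts mid-gesture
gesture.addMovement(50)             // the remaining contacts keep moving
print(gesture.meetsFirstCriteria()) // true: the criteria are met after the change
```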
73. The method of any of claims 58 to 72, wherein:
detecting the gesture includes:
detecting a first portion of the gesture; and
detecting a second portion of the gesture after the first portion of the gesture;
the first portion of the gesture includes synchronized movement of at least the predetermined number of concurrently detected contacts in a corresponding direction;
the second portion of the gesture includes movement of at least the predetermined number of concurrently detected contacts toward each other; and
at least one of the first criteria and the second criteria is met after detecting the first portion and the second portion of the gesture.
74. The method of any of claims 58 to 73, wherein:
detecting the gesture includes:
detecting a third portion of the gesture; and
detecting a fourth portion of the gesture after the third portion of the gesture;
the third portion of the gesture includes movement of at least the predetermined number of concurrently detected contacts toward each other;
the fourth portion of the gesture includes synchronized movement of at least the predetermined number of concurrently detected contacts in a corresponding direction; and
at least one of the first criteria and the second criteria is met after detecting the third portion and the fourth portion of the gesture.
75. The method of any of claims 58 to 74, wherein the initial portion of the gesture is detected in a central portion of the touch-sensitive surface, away from any edge of the touch-sensitive surface.
76. The method of any of claims 58 to 75, wherein a respective one of the first criteria and the second criteria does not require detecting liftoff of the multiple contacts in order for the respective one of the first criteria and the second criteria to be met.
77. An electronic device, comprising:
a touch-sensitive display;
one or more processors; and
memory storing one or more programs, wherein the one or more programs are configured to be executed by the one or more processors, the one or more programs including instructions for:
displaying, on the display, a user interface of a first application of a plurality of applications installed on the device;
detecting a gesture on the touch-sensitive surface, wherein detecting the gesture includes detecting an initial portion of the gesture while the user interface of the first application is displayed on the display, and detecting the gesture includes concurrently detecting multiple contacts on the touch-sensitive surface and detecting movement of the multiple contacts; and
in response to detecting the gesture on the touch-sensitive surface:
in accordance with a determination that the gesture includes two concurrently detected contacts, performing an operation in the first application based on the movement of the two concurrently detected contacts during the gesture;
in accordance with a determination that the gesture includes more than a predetermined number of concurrently detected contacts, the predetermined number being greater than two, and that the movement of the concurrently detected contacts during the gesture meets first criteria, switching from displaying the user interface of the first application to displaying a user interface of a second application of the plurality of applications that is different from the first application; and
in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts and that the movement of the concurrently detected contacts during the gesture meets second criteria that are different from the first criteria, switching from displaying the user interface of the first application to displaying a user interface that includes respective application icons for opening the plurality of applications installed on the device.
78. A computer readable storage medium storing one or more programs, the one or more programs comprising instructions that, when executed by an electronic device with a touch-sensitive display, cause the device to:
display, on the display, a user interface of a first application of a plurality of applications installed on the device;
detect a gesture on the touch-sensitive surface, wherein detecting the gesture includes detecting an initial portion of the gesture while the user interface of the first application is displayed on the display, and detecting the gesture includes concurrently detecting multiple contacts on the touch-sensitive surface and detecting movement of the multiple contacts; and
in response to detecting the gesture on the touch-sensitive surface:
in accordance with a determination that the gesture includes two concurrently detected contacts, perform an operation in the first application based on the movement of the two concurrently detected contacts during the gesture;
in accordance with a determination that the gesture includes more than a predetermined number of concurrently detected contacts, the predetermined number being greater than two, and that the movement of the concurrently detected contacts during the gesture meets first criteria, switch from displaying the user interface of the first application to displaying a user interface of a second application of the plurality of applications that is different from the first application; and
in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts and that the movement of the concurrently detected contacts during the gesture meets second criteria that are different from the first criteria, switch from displaying the user interface of the first application to displaying a user interface that includes respective application icons for opening the plurality of applications installed on the device.
79. A graphical user interface on an electronic device with a touch-sensitive display, a memory, and one or more processors to execute one or more programs stored in the memory, the graphical user interface comprising:
a representation of a user interface of a first application of a plurality of applications installed on the device, wherein:
the device detects a gesture on the touch-sensitive surface, wherein detecting the gesture includes detecting an initial portion of the gesture while the user interface of the first application is displayed on the display, and detecting the gesture includes concurrently detecting multiple contacts on the touch-sensitive surface and detecting movement of the multiple contacts; and
in response to detecting the gesture on the touch-sensitive surface:
in accordance with a determination that the gesture includes two concurrently detected contacts, the device performs an operation in the first application based on the movement of the two concurrently detected contacts during the gesture;
in accordance with a determination that the gesture includes more than a predetermined number of concurrently detected contacts, the predetermined number being greater than two, and that the movement of the concurrently detected contacts during the gesture meets first criteria, the device switches from displaying the user interface of the first application to displaying a user interface of a second application of the plurality of applications that is different from the first application; and
in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts and that the movement of the concurrently detected contacts during the gesture meets second criteria that are different from the first criteria, the device switches from displaying the user interface of the first application to displaying a user interface that includes respective application icons for opening the plurality of applications installed on the device.
80. An electronic device, comprising:
a touch-sensitive display;
means for displaying, on the display, a user interface of a first application of a plurality of applications installed on the device;
means for detecting a gesture on the touch-sensitive surface, wherein detecting the gesture includes detecting an initial portion of the gesture while the user interface of the first application is displayed on the display, and detecting the gesture includes concurrently detecting multiple contacts on the touch-sensitive surface and detecting movement of the multiple contacts; and
means, enabled in response to detecting the gesture on the touch-sensitive surface, for:
in accordance with a determination that the gesture includes two concurrently detected contacts, performing an operation in the first application based on the movement of the two concurrently detected contacts during the gesture;
in accordance with a determination that the gesture includes more than a predetermined number of concurrently detected contacts, the predetermined number being greater than two, and that the movement of the concurrently detected contacts during the gesture meets first criteria, switching from displaying the user interface of the first application to displaying a user interface of a second application of the plurality of applications that is different from the first application; and
in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts and that the movement of the concurrently detected contacts during the gesture meets second criteria that are different from the first criteria, switching from displaying the user interface of the first application to displaying a user interface that includes respective application icons for opening the plurality of applications installed on the device.
81. An information processing apparatus for use in an electronic device with a touch-sensitive display, comprising:
means for displaying, on the display, a user interface of a first application of a plurality of applications installed on the device;
means for detecting a gesture on the touch-sensitive surface, wherein detecting the gesture includes detecting an initial portion of the gesture while the user interface of the first application is displayed on the display, and detecting the gesture includes concurrently detecting multiple contacts on the touch-sensitive surface and detecting movement of the multiple contacts; and
means, enabled in response to detecting the gesture on the touch-sensitive surface, for:
in accordance with a determination that the gesture includes two concurrently detected contacts, performing an operation in the first application based on the movement of the two concurrently detected contacts during the gesture;
in accordance with a determination that the gesture includes more than a predetermined number of concurrently detected contacts, the predetermined number being greater than two, and that the movement of the concurrently detected contacts during the gesture meets first criteria, switching from displaying the user interface of the first application to displaying a user interface of a second application of the plurality of applications that is different from the first application; and
in accordance with a determination that the gesture includes more than the predetermined number of concurrently detected contacts and that the movement of the concurrently detected contacts during the gesture meets second criteria that are different from the first criteria, switching from displaying the user interface of the first application to displaying a user interface that includes respective application icons for opening the plurality of applications installed on the device.
82. An electronic device, comprising:
a touch-sensitive display;
one or more processors; and
memory storing one or more programs, wherein the one or more programs are configured to be executed by the one or more processors, the one or more programs including instructions for performing any of the methods of claims 58 to 76.
83. A computer readable storage medium storing one or more programs, the one or more programs comprising instructions that, when executed by an electronic device with a touch-sensitive display, cause the device to perform any of the methods of claims 58 to 76.
84. A graphical user interface on an electronic device with a touch-sensitive display, a memory, and one or more processors to execute one or more programs stored in the memory, the graphical user interface comprising user interfaces displayed in accordance with any of the methods of claims 58 to 76.
85. An electronic device, comprising:
a touch-sensitive display; and
means for performing any of the methods of claims 58 to 76.
86. An information processing apparatus for use in an electronic device with a touch-sensitive display, comprising:
means for performing any of the methods of claims 58 to 76.
CN201811166251.1A 2018-05-07 2018-09-29 Devices, methods, and graphical user interfaces for navigating between user interfaces and displaying a taskbar Pending CN110456949A (en)

Priority Applications (16)

Application Number Priority Date Filing Date Title
CN202110465095.4A CN113220177A (en) 2018-05-07 2018-09-29 Device, method and graphical user interface for navigating between user interfaces and displaying a taskbar
EP19724034.4A EP3791248A2 (en) 2018-05-07 2019-05-02 Devices, methods, and graphical user interfaces for navigating between user interfaces, displaying a dock, and displaying system user interface elements
JP2020554462A JP7022846B2 (en) 2018-05-07 2019-05-02 Devices, methods, and graphical user interfaces for navigation between user interfaces, displaying docks, and displaying system user interface elements.
KR1020207035129A KR102503076B1 (en) 2018-05-07 2019-05-02 Devices, methods, and graphical user interfaces for navigating between user interfaces, displaying the dock, and displaying system user interface elements
AU2019266126A AU2019266126B2 (en) 2018-05-07 2019-05-02 Devices, methods, and graphical user interfaces for navigating between user interfaces, displaying a dock, and displaying system user interface elements
KR1020237005896A KR102662244B1 (en) 2018-05-07 2019-05-02 Devices, methods, and graphical user interfaces for navigating between user interfaces, displaying a dock, and displaying system user interface elements
PCT/US2019/030385 WO2019217196A2 (en) 2018-05-07 2019-05-02 Devices, methods, and graphical user interfaces for navigating between user interfaces, displaying a dock, and displaying system user interface elements
US17/603,879 US11797150B2 (en) 2018-05-07 2019-05-02 Devices, methods, and graphical user interfaces for navigating between user interfaces, displaying a dock, and displaying system user interface elements
AU2019100488A AU2019100488B4 (en) 2018-05-07 2019-05-06 Devices, methods, and graphical user interfaces for navigating between user interfaces, displaying a dock, and displaying system user interface elements
AU2019101068A AU2019101068B4 (en) 2018-05-07 2019-09-17 Devices, methods, and graphical user interfaces for navigating between user interfaces, displaying a dock, and displaying system user interface elements
US16/661,964 US11079929B2 (en) 2018-05-07 2019-10-23 Devices, methods, and graphical user interfaces for navigating between user interfaces, displaying a dock, and displaying system user interface elements
AU2021282433A AU2021282433B2 (en) 2018-05-07 2021-12-08 Devices, methods, and graphical user interfaces for navigating between user interfaces, displaying a dock, and displaying system user interface elements
JP2022017212A JP7337975B2 (en) 2018-05-07 2022-02-07 Devices, methods, and graphical user interfaces for navigating between user interfaces, displaying docks, and displaying system user interface elements
AU2023202742A AU2023202742B2 (en) 2018-05-07 2023-05-03 Devices, methods, and graphical user interfaces for navigating between user interfaces, displaying a dock, and displaying system user interface elements
JP2023135764A JP2023166446A (en) 2018-05-07 2023-08-23 Device, method, and graphical user interface for navigating between user interfaces, displaying dock, and displaying system user interface element
US18/368,531 US20240045564A1 (en) 2018-05-07 2023-09-14 Devices, Methods, and Graphical User Interfaces for Navigating Between User Interfaces, Displaying a Dock, and Displaying System User Interface Elements

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201862668177P 2018-05-07 2018-05-07
US62/668,177 2018-05-07
US201862679959P 2018-06-03 2018-06-03
US62/679,959 2018-06-03
DKPA201870336A DK180116B1 (en) 2018-05-07 2018-06-11 Devices, methods, and graphical user interfaces for navigating between user interfaces and displaying a dock
DKPA201870336 2018-06-11

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202110465095.4A Division CN113220177A (en) 2018-05-07 2018-09-29 Device, method and graphical user interface for navigating between user interfaces and displaying a taskbar

Publications (1)

Publication Number Publication Date
CN110456949A true CN110456949A (en) 2019-11-15

Family

ID=68480528

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201811166251.1A Pending CN110456949A (en) 2018-05-07 2018-09-29 Devices, methods, and graphical user interfaces for navigating between user interfaces and displaying a taskbar
CN202110465095.4A Pending CN113220177A (en) 2018-05-07 2018-09-29 Device, method and graphical user interface for navigating between user interfaces and displaying a taskbar

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202110465095.4A Pending CN113220177A (en) 2018-05-07 2018-09-29 Device, method and graphical user interface for navigating between user interfaces and displaying a taskbar

Country Status (1)

Country Link
CN (2) CN110456949A (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115185446A (en) * 2022-07-21 2022-10-14 Oppo广东移动通信有限公司 Method and device for executing system-level operation in application program and electronic equipment
CN117170982B (en) * 2023-11-02 2024-02-13 建信金融科技有限责任公司 Man-machine detection method, device, electronic equipment and computer readable medium

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115268730A (en) * 2020-03-10 2022-11-01 苹果公司 Device, method and graphical user interface for interacting with user interface objects corresponding to an application
CN113535285A (en) * 2020-04-15 2021-10-22 斑马智行网络(香港)有限公司 Interface display method, device, equipment and storage medium
CN113568688A (en) * 2020-04-29 2021-10-29 RealMe重庆移动通信有限公司 View switching method and device, electronic equipment and storage medium
CN113568688B (en) * 2020-04-29 2023-06-06 RealMe重庆移动通信有限公司 View switching method and device, electronic equipment and storage medium
US11630556B2 (en) * 2020-09-16 2023-04-18 Kyndryl, Inc. Finger control of wearable devices
CN112214126A (en) * 2020-09-23 2021-01-12 杭州鸿雁电器有限公司 Operation panel and display method and device thereof
CN112214126B (en) * 2020-09-23 2022-11-15 杭州鸿雁电器有限公司 Operation panel and display method and device thereof
CN112612388A (en) * 2020-12-24 2021-04-06 九号智能(常州)科技有限公司 Multimedia control method, equipment and storage medium for riding vehicle
CN112612388B (en) * 2020-12-24 2022-09-09 九号智能(常州)科技有限公司 Multimedia control method, equipment and storage medium for riding vehicle

Also Published As

Publication number Publication date
CN113220177A (en) 2021-08-06

Similar Documents

Publication Publication Date Title
US11747956B2 (en) Multi-dimensional object rearrangement
KR102157759B1 (en) Devices, methods, and graphical user interfaces for wireless pairing with peripheral devices and displaying status information concerning the peripheral devices
CN104903835B (en) For abandoning equipment, method and the graphic user interface of generation tactile output for more contact gestures
CN104487929B (en) For contacting the equipment for carrying out display additional information, method and graphic user interface in response to user
CN108351750B (en) For handling equipment, method and the graphic user interface of strength information associated with touch input
CN104487927B (en) For selecting the equipment, method and graphic user interface of user interface object
CN104471521B (en) For providing the equipment, method and graphic user interface of feedback for the state of activation for changing user interface object
CN110456949A (en) For the equipment, method and graphic user interface of taskbar to be navigated and shown between user interface
CN105264479B (en) Equipment, method and graphic user interface for navigating to user interface hierarchical structure
CN104903834B (en) For equipment, method and the graphic user interface in touch input to transition between display output relation
JP6097843B2 (en) Device, method and graphical user interface for determining whether to scroll or select content
CN110162243A (en) Equipment, method and graphic user interface for being interacted with control object
CN109643217A (en) By based on equipment, method and user interface close and interacted based on the input of contact with user interface object
US20150346929A1 (en) Safari Tab and Private Browsing UI Enhancement
CN110456979A (en) For showing equipment, method and the graphic user interface that can indicate in background
JP2017021826A (en) Device, method, and graphical user interface for transitioning between display states in response to gesture
CN107491186A (en) Touch keypad for screen
CN105892644A (en) Navigation User Interface
CN107683458A (en) For manipulating the equipment, method and graphic user interface of related application window
CN107690614A (en) Movement between multiple views
CN107787478A (en) Content item is selected in user interface display
CN110286836A (en) Equipment, method and graphic user interface for mobile application interface element
CN110457093A (en) Equipment, method and graphic user interface for active management notice
CN109974581A (en) The device and method measured using augmented reality
CN110134248A (en) Tactile output based on content

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination