CN102768608A - Event recognition - Google Patents

Event recognition

Info

Publication number
CN102768608A
Authority
CN
China
Prior art keywords
touch
software application
recognition device
event
view
Prior art date
Legal status
Granted
Application number
CN2011104632628A
Other languages
Chinese (zh)
Other versions
CN102768608B (en)
Inventor
J. H. Shaffer
K. L. Kocienda
I. Chaudhri
Current Assignee
Apple Inc
Original Assignee
Apple Computer Inc
Priority date
Filing date
Publication date
Priority claimed from US 13/077,927 (US 8566045 B2)
Priority claimed from US 13/077,931 (US 9311112 B2)
Priority claimed from US 13/077,524 (US 9244606 B2)
Application filed by Apple Computer Inc
Priority to CN201610383388.7A (CN106095418B)
Publication of CN102768608A
Application granted
Publication of CN102768608B
Status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Abstract

A method includes displaying one or more views of a view hierarchy and executing software elements associated with particular views. Each particular view includes one or more event recognizers. Each event recognizer has one or more event definitions and an event handler that specifies an action for a target and is configured to send the action to the target in response to event recognition. The method includes detecting a sequence of sub-events and identifying one of the views of the view hierarchy as a hit view, which establishes which views are actively involved views. The method includes delivering a respective sub-event to event recognizers for each actively involved view. A respective event recognizer has a plurality of event definitions, one of which is selected based on an internal state of the device. In accordance with the selected event definition, the respective event recognizer processes the respective sub-event prior to processing a next sub-event in the sequence of sub-events.

Description

Event recognition
Technical field
The present invention relates generally to user interface processing, including but not limited to apparatuses and methods for recognizing touch-based inputs.
Background
Electronic devices typically include a user interface that is used to interact with the computing device. The user interface may include a display and/or input devices such as a keyboard, mice, and touch-sensitive surfaces for interacting with various aspects of the user interface. In some devices with a touch-sensitive surface as an input device, a first set of touch-based gestures (e.g., two or more of: tap, double tap, horizontal swipe, vertical swipe, pinch, depinch, two-finger swipe) is recognized as proper input in a particular context (e.g., in a particular mode of a first application), and other, different sets of touch-based gestures are recognized as proper inputs in other contexts (e.g., a different application and/or a different mode or context within the first application). As a result, the software and logic required for recognizing and responding to touch-based gestures can become complex, and can require revision each time an application is updated or a new application is added to the computing device. These and similar issues may arise in user interfaces that utilize input sources other than touch-based gestures.
Thus, it would be desirable to have a comprehensive framework or mechanism for recognizing touch-based gestures and events, as well as gestures and events from other input sources, that is readily adaptable to virtually all contexts or modes of all applications on a computing device.
Summary of the invention
To address the aforementioned drawbacks, some embodiments provide a method performed at an electronic device with a touch-sensitive display. The electronic device is configured to execute at least a first software application and a second software application. The first software application includes a first set of one or more gesture recognizers, and the second software application includes one or more views and a second set of one or more gesture recognizers. Respective gesture recognizers have corresponding gesture handlers. The method includes displaying at least a subset of the one or more views of the second software application, and, while displaying at least the subset of the one or more views of the second software application, detecting a sequence of touch inputs on the touch-sensitive display. The sequence of touch inputs includes a first portion of one or more touch inputs and a second portion of one or more touch inputs subsequent to the first portion. The method also includes, during a first phase of detecting the sequence of touch inputs: delivering the first portion of the one or more touch inputs to the first software application and the second software application, identifying from the gesture recognizers in the first set one or more matching gesture recognizers that recognize the first portion of the one or more touch inputs, and processing the first portion of the one or more touch inputs with one or more gesture handlers corresponding to the one or more matching gesture recognizers.
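For concreteness, the following is a minimal Swift sketch of the first-phase delivery described above. All names (`TouchInput`, `GestureRecognizer`, `SoftwareApplication`, `deliverFirstPortion`) are invented for illustration and do not correspond to any actual API described in the patent or provided by a shipping framework.

```swift
// Hypothetical sketch: the first portion of a touch input sequence is
// delivered to the gesture recognizers of both applications, matching
// recognizers from the first set are identified, and their
// corresponding gesture handlers are invoked.
struct TouchInput {
    let x: Double
    let y: Double
    let fingerCount: Int
}

struct GestureRecognizer {
    let name: String
    let matches: ([TouchInput]) -> Bool   // does this recognizer recognize the portion?
    let handler: ([TouchInput]) -> Void   // the corresponding gesture handler
}

struct SoftwareApplication {
    let name: String
    var gestureRecognizers: [GestureRecognizer]
}

func deliverFirstPortion(_ firstPortion: [TouchInput],
                         to first: SoftwareApplication,
                         and second: SoftwareApplication) {
    // The first portion is delivered to both applications' recognizers.
    for app in [first, second] {
        for recognizer in app.gestureRecognizers {
            _ = recognizer.matches(firstPortion)  // each recognizer sees the input
        }
    }
    // In this phase, only the matching recognizers from the first set
    // have their corresponding gesture handlers invoked.
    for recognizer in first.gestureRecognizers where recognizer.matches(firstPortion) {
        recognizer.handler(firstPortion)
    }
}

let launcher = SoftwareApplication(name: "launcher", gestureRecognizers: [
    GestureRecognizer(
        name: "three-finger swipe",
        matches: { portion in portion.allSatisfy { $0.fingerCount == 3 } },
        handler: { _ in print("launcher: begin app switching") })
])
let browser = SoftwareApplication(name: "browser", gestureRecognizers: [])
deliverFirstPortion([TouchInput(x: 10, y: 20, fingerCount: 3)], to: launcher, and: browser)
```

The point of the sketch is the asymmetry of the first phase: both applications receive the first portion of the input, but only the first set's matching recognizers process it.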
According to some embodiment, the method for in having the electronic equipment of touch-sensitive display, carrying out is provided.Said electronic equipment is configured to carry out at least first software application and second software application.Said first software application package is drawn together first group of one or more gesture recognition device, and said second software application package is drawn together one or more views and second group of one or more gesture recognition device.Gesture recognition utensil separately has corresponding attitude processor.Said method comprises first group of one or more view of demonstration.Said first group of one or more view comprise the subclass of one or more views of said second software application at least.Said method also comprises: when showing said first group of one or more view, detect the touch list entries on the said touch-sensitive display.Said touch list entries comprises the first of one or more touch inputs and the second portion of the one or more touch inputs after the said first.Said method comprises: confirm whether at least one the gesture recognition device in said first group of one or more gesture recognition device discerns the said first of one or more touch inputs.Said method also comprises: according to confirming of the said first that discerns one or more touch inputs about at least one the gesture recognition device in said first group of one or more gesture recognition device; Transmit said touch list entries to said first software application; And said touch list entries is not sent to said second software application, and confirm whether at least one the gesture recognition device in said first group of one or more gesture recognition device discerns said touch list entries.Said method further comprises: according to discern confirming of said touch list entries about at least one the gesture recognition device in said first group of one or more gesture recognition device, use said at least one gesture recognition device of the said touch list entries of identification in said first group of one or more gesture recognition device to handle said touch list entries.Said method also comprises: according to about there not being the gesture recognition device to discern the confirming of said first of one or more touches input in said first group of one or more gesture recognition device; Transmit said touch list entries to said second software application, and confirm whether at least one the gesture recognition device in said second group of one or more gesture recognition device discerns said touch list entries.Said method further comprises: according to discern confirming of said touch list entries about at least one the gesture recognition device in said second group of one or more gesture recognition device, use said at least one gesture recognition device of the said touch list entries of identification in said second group of one or more gesture recognition device to handle said touch list entries.
According to some embodiment, the method for in having the electronic equipment of internal state, carrying out is provided.Said electronic equipment is configured to carry out the software that comprises the view hierarchical structure with a plurality of views.Said method comprises: show the one or more views in the said view hierarchical structure, and carry out one or more software elements.Each software element is associated with specific view, and each particular figure comprises one or more event recognition devices.Each event recognition utensil has one or more event definitions and the event handler based on one or more subevents; This event handler specify to the action of target and be configured in response to said event recognition device detect with said one or more event definitions in particular event define corresponding incident, and send the said said target of moving.Said method also comprises: detect the sequence of one or more subevents, and discern in the view of said view hierarchical structure one as clicking view (hit view).Which view that said click view is established in the view hierarchical structure is the view (actively involved view) that effectively relates to.Said method further comprises: transmission subevent separately is to the event recognition device that is used for said each view that effectively relates to of view hierarchical structure.At least one the event recognition utensil that is used for the view that said view hierarchical structure effectively relates to has a plurality of event definitions, and selects in said a plurality of event definition according to the internal state of said electronic equipment.According to selected event definition, before the next subevent in handling said subevent sequence, said at least one event recognition device is handled said subevent separately.
In accordance with some embodiments, a non-transitory computer readable storage medium stores one or more programs for execution by one or more processors of an electronic device. The one or more programs include instructions which, when executed by the electronic device, cause the electronic device to perform any of the methods described above.
In accordance with some embodiments, an electronic device with a touch-sensitive display includes one or more processors, and memory storing one or more programs for execution by the one or more processors. The one or more programs include instructions for implementing any of the methods described above.
In accordance with some embodiments, an electronic device with a touch-sensitive display includes means for implementing any of the methods described above.
In accordance with some embodiments, an information processing apparatus in a multifunction device with a touch-sensitive display includes means for implementing any of the methods described above.
In accordance with some embodiments, an electronic device includes a touch-sensitive display unit configured to receive touch inputs, and a processing unit coupled to the touch-sensitive display unit. The processing unit is configured to execute at least a first software application and a second software application. The first software application includes a first set of one or more gesture recognizers, and the second software application includes one or more views and a second set of one or more gesture recognizers. Respective gesture recognizers have corresponding gesture handlers. The processing unit is configured to: enable display of at least a subset of the one or more views of the second software application; and, while displaying at least the subset of the one or more views of the second software application, detect a sequence of touch inputs on the touch-sensitive display unit. The sequence of touch inputs includes a first portion of one or more touch inputs and a second portion of one or more touch inputs subsequent to the first portion. The processing unit is configured to, during a first phase of detecting the sequence of touch inputs: deliver the first portion of the one or more touch inputs to the first software application and the second software application; identify from the gesture recognizers in the first set one or more matching gesture recognizers that recognize the first portion of the one or more touch inputs; and process the first portion of the one or more touch inputs with one or more gesture handlers corresponding to the one or more matching gesture recognizers.
In accordance with some embodiments, an electronic device includes a touch-sensitive display unit configured to receive touch inputs, and a processing unit coupled to the touch-sensitive display unit. The processing unit is configured to execute at least a first software application and a second software application. The first software application includes a first set of one or more gesture recognizers, and the second software application includes one or more views and a second set of one or more gesture recognizers. Respective gesture recognizers have corresponding gesture handlers. The processing unit is configured to enable display of a first set of one or more views, which includes at least a subset of the one or more views of the second software application. The processing unit is configured to, while displaying the first set of the one or more views: detect a sequence of touch inputs on the touch-sensitive display unit (the sequence of touch inputs includes a first portion of one or more touch inputs and a second portion of one or more touch inputs subsequent to the first portion); and determine whether at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the first portion of the one or more touch inputs. The processing unit is configured to, in accordance with a determination that at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the first portion of the one or more touch inputs: deliver the sequence of touch inputs to the first software application without delivering the sequence of touch inputs to the second software application, and determine whether at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the sequence of touch inputs. The processing unit is configured to, in accordance with a determination that at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the sequence of touch inputs, process the sequence of touch inputs with the at least one gesture recognizer in the first set of one or more gesture recognizers that recognizes the sequence of touch inputs. The processing unit is configured to, in accordance with a determination that no gesture recognizer in the first set of one or more gesture recognizers recognizes the first portion of the one or more touch inputs: deliver the sequence of touch inputs to the second software application, and determine whether at least one gesture recognizer in the second set of one or more gesture recognizers recognizes the sequence of touch inputs; and, in accordance with a determination that at least one gesture recognizer in the second set of one or more gesture recognizers recognizes the sequence of touch inputs, process the sequence of touch inputs with the at least one gesture recognizer in the second set of one or more gesture recognizers that recognizes the sequence of touch inputs.
In accordance with some embodiments, an electronic device includes: a display unit configured to display one or more views; a memory unit configured to store an internal state; and a processing unit coupled to the display unit and the memory unit. The processing unit is configured to: execute software that includes a view hierarchy with a plurality of views; enable display of one or more views of the view hierarchy; and execute one or more software elements. Each software element is associated with a particular view, and each particular view includes one or more event recognizers. Each event recognizer has one or more event definitions based on one or more sub-events, and an event handler. The event handler specifies an action for a target, and is configured to send the action to the target in response to the event recognizer detecting an event corresponding to a particular event definition of the one or more event definitions. The processing unit is configured to: detect a sequence of one or more sub-events; and identify one of the views of the view hierarchy as a hit view. The hit view establishes which views in the view hierarchy are actively involved views. The processing unit is configured to deliver a respective sub-event to event recognizers for each actively involved view of the view hierarchy. At least one event recognizer for the actively involved views in the view hierarchy has a plurality of event definitions, one of which is selected in accordance with the internal state of the electronic device, and the at least one event recognizer processes the respective sub-event, in accordance with the selected event definition, prior to processing a next sub-event in the sequence of sub-events.
Brief description of the drawings
Figures 1A-1C are block diagrams illustrating electronic devices in accordance with some embodiments.
Figure 2 is a diagram of an input/output processing stack of an exemplary electronic device in accordance with some embodiments.
Figure 3A illustrates an exemplary view hierarchy in accordance with some embodiments.
Figures 3B and 3C are block diagrams illustrating exemplary event recognizer methods and data structures in accordance with some embodiments.
Figure 3D is a block diagram illustrating exemplary components for event handling in accordance with some embodiments.
Figure 3E is a block diagram illustrating exemplary classes and instances of gesture recognizers in accordance with some embodiments.
Figure 3F is a block diagram illustrating the flow of event information in accordance with some embodiments.
Figures 4A and 4B are flow charts illustrating exemplary state machines in accordance with some embodiments.
Figure 4C illustrates the exemplary state machines of Figures 4A and 4B for an exemplary set of sub-events in accordance with some embodiments.
Figures 5A-5C illustrate exemplary sub-event sequences with exemplary event recognizer state machines in accordance with some embodiments.
Figures 6A and 6B are flow charts of an event recognition method in accordance with some embodiments.
Figures 7A-7S illustrate exemplary user interfaces and user inputs recognized by event recognizers for navigating through concurrently open applications in accordance with some embodiments.
Figures 8A and 8B are flow charts illustrating an event recognition method in accordance with some embodiments.
Figures 9A-9C are flow charts illustrating an event recognition method in accordance with some embodiments.
Figures 10A and 10B are flow charts illustrating an event recognition method in accordance with some embodiments.
Figure 11 is a functional block diagram of an electronic device in accordance with some embodiments.
Figure 12 is a functional block diagram of an electronic device in accordance with some embodiments.
Figure 13 is a functional block diagram of an electronic device in accordance with some embodiments.
Like reference numerals refer to corresponding parts throughout the drawings.
Detailed description
Electronic devices with small screens (e.g., smart phones and tablet computers) typically display a single application at a time, even though multiple applications may be running on the device. Many of these devices have touch-sensitive displays configured to receive gestures as touch inputs. With such devices, a user may want to perform operations that are provided by a hidden application (e.g., an application that is running in the background and is not currently displayed on the display of the electronic device, such as an application launcher software application running in the background). Existing methods for performing operations provided by a hidden application typically require first displaying the hidden application and then providing touch inputs to the application now being displayed. Therefore, the existing methods require additional steps. Furthermore, the user may not want to see the hidden application, but may still want to perform an operation provided by the hidden application. In the embodiments described below, improved methods for interacting with a hidden application are achieved by delivering touch inputs to the hidden application, and processing the touch inputs with the hidden application without displaying the hidden application. Thus, these methods streamline interaction with a hidden application, thereby eliminating the need for extra, separate steps to display the hidden application, while providing the ability to control and interact with the hidden application based on gesture inputs.
In addition, in some embodiments, these electronic devices have at least one gesture recognizer with a plurality of gesture definitions. This helps the gesture recognizer work in distinct operating modes. For example, a device may have a normal operating mode and an accessibility operating mode (e.g., for people with limited vision). In the normal operating mode, a next-application gesture is used to move between applications, and the next-application gesture is defined as a three-finger left-swipe gesture. In the accessibility operating mode, the three-finger left-swipe gesture is used to perform a different function. Thus, a gesture different from the three-finger left swipe is needed in the accessibility operating mode to correspond to the next-application gesture (e.g., a four-finger left-swipe gesture in the accessibility operating mode). By having multiple gesture definitions associated with the next-application gesture, the device can select one of the gesture definitions for the next-application gesture depending on the current operating mode. This provides flexibility in using the gesture recognizer in different operating modes. In some embodiments, a plurality of gesture recognizers with multiple gesture definitions is adjusted depending on the operating mode (e.g., gestures performed with three fingers in the normal operating mode are performed with four fingers in the accessibility operating mode).
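The mode-dependent selection described above can be sketched as follows; `OperatingMode`, `GestureDefinition`, and `ModalGestureRecognizer` are hypothetical names, and the finger counts simply mirror the three- versus four-finger example in the preceding paragraph.

```swift
// Hypothetical sketch: one gesture recognizer holds several gesture
// definitions and selects one based on the device's current operating mode.
enum OperatingMode { case normal, accessibility }

struct GestureDefinition {
    let fingerCount: Int
    let direction: String
}

struct ModalGestureRecognizer {
    // One definition per operating mode for the same logical gesture.
    let definitions: [OperatingMode: GestureDefinition]

    func definition(for mode: OperatingMode) -> GestureDefinition? {
        definitions[mode]
    }
}

let nextApplicationGesture = ModalGestureRecognizer(definitions: [
    .normal: GestureDefinition(fingerCount: 3, direction: "left"),
    .accessibility: GestureDefinition(fingerCount: 4, direction: "left"),
])

let mode: OperatingMode = .accessibility
if let active = nextApplicationGesture.definition(for: mode) {
    print("next-application gesture: \(active.fingerCount)-finger \(active.direction) swipe")
}
```

Keeping both definitions inside one recognizer, rather than registering two recognizers, matches the flexibility described above: the logical gesture stays the same while its physical definition varies with the mode.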
Below, Figures 1A-1C and 2 provide a description of exemplary devices. Figures 3A-3F describe components for event handling and the operation of such components (e.g., the flow of event information). Figures 4A-4C and 5A-5C describe operations of event recognizers in more detail. Figures 6A-6B are flow charts illustrating event recognition methods. Figures 7A-7S are exemplary user interfaces illustrating operations using the event recognition methods of Figures 8A-8B, 9A-9C, and 10A-10B. Figures 8A-8B are flow charts illustrating an event recognition method of processing event information with a gesture handler of a hidden open application. Figures 9A-9C are flow charts illustrating an event recognition method of conditionally processing event information with a gesture recognizer of a hidden open application or a displayed application. Figures 10A-10B are flow charts illustrating an event recognition method of selecting an event definition from a plurality of event definitions for a single event recognizer.
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the present invention. The first contact and the second contact are both contacts, but they are not the same contact.
The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term "if" may be construed to mean "when" or "upon" or "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" may be construed to mean "upon determining" or "in response to determining" or "upon detecting (the stated condition or event)" or "in response to detecting (the stated condition or event)", depending on the context.
As used herein, the term "event" refers to an input detected by one or more sensors of the device. In particular, the term "event" includes a touch on a touch-sensitive surface. An event comprises one or more sub-events. Sub-events typically refer to changes to an event (e.g., a touch-down, a touch-move, and a lift-off of the touch can be sub-events). Sub-events in the sequence of one or more sub-events can take many forms, including without limitation: key presses, key press holds, key press releases, button presses, button press holds, button press releases, joystick movements, mouse movements, mouse button presses, mouse button releases, pen stylus touches, pen stylus movements, pen stylus releases, spoken commands, detected eye movements, biometric inputs, detected physiological changes in a user, and others. Since an event may comprise a single sub-event (e.g., a short lateral motion of the device), the term "sub-event" as used herein also refers to an event.
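As a rough illustration of this vocabulary, the sub-event kinds listed above can be modeled as a simple enumeration (a sketch; the `SubEvent` type and its cases are invented for illustration):

```swift
// Hypothetical sketch of the kinds of sub-events enumerated above.
// An event is simply a sequence of one or more sub-events.
enum SubEvent {
    case touchDown(x: Double, y: Double)
    case touchMove(x: Double, y: Double)
    case liftOff(x: Double, y: Double)
    case keyPress(key: String)
    case buttonPress
    case deviceMotion            // e.g., a short lateral motion of the device
    case spokenCommand(String)
}

// A single tap expressed as a sequence of sub-events:
let tap: [SubEvent] = [.touchDown(x: 40, y: 40), .liftOff(x: 40, y: 40)]

// An event may consist of a single sub-event:
let lateralMotion: [SubEvent] = [.deviceMotion]
print(tap.count, lateralMotion.count)
```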
As used herein, the terms "event recognizer" and "gesture recognizer" are used interchangeably to refer to a recognizer that can recognize a gesture or other events (e.g., motion of the device). As used herein, the terms "event handler" and "gesture handler" are used interchangeably to refer to a handler that performs a predefined set of operations (e.g., updating data, updating object(s), and/or updating display) in response to recognition of an event/sub-event or a gesture.
As noted above, in some devices with a touch-sensitive surface as an input device, a first set of touch-based gestures (e.g., two or more of: tap, double tap, horizontal swipe, vertical swipe) is recognized as proper input in a particular context (e.g., in a particular mode of a first application), and other, different sets of touch-based gestures are recognized as proper inputs in other contexts (e.g., a different application and/or a different mode or context within the first application). As a result, the software and logic required for recognizing and responding to touch-based gestures can become complex, and can require revision each time an application is updated or a new application is added to the computing device. Embodiments described herein address these problems by providing a comprehensive framework for handling events and/or gesture inputs.
In the embodiments described below, touch-based gestures are events. Upon recognition of a predefined event, e.g., an event that corresponds to a proper input in the current context of an application, information concerning the event is delivered to the application. Furthermore, each respective event is defined as a sequence of sub-events. In devices that have a multi-touch display device (often herein called a "screen") or other multi-touch sensitive surface, and that accept multi-touch-based gestures, the sub-events that define a multi-touch-based event may include multi-touch sub-events (requiring two or more fingers to be simultaneously in contact with the device's touch-sensitive surface). For example, in a device having a multi-touch sensitive display, a respective multi-touch sequence of sub-events may begin when a user's finger first touches the screen. Additional sub-events may occur when one or more additional fingers subsequently or concurrently touch the screen, and yet other sub-events may occur when all of the fingers touching the screen move across the screen. The sequence ends when the last of the user's fingers is lifted from the screen.
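The multi-touch sequence just described (first finger down, additional fingers, movement, last finger up) can be sketched as an ordered list of sub-events; `MultiTouchSubEvent` and the pinch example are invented for illustration:

```swift
// Hypothetical sketch of a multi-touch sub-event sequence: the sequence
// begins when the first finger touches the screen and ends when the
// last finger is lifted.
enum MultiTouchSubEvent {
    case fingerDown(id: Int)
    case fingerMove(id: Int, dx: Double, dy: Double)
    case fingerUp(id: Int)
}

// A two-finger pinch, expressed as an ordered sequence of sub-events:
let pinch: [MultiTouchSubEvent] = [
    .fingerDown(id: 1),                    // first finger begins the sequence
    .fingerDown(id: 2),                    // second finger joins
    .fingerMove(id: 1, dx: -10, dy: 0),    // both fingers move toward each other
    .fingerMove(id: 2, dx: 10, dy: 0),
    .fingerUp(id: 1),
    .fingerUp(id: 2),                      // last finger up ends the sequence
]

// The sequence is over when no fingers remain on the screen.
var active = Set<Int>()
for subEvent in pinch {
    switch subEvent {
    case .fingerDown(let id): active.insert(id)
    case .fingerUp(let id): active.remove(id)
    case .fingerMove: break
    }
}
print("sequence complete:", active.isEmpty)
```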
When touch-based gestures are used to control an application running in a device having a touch-sensitive surface, touches have both temporal and spatial aspects. The temporal aspect, called a phase, indicates when a touch has just begun, whether it is moving or stationary, and when it ends, that is, when the finger is lifted from the screen. The spatial aspect of a touch is the set of views or user interface windows in which the touch occurs. The views or windows in which a touch is detected may correspond to programmatic levels within a programmatic or view hierarchy. For example, the lowest-level view in which a touch is detected may be called the hit view, and the set of events recognized as proper inputs may be determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture. Alternatively or additionally, events are recognized as proper inputs based, at least in part, on one or more software programs (i.e., software applications) in the programmatic hierarchy. For example, a five-finger pinch gesture is recognized as a proper input in an application launcher that has a five-finger pinch gesture recognizer, but not in a web browser application that does not have a five-finger pinch gesture recognizer.
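The five-finger-pinch example can be sketched as follows, with `Gesture` and `Program` as invented stand-ins for a gesture definition and a software program with its attached recognizers:

```swift
// Hypothetical sketch: whether a gesture is a proper input depends on
// which recognizers the software program (or hit view) has attached.
struct Gesture {
    let name: String
    let fingerCount: Int
}

struct Program {
    let name: String
    let recognizedGestures: [Gesture]
    func isProperInput(_ gesture: Gesture) -> Bool {
        recognizedGestures.contains { $0.name == gesture.name &&
                                      $0.fingerCount == gesture.fingerCount }
    }
}

let fiveFingerPinch = Gesture(name: "pinch", fingerCount: 5)
let appLauncher = Program(name: "launcher", recognizedGestures: [fiveFingerPinch])
let webBrowser = Program(name: "browser", recognizedGestures: [])

print(appLauncher.isProperInput(fiveFingerPinch))  // true
print(webBrowser.isProperInput(fiveFingerPinch))   // false
```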
Figures 1A-1C are block diagrams illustrating different embodiments of electronic device 102, in accordance with some embodiments. Electronic device 102 may be any electronic device including, but not limited to, a desktop computer system, a laptop computer system, a mobile phone, a smart phone, a personal digital assistant, or a navigation system. Electronic device 102 may also be a portable electronic device with a touch screen display configured to present a user interface (e.g., touch-sensitive display 156, Figure 1B), a computer with a touch screen display configured to present a user interface, a computer with a touch-sensitive surface and a display configured to present a user interface, or any other form of computing device, including without limitation: consumer electronic devices, mobile telephones, video game systems, electronic music players, tablet PCs, electronic book reading systems, e-books, PDAs, electronic organizers, email devices, laptop or other computers, kiosk computers, vending machines, smart appliances, and the like. Electronic device 102 includes a user interface 113.
In some embodiments, electronic device 102 includes a touch-sensitive display 156 (Figure 1B). In these embodiments, user interface 113 may include an on-screen keyboard (not depicted) that is used by a user to interact with electronic device 102. In some embodiments, electronic device 102 also includes one or more input devices 128 (e.g., keyboard, mouse, trackball, microphone, physical button(s), touchpad, etc.). In some embodiments, touch-sensitive display 156 is capable of detecting two or more distinct, concurrent (or partially concurrent) touches, and in these embodiments, display 156 is sometimes herein called a multi-touch display or multi-touch-sensitive display. In some embodiments, a keyboard of the one or more input devices 128 may be separate and distinct from electronic device 102. For example, a keyboard may be a wired or wireless keyboard coupled to electronic device 102.
In some embodiments, electronic device 102 includes display 126 and one or more input devices 128 (e.g., keyboard, mouse, trackball, microphone, physical button(s), touchpad, trackpad, etc.) that are coupled to electronic device 102. In these embodiments, one or more of the input devices 128 may, optionally, be separate and distinct from electronic device 102. For example, the one or more input devices may include one or more of: a keyboard, a mouse, a trackpad, a trackball, and an electronic pen, any of which may optionally be separate from the electronic device. Optionally, device 102 may include one or more sensors 116, such as one or more accelerometers, gyroscopes, GPS systems, speakers, infrared (IR) sensors, biometric sensors, cameras, etc. It should be noted that the description above of various exemplary devices as input devices 128 or as sensors 116 is of no significance to the operation of the embodiments described herein, and any input or sensor device described herein as an input device may equally well be described as a sensor, and vice versa. In some embodiments, signals produced by the one or more sensors 116 are used as input sources for detecting events.
In some embodiments, electronic device 102 includes touch-sensitive display 156 (i.e., a display having a touch-sensitive surface) and one or more input devices 128 that are coupled to electronic device 102 (Figure 1B). In some embodiments, touch-sensitive display 156 is capable of detecting two or more distinct, concurrent (or partially concurrent) touches, and in these embodiments, display 156 is sometimes herein called a multi-touch display or multi-touch-sensitive display.
In some embodiments of electronic device 102 discussed herein, input devices 128 are disposed in electronic device 102. In other embodiments, one or more of the input devices 128 is separate and distinct from electronic device 102. For example, one or more of the input devices 128 may be coupled to electronic device 102 by a cable (e.g., a USB cable) or a wireless connection (e.g., a Bluetooth connection).
When using input devices 128, or when performing touch-based gestures on touch-sensitive display 156 of electronic device 102, the user generates a sequence of sub-events that are processed by one or more CPUs 110 of electronic device 102. In some embodiments, the one or more CPUs 110 of electronic device 102 process the sequence of sub-events to recognize events.
Electronic device 102 typically includes one or more single- or multi-core processing units ("CPU" or "CPUs") 110 as well as one or more network or other communications interfaces 112. Electronic device 102 includes memory 111 and one or more communication buses 115 for interconnecting these components. Communication buses 115 may include circuitry (sometimes called a chipset) that interconnects and controls communications between system components (not depicted herein). As discussed above, electronic device 102 includes user interface 113, including a display (e.g., display 126 or touch-sensitive display 156). Further, electronic device 102 typically includes input devices 128 (e.g., keyboard, mouse, touch-sensitive surfaces, keypads, etc.). In some embodiments, the input devices 128 include an on-screen input device (e.g., a touch-sensitive surface of a display device). Memory 111 may include high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and may include non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 111 may optionally include one or more storage devices remotely located from the CPU(s) 110. Memory 111, or alternately the non-volatile memory device(s) within memory 111, comprises a computer readable storage medium. In some embodiments, memory 111, or the non-volatile memory device(s) within memory 111, comprises a non-transitory computer readable storage medium. In some embodiments, memory 111 (of electronic device 102), or the computer readable storage medium of memory 111, stores the following programs, modules, and data structures, or a subset thereof:
operating system 118, which includes procedures for handling various basic system services and for performing hardware-dependent tasks;
accessibility module 127 (Figure 1C), which is used for modifying the behavior of one or more software applications in application software 124, or modifying data from touch-sensitive display 156 or input device(s) 128, to improve the accessibility of the one or more software applications in application software 124 or the accessibility of content displayed therein (e.g., a web page), for example for people with impaired vision or limited motion capabilities;
communication module 120, which is used for connecting electronic device 102 to other devices via one or more respective communication interfaces 112 (wired or wireless) and one or more communication networks, such as the Internet, other wide area networks, local area networks, metropolitan area networks, and so on;
user interface module 123 (Figure 1C), which is used for displaying user interfaces, including user interface objects, on display 126 or touch-sensitive display 156;
control application 132 (Figure 1C), which is used for controlling processes (e.g., hit view determination, thread management, and/or event monitoring, etc.); in some embodiments, control application 132 includes a run-time application; in other embodiments, the run-time application includes control application 132;
event delivery system 122, which may be implemented in various alternative embodiments within operating system 118 or in application software 124; in some embodiments, however, some aspects of event delivery system 122 may be implemented in operating system 118 while other aspects are implemented in application software 124;
application software 124, which includes one or more software applications (e.g., applications 133-1, 133-2, and 133-3 in Figure 1C, each of which can be one of: an email application, a web browser application, a notes application, a text messaging application, etc.); a respective software application typically has, at least during execution, an application state indicating the state of the respective software application and its components (e.g., gesture recognizers); see application internal state 321 (Figure 3D), described below; and
device/global internal state 134 (Figure 1C), which includes one or more of: application state, indicating the state of software applications and their components (e.g., gesture recognizers and delegates); display state, indicating what applications, views, or other information occupy various regions of touch-sensitive display 156 or display 126; sensor state, including information obtained from the device's various sensors 116, input devices 128, and/or touch-sensitive display 156; location information concerning the location and/or attitude of the device; and other states.
As used in the specification and claims, the term "open application" refers to a software application with retained state information (e.g., as part of device/global internal state 134 and/or application internal state 321 (Figure 3D)). An open application is any one of the following types of applications:
an active application, which is currently displayed on display 126 or touch-sensitive display 156 (or whose corresponding application view is currently displayed on the display);
a background application (or background process), which is not currently displayed on display 126 or touch-sensitive display 156, but for which one or more application processes (e.g., instructions) are being processed (i.e., running) by one or more processors 110;
a suspended application, which is not currently running, and which is stored in volatile memory (e.g., DRAM, SRAM, DDR RAM, or other volatile random access solid state memory device of memory 111); and
a hibernated application, which is not running, and which is stored in non-volatile memory (e.g., one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices of memory 111).
As used herein, the term "closed application" refers to a software application without retained state information (e.g., state information for closed applications is not stored in a memory of the device). Accordingly, closing an application includes stopping and/or removing application processes for the application and removing state information for the application from the memory of the device. Generally, opening a second application while in a first application does not close the first application. When the second application is displayed and the first application ceases to be displayed, the first application, which was an active application when displayed, may become a background application, a suspended application, or a hibernated application; however, the first application remains an open application while its state information is retained by the device. A sketch of this taxonomy follows.
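As a rough sketch of the taxonomy just given (the `OpenApplicationState` and `Application` types are invented for illustration), a closed application is modeled by the absence of retained state, not by another lifecycle value:

```swift
// Hypothetical sketch of the application lifecycle terms defined above.
enum OpenApplicationState {
    case active       // currently displayed
    case background   // not displayed, but instructions are being executed
    case suspended    // not running; retained in volatile memory
    case hibernated   // not running; retained in non-volatile memory
}

struct Application {
    let name: String
    // A closed application retains no state; an open one retains some
    // state, captured here by a non-nil lifecycle value.
    var state: OpenApplicationState?
    var isOpen: Bool { state != nil }
}

// Displaying a second application demotes, but does not close, the first:
var mail = Application(name: "mail", state: .active)
mail.state = .background
print(mail.isOpen)  // true: state information is still retained
```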
Each of the above-identified elements may be stored in one or more of the previously mentioned memory devices. Each of the above-identified modules, applications, or system elements corresponds to a set of instructions for performing the functions described herein. The set of instructions can be executed by one or more processors (e.g., the one or more CPUs 110). The above-identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments. In some embodiments, memory 111 may store a subset of the modules and data structures identified above. Furthermore, memory 111 may store additional modules and data structures not described above.
Figure 2 is a diagram of an input/output processing stack 200 of an exemplary electronic device or apparatus (e.g., device 102) according to some embodiments of the invention. The hardware (e.g., electronic circuitry) 212 of the device is at the base level of input/output processing stack 200. Hardware 212 can include various hardware interface components, such as the components depicted in Figures 1A and/or 1B. Hardware 212 can also include one or more of the above-mentioned sensors 116. At least some of the other elements (202-210) of input/output processing stack 200 are software procedures, or portions of software procedures, that process inputs received from hardware 212 and generate various outputs that are presented through a hardware user interface (e.g., one or more of a display, speakers, device vibration actuators, etc.).
A driver or a set of drivers 210 communicates with hardware 212. Drivers 210 can receive and process input data received from hardware 212. A core operating system ("core OS") 208 can communicate with driver(s) 210. Core OS 208 can process raw input data received from driver(s) 210. In some embodiments, drivers 210 can be considered to be a part of core OS 208.
A set of OS application programming interfaces ("OS APIs") 206 are software procedures that communicate with core OS 208. In some embodiments, APIs 206 are included in the device's operating system, but at a level above core OS 208. APIs 206 are designed for use by applications running on the electronic devices or apparatuses discussed herein. User interface (UI) APIs 204 can utilize OS APIs 206. Application software ("applications") 202 running on the device can utilize UI APIs 204 in order to communicate with the user. UI APIs 204 can, in turn, communicate with lower-level elements, ultimately communicating with various user interface hardware, e.g., multi-touch display 156. In some embodiments, application software 202 includes applications in application software 124 (Figure 1A).
While each layer of input/output processing stack 200 can utilize the layer underneath it, that is not always required. For example, in some embodiments, applications 202 can communicate directly with OS APIs 206. In general, layers at or above the OS API layer 206 may not directly access core OS 208, driver(s) 210, or hardware 212, as these layers are considered private. Applications in layer 202 and UI APIs 204 usually direct calls to OS APIs 206, which in turn access the layers core OS 208, driver(s) 210, and hardware 212.
Stated another way, one or more hardware elements 212 of electronic device 102, and software running on the device, detect input events (which may correspond to sub-events in a gesture) at one or more of the input device(s) 128 and/or the touch-sensitive display 156, and generate or update various data structures (stored in memory 111 of device 102) used by a set of currently active event recognizers to determine whether and when the input events correspond to an event to be delivered to application 124. Embodiments of event recognition methodologies, apparatuses, and computer program products are described in more detail below.
Figure 3A depicts an exemplary view hierarchy 300, which in this example is a search program displayed in outermost view 302. Outermost view 302 generally encompasses the entire user interface a user may directly interact with, and includes subordinate views, e.g.,
search results panel 304, which groups search results and can be scrolled vertically;
search field 306, which accepts text inputs; and
a home row 310, which groups applications for quick access.
In this example, each subordinate view includes lower-level subordinate views. In other examples, the number of view levels in hierarchy 300 may differ in different branches of the hierarchy, with one or more subordinate views having lower-level subordinate views, and one or more other subordinate views not having any such lower-level subordinate views. Continuing with the example shown in Figure 3A, search results panel 304 contains a separate subordinate view 305 (subordinate to panel 304) for each search result. Here, this example shows one search result in a subordinate view called the maps view 305. Search field 306 includes a subordinate view, herein called the clear contents icon view 307, which clears the contents of the search field when a user performs a particular action (e.g., a single touch or tap gesture) on the clear contents icon in view 307. Home row 310 includes subordinate views 310-1, 310-2, 310-3, and 310-4, which respectively correspond to a contacts application, an email application, a web browser, and an iPod music interface. A sketch of this hierarchy as a tree of views follows.
A touch sub-event 301-1 is represented in outermost view 302. Given that touch sub-event 301-1 is located over both search results panel 304 and maps view 305, the touch sub-event is also represented over those views as 301-2 and 301-3, respectively. The actively involved views of the touch sub-event include search results panel 304, maps view 305, and outermost view 302. Additional information regarding sub-event delivery and actively involved views is provided below with reference to Figures 3B and 3C.
View (and corresponding program rank) can be nested.In other words, a view can comprise other views.Therefore, the software element that is associated with first view (for example, event recognition device) can comprise or be linked to first view in one or more software elements of being associated of view.Although some views can with application-associated, other views can be associated with high-level OS element (for example, graphic user interface, window manager or the like).In certain embodiments, some views are associated with other OS elements.In certain embodiments, the view hierarchical structure comprises the view from a plurality of software applications.For example, the view hierarchical structure can comprise view from applied program ignitor (for example, a beginning position picture) and from the view (view that for example, comprises web page contents) of Web-browser application.
The program layer aggregated(particle) structure comprises one or more software elements or the software application in the hierarchical structure.In order to simplify discussion subsequently, usually will only mention view and view hierarchical structure, but it must be understood that in certain embodiments, this method can be come work with program layer aggregated(particle) structure with a plurality of program layers and/or view hierarchical structure.
Fig. 3 B has described exemplary method and the structure relevant with the event recognition device with 3C.Fig. 3 B has described the method and the data structure of event handling when event handler is related with the particular figure in the view hierarchical structure.Fig. 3 C described when event handler be associated with other hierarchical structure of program level in specific rank the time be used for the method and the data structure of event handling.Event recognition device global approach 312 and 350 comprises respectively to be clicked view and clicks rank determination module 314 and 352, life event recognizer determination module 316 and 354 and subevent delivery module 318 and 356.
In certain embodiments, electronic equipment 102 comprises one or more in following: event recognition device global approach 312 and 350.In certain embodiments, electronic equipment 102 comprises one or more in following: click view determination module 314 and click rank determination module 352.In certain embodiments, electronic equipment 102 comprises one or more in following: life event recognizer determination module 316 and 354.In certain embodiments, electronic equipment 102 comprises one or more in following: subevent delivery module 318 and 356.In certain embodiments, one or more in these methods or the module be included in still less or more method or module in.For example, in certain embodiments, click view/rank determination module that electronic equipment 102 comprises has comprised to be clicked view determination module 314 and clicks the functional of rank determination module 352.In certain embodiments, the life event recognizer determination module that comprises of electronic equipment 102 has comprised the functional of life event recognizer determination module 316 and 354.
Hit view and hit level determination modules 314 and 352, respectively, provide software procedures for determining where a sub-event has taken place within one or more views (e.g., the exemplary view hierarchy 300 depicted in Fig. 3A, which has three main branches) and/or one or more software elements in the programmatic hierarchy that correspond to the sub-event (e.g., one of applications 133 in Fig. 1C).
The hit view determination module 314 of Fig. 3B receives information related to a sub-event (e.g., a user touch represented as 301-1 on the outermost view 302, as 301-2 on the Search Results panel 304, and as 301-3 on the map view 305). The hit view determination module 314 identifies the hit view as the lowest view in the hierarchy that should handle the sub-event. In most circumstances, the hit view is the lowest-level view in which an initiating sub-event (i.e., the first sub-event in the sequence of sub-events that form an event or potential event) occurs. In some embodiments, once the hit view is identified, it receives all sub-events related to the same touch or input source for which it was identified as the hit view. In some embodiments, one or more other views (e.g., a default or predefined view) receive at least some of the sub-events that the hit view receives.
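By way of illustration only, the following is a minimal sketch of hit view determination; Swift is used for all sketches in this section, and the View type and its API are assumptions made for the sketch, not structures defined by this disclosure:

```swift
import CoreGraphics

/// Hypothetical minimal view type; this disclosure defines no concrete API.
final class View {
    var frame: CGRect = .zero   // position in the parent's coordinate space
    var subviews: [View] = []

    /// Returns the lowest (deepest) view in the hierarchy containing `point`,
    /// i.e., the hit view for a sub-event occurring at that location.
    func hitView(for point: CGPoint) -> View? {
        guard frame.contains(point) else { return nil }
        // Convert to this view's local coordinate space before recursing.
        let localPoint = CGPoint(x: point.x - frame.origin.x,
                                 y: point.y - frame.origin.y)
        // Topmost subviews are checked first.
        for subview in subviews.reversed() {
            if let hit = subview.hitView(for: localPoint) {
                return hit
            }
        }
        return self   // no subview contains the point: this view is the hit view
    }
}
```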
In some embodiments, the hit level determination module 352 of Fig. 3C may utilize an analogous process. For example, in some embodiments, the hit level determination module 352 identifies the hit level as the lowest level in the programmatic hierarchy that should handle the sub-event (or the software application at the lowest programmatic level in the programmatic hierarchy that should handle the sub-event). In some embodiments, once the hit level is identified, the hit level, or a software application at the hit level, receives all sub-events related to the same touch or input source for which the hit level was identified. In some embodiments, one or more other levels or software applications (e.g., a default or predefined software application) receive at least some of the sub-events that the hit view receives.
Active event recognizer determination modules 316 and 354 of event recognizer global methods 312 and 350, respectively, determine which view or views within a view hierarchy and/or programmatic hierarchy should receive a particular sequence of sub-events. Fig. 3A depicts an exemplary set of actively involved views 302, 304 and 305 that receive the sub-event 301. In the example of Fig. 3A, the active event recognizer determination module 316 would determine that the outermost view 302, Search Results panel 304 and map view 305 are actively involved views because these views include the physical location of the touch represented by sub-event 301. It is noted that, even if touch sub-event 301 were entirely confined to the area associated with map view 305, Search Results panel 304 and outermost view 302 would still remain actively involved views, because Search Results panel 304 and outermost view 302 are ancestors of map view 305.
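Continuing the same sketch (reusing the assumed View type above), the actively involved views can be collected as the hit view together with its ancestors, consistent with the ancestor rule just described:

```swift
/// Walks the hierarchy from `root`, returning every view whose region
/// contains the touch point: the hit view plus all of its ancestors.
func activelyInvolvedViews(in root: View, at point: CGPoint) -> [View] {
    guard root.frame.contains(point) else { return [] }
    let localPoint = CGPoint(x: point.x - root.frame.origin.x,
                             y: point.y - root.frame.origin.y)
    for subview in root.subviews.reversed() {
        let involved = activelyInvolvedViews(in: subview, at: localPoint)
        if !involved.isEmpty {
            return involved + [root]   // ancestors stay actively involved
        }
    }
    return [root]
}
```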
In some embodiments, active event recognizer determination modules 316 and 354 utilize analogous processes. In the example of Fig. 3A, the active event recognizer determination module 354 would determine that the map application is actively involved because the views of the map application are displayed and/or the views of the map application include the physical location of the touch represented by sub-event 301. It is noted that, even if touch sub-event 301 were entirely confined to the area associated with the map application, other applications in the programmatic hierarchy may still remain actively involved applications (or applications at actively involved programmatic levels).
Sub-event delivery module 318 delivers sub-events to event recognizers for actively involved views. Using the example of Fig. 3A, a user's touch is represented in different views of the hierarchy by touch marks 301-1, 301-2 and 301-3. In some embodiments, sub-event data representing this user's touch is delivered by the sub-event delivery module 318 to event recognizers at the actively involved views, i.e., top-level view 302, Search Results panel 304 and map view 305. Further, the event recognizers of a view may receive subsequent sub-events of an event that starts within that view (e.g., when an initial sub-event occurs within the view). Stated another way, a view may receive sub-events associated with user interactions beginning in the view, even if those interactions continue outside of the view.
In some embodiments, sub-event delivery module 356 delivers sub-events to event recognizers for actively involved programmatic levels in a process analogous to that used by sub-event delivery module 318. For example, sub-event delivery module 356 delivers sub-events to event recognizers for actively involved applications. Using the example of Fig. 3A, user touch 301 is delivered by sub-event delivery module 356 to event recognizers at the actively involved views (e.g., the map application and any other actively involved application in the programmatic hierarchy). In some embodiments, a default or predefined software application is included in the programmatic hierarchy by default.
In some embodiments, a separate event recognizer structure 320 or 360 is generated and stored in memory of the device for each actively involved event recognizer. Event recognizer structures 320 and 360 typically include an event recognizer state 334, 374, respectively (discussed in greater detail below with reference to Figs. 4A and 4B) and event recognizer-specific code 338, 378, respectively, having state machines 340, 380. Event recognizer structure 320 also includes a view hierarchy reference 336, while event recognizer structure 360 includes a programmatic hierarchy reference 376. Each instance of a particular event recognizer references exactly one view or programmatic level. The view hierarchy reference 336 or programmatic hierarchy reference 376 (for a particular event recognizer) is used to establish which view or programmatic level is logically coupled to the respective event recognizer.
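A rough sketch of such a per-recognizer structure follows; the Swift field names mirror the reference numerals of Fig. 3B, the View class comes from the earlier sketch, and the types are illustrative assumptions:

```swift
/// Four illustrative recognizer states (cf. Figs. 4A-4B); these names are
/// assumptions based on the state values mentioned later in the text.
enum RecognizerState {
    case possible, recognized, failed, cancelled
}

/// Sketch of an event recognizer structure along the lines of structure 320.
struct EventRecognizerStructure {
    var state: RecognizerState = .possible             // event recognizer state 334
    weak var view: View?                               // view hierarchy reference 336
    var exclusive = false                              // exclusivity flag 324
    var exclusivityExceptions: [ObjectIdentifier] = [] // exclusivity exception list 326
    var waitFor: [ObjectIdentifier] = []               // wait-for list 327
    var delayTouchBegan = false                        // delay touch began flag 328
    var delayTouchEnded = false                        // delay touch ended flag 330
    var touchCancellation = false                      // touch cancellation flag 332
}
```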
View metadata 341 and level metadata 381 may include data regarding a view or level, respectively. View or level metadata may include at least the following properties that may influence sub-event delivery to event recognizers (a delivery sketch follows this list):
A stop property 342, 382, which, when set for a view or programmatic level, prevents sub-event delivery to event recognizers associated with the view or programmatic level, as well as to its ancestors in the view or programmatic hierarchy.
A skip property 343, 383, which, when set for a view or programmatic level, prevents sub-event delivery to event recognizers associated with that view or programmatic level, but permits sub-event delivery to its ancestors in the view or programmatic hierarchy.
A no-hit skip property 344, 384, which, when set for a view, prevents sub-event delivery to event recognizers associated with the view unless the view is the hit view. As discussed above, the hit view determination module 314 identifies the hit view (or hit level, in the case of hit level determination module 352) as the lowest view in the hierarchy that should handle the sub-event.
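The sketch below shows one way these three properties could gate sub-event delivery; the traversal order (hit view up to the outermost view) and the property names are assumptions made for illustration only:

```swift
/// Hypothetical per-view delivery metadata (cf. view metadata 341).
struct ViewMetadata {
    var stop = false        // stop property 342
    var skip = false        // skip property 343
    var noHitSkip = false   // no-hit skip property 344
}

/// Delivers a sub-event to the actively involved views, ordered from the
/// hit view up to the outermost view, honoring the three properties.
func deliver(subEvent: String,
             toActivelyInvolved views: [View],
             hitView: View,
             metadata: (View) -> ViewMetadata,
             dispatch: (View, String) -> Void) {
    for view in views {
        let meta = metadata(view)
        if meta.stop { return }       // blocks this view and its ancestors
        if meta.skip { continue }     // blocks this view only
        if meta.noHitSkip && view !== hitView { continue } // blocked unless hit view
        dispatch(view, subEvent)
    }
}
```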
Event recognizer structures 320 and 360 may include metadata 322, 362, respectively. In some embodiments, the metadata 322, 362 includes configurable properties, flags, and lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 322, 362 may include configurable properties, flags, and lists that indicate how event recognizers may interact with one another. In some embodiments, metadata 322, 362 may include configurable properties, flags, and lists indicating whether sub-events are delivered to varying levels in the view or programmatic hierarchy. In some embodiments, the combination of event recognizer metadata 322, 362 and view or level metadata (341, 381, respectively) is used to configure the event delivery system to: a) perform sub-event delivery to actively involved event recognizers, b) indicate how event recognizers may interact with one another, and c) indicate whether and when sub-events are delivered to various levels in the view or programmatic hierarchy.
It is noted that, in some embodiments, a respective event recognizer sends an event recognition action 333, 373 to its respective target 335, 375, as specified by fields of the event recognizer's respective structure 320, 360. Sending an action to a target is distinct from sending (and deferring the sending of) sub-events to a respective hit view or level.
The metadata properties stored in a respective event recognizer structure 320, 360 of a corresponding event recognizer include one or more of the following:
An exclusivity flag 324, 364, which, when set for an event recognizer, indicates that, upon recognition of an event by the event recognizer, the event delivery system should stop delivering sub-events to any other event recognizers of the actively involved views or programmatic levels (with the exception of any other event recognizers listed in an exception list 326, 366). When receipt of a sub-event causes a particular event recognizer to enter the exclusive state, as indicated by its corresponding exclusivity flag 324 or 364, subsequent sub-events are delivered only to the event recognizer in the exclusive state (as well as to any other event recognizers listed in an exception list 326, 366).
Some event recognizer structures 320, 360 may include an exclusivity exception list 326, 366. When included in the event recognizer structure 320, 360 for a respective event recognizer, this list 326, 366 indicates the set of event recognizers, if any, that are to continue receiving sub-events even after the respective event recognizer has entered the exclusive state. For example, if the event recognizer for a single tap event enters the exclusive state, and the currently involved views include an event recognizer for a double tap event, then the list 326, 366 would list the double tap event recognizer, so that a double tap event can still be recognized even after a single tap event has been detected. Accordingly, the exclusivity exception list 326, 366 permits event recognizers to recognize different events that share common sequences of sub-events, e.g., a single tap event recognition does not preclude subsequent recognition of a double or triple tap event by other event recognizers.
Some event recognizer structures 320, 360 may include a wait-for list 327, 367. When included in the event recognizer structure 320, 360 for a respective event recognizer, this list 327, 367 indicates the set of event recognizers, if any, that must enter the event impossible or event cancelled state before the respective event recognizer can recognize its respective event. In effect, the listed event recognizers have higher priority for recognizing an event than the event recognizer with the wait-for list 327, 367.
A delay touch began flag 328, 368, which, when set for an event recognizer, causes the event recognizer to delay sending sub-events (including a touch begin or finger down sub-event, and subsequent events) to the event recognizer's respective hit view or level until after it has been determined that the sequence of sub-events does not correspond to this event recognizer's event type. This flag can be used, in the case where the gesture is recognized, to prevent the hit view or level from ever seeing any of the sub-events. When the event recognizer fails to recognize an event, the touch began sub-event (and subsequent touch ended sub-event) can be delivered to the hit view or level. In one example, delivering such sub-events to the hit view or level causes the user interface to briefly highlight an object, without invoking the action associated with that object.
A delay touch ended flag 330, 370, which, when set for an event recognizer, causes the event recognizer to delay sending a sub-event (e.g., a touch end sub-event) to the event recognizer's respective hit view or level until it has been determined that the sequence of sub-events does not correspond to this event recognizer's event type. This can be used to prevent the hit view or level from acting upon a touch end sub-event, in case the gesture is later recognized. As long as the touch end sub-event is not sent, a touch cancelled can be sent to the hit view or level instead. If an event is recognized, the corresponding action is performed by the application, and the touch end sub-event is delivered to the hit view or level.
A touch cancellation flag 332, 372, which, when set for an event recognizer, causes the event recognizer to send a touch or input cancellation to the event recognizer's respective hit view or level when it has been determined that the sequence of sub-events does not correspond to this event recognizer's event type. The touch or input cancellation sent to the hit view or level indicates that a prior sub-event (e.g., a touch began sub-event) has been cancelled. The touch or input cancellation may cause the state of the input source handler (see Fig. 4B) to enter the input sequence cancelled state 460 (discussed below).
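The following sketch illustrates how the delay and cancellation flags might defer or rewrite delivery to the hit view; the buffering strategy is an illustrative assumption, and the resolution logic simply restates the flag descriptions above:

```swift
/// Sub-event kinds used in the sketches that follow.
enum SubEvent: Equatable {
    case touchBegan, touchMoved, touchEnded, touchCancelled
}

/// Buffers sub-events per the delay flags and resolves them once the
/// recognizer reaches a terminal state, cf. flags 328, 330 and 332.
final class DeferredDelivery {
    var delayTouchBegan = false    // delay touch began flag 328
    var delayTouchEnded = false    // delay touch ended flag 330
    var touchCancellation = false  // touch cancellation flag 332
    private var buffer: [SubEvent] = []

    func handle(_ e: SubEvent, sendToHitView: (SubEvent) -> Void) {
        let delayed = (delayTouchBegan && e == .touchBegan)
                   || (delayTouchEnded && e == .touchEnded)
        if delayed { buffer.append(e) } else { sendToHitView(e) }
    }

    /// Called once it is determined whether the sub-event sequence
    /// corresponds to the recognizer's event type.
    func resolve(matchedEventType: Bool, sendToHitView: (SubEvent) -> Void) {
        if !matchedEventType {
            if touchCancellation {
                sendToHitView(.touchCancelled)  // prior sub-events are cancelled
            } else {
                buffer.forEach(sendToHitView)   // deliver the deferred sub-events
            }
        }
        buffer.removeAll()
    }
}
```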
In some embodiments, the exception list 326, 366 can also be used by non-exclusive event recognizers. In particular, when a non-exclusive event recognizer recognizes an event, subsequent sub-events are not delivered to the exclusive event recognizers associated with the currently active views, except for those exclusive event recognizers listed in the exception list 326, 366 of the event recognizer that recognized the event.
In some embodiments, event recognizers may be configured to utilize the touch cancellation flag in conjunction with the delay touch ended flag to prevent unwanted sub-events from being delivered to the hit view. For example, the definition of a single tap gesture and the first half of a double tap gesture are identical. Once a single tap event recognizer successfully recognizes a single tap, an undesired action could take place. If the delay touch ended flag is set, the single tap event recognizer is prevented from sending sub-events to the hit view until a single tap event is recognized. In addition, the wait-for list of the single tap event recognizer may identify the double tap event recognizer, thereby preventing the single tap event recognizer from recognizing a single tap until the double tap event recognizer has entered the event impossible state. The use of the wait-for list avoids the execution of actions associated with a single tap when a double tap gesture is performed. Instead, only actions associated with a double tap will be executed, in response to recognition of the double tap event.
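For illustration, the single-tap/double-tap priority just described could be wired up as follows; the waitFor registration shown here is a hypothetical API, not one defined by this disclosure:

```swift
/// Hypothetical tap recognizer exposing a wait-for list (cf. list 327).
final class SimpleTapRecognizer {
    let requiredTaps: Int
    var waitFor: [SimpleTapRecognizer] = []  // must fail before we recognize
    var delayTouchEnded = false              // delay touch ended flag 330
    init(requiredTaps: Int) { self.requiredTaps = requiredTaps }
}

let singleTap = SimpleTapRecognizer(requiredTaps: 1)
let doubleTap = SimpleTapRecognizer(requiredTaps: 2)

// Single tap defers to double tap: it may recognize only after the
// double-tap recognizer enters the event impossible state...
singleTap.waitFor = [doubleTap]
// ...and it withholds the touch-ended sub-event from the hit view meanwhile.
singleTap.delayTouchEnded = true
```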
Turning to the forms of user touches on a touch-sensitive surface: as noted above, touches and user gestures may include acts that need not be instantaneous; e.g., a touch can include an act of moving or holding a finger against a display for a period of time. A touch data structure, however, defines the state of a touch (or, more generally, the state of any input source) at a particular time. Therefore, the values stored in a touch data structure may change over the course of a single touch, enabling the state of the single touch at different points in time to be conveyed to an application.
Each touch data structure can comprise various fields. In some embodiments, touch data structures may include data corresponding to at least the touch-specific fields 339 in Fig. 3B or the input source-specific fields 379 in Fig. 3C.
For example, a "first touch for view" field 345 in Fig. 3B (a "first touch for level" field 385 in Fig. 3C) can indicate whether the touch data structure defines the first touch for the particular view (since the software element implementing the view was instantiated). A "time stamp" field 346, 386 can indicate the particular time to which the touch data structure relates.
Optionally, an "info" field 347, 387 can be used to indicate whether a touch is a rudimentary gesture. For example, the "info" field 347, 387 can indicate whether the touch is a swipe and, if so, in which direction the swipe is oriented. A swipe is a quick drag of one or more fingers in a straight direction. API implementations (discussed below) can determine whether a touch is a swipe and pass that information to the application through the "info" field 347, 387, thus alleviating some data processing that the application would otherwise have had to perform if the touch were a swipe.
Optionally, a "tap count" field 348 in Fig. 3B (an "event count" field 388 in Fig. 3C) can indicate how many taps have been sequentially performed at the position of the initial touch. A tap can be defined as a quick pressing and lifting of a finger against a touch-sensitive panel at a particular position. Multiple sequential taps can occur if the finger is again pressed and released in quick succession at the same position of the panel. The event delivery system 122 can count taps and relay this information to an application through the "tap count" field 348. Multiple taps at the same location are sometimes considered to be a useful and easy-to-remember command for touch-enabled interfaces. Thus, by counting taps, the event delivery system 122 can again alleviate some data processing from the application.
A "phase" field 349, 389 can indicate a particular phase the touch-based gesture is currently in. The phase field 349, 389 can have various values, such as "touch phase began", which can indicate that the touch data structure defines a new touch that has not been referenced by previous touch data structures. A "touch phase moved" value can indicate that the touch being defined has moved from a prior position. A "touch phase stationary" value can indicate that the touch has stayed in the same position. A "touch phase ended" value can indicate that the touch has ended (e.g., the user has lifted his/her finger from the surface of a multi-touch display). A "touch phase cancelled" value can indicate that the touch has been cancelled by the device. A cancelled touch can be a touch that is not necessarily ended by the user, but which the device has determined to ignore. For example, the device can determine that the touch is being generated inadvertently (i.e., as a result of placing a portable multi-touch enabled device in one's pocket) and ignore the touch for that reason. Each value of the "phase" field 349, 389 can be an integer number.
Thus, each touch data structure can define what is happening with a respective touch (or other input source) at a particular time (e.g., whether the touch is stationary, being moved, etc.), as well as other information associated with the touch (such as position). Accordingly, each touch data structure can define the state of a particular touch at a particular point in time. One or more touch data structures referencing the same time can be added to a touch event data structure that can define the states of all touches a particular view is receiving at a moment in time (as noted above, some touch data structures may also reference touches that have ended and are no longer being received). Multiple touch event data structures can be sent to the software implementing a view as time passes, in order to provide the software with continuous information describing the touches that are happening in the view.
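Taken together, a touch data structure along these lines might be sketched as follows; the Swift types and the raw integer phase values are illustrative assumptions:

```swift
import Foundation    // TimeInterval
import CoreGraphics  // CGPoint

/// Touch phases per the "phase" field 349, 389; raw values are illustrative.
enum TouchPhase: Int {
    case began = 0, moved, stationary, ended, cancelled
}

/// Sketch of a per-touch data structure (cf. touch-specific fields 339).
struct TouchData {
    var isFirstTouchForView: Bool  // "first touch for view" field 345
    var timestamp: TimeInterval    // "time stamp" field 346
    var isSwipe: Bool              // derived from the "info" field 347
    var tapCount: Int              // "tap count" field 348
    var phase: TouchPhase          // "phase" field 349
    var location: CGPoint          // position associated with the touch
}

/// A touch event data structure conveys the states of all touches a view
/// is receiving at one moment in time.
struct TouchEvent {
    var timestamp: TimeInterval
    var touches: [TouchData]
}
```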
Fig. 3D is a block diagram illustrating exemplary components for event handling (e.g., event handling components 390) in accordance with some embodiments. In some embodiments, memory 111 (Fig. 1A) includes event recognizer global methods 312 and one or more applications (e.g., 133-1 through 133-3).
In some embodiments, event recognizer global methods 312 include event monitor 311, hit view determination module 314, active event recognizer determination module 316, and event dispatcher module 315. In some embodiments, event recognizer global methods 312 are located within event delivery system 122 (Fig. 1A). In some embodiments, event recognizer global methods 312 are implemented in operating system 118 (Fig. 1A). Alternatively, event recognizer global methods 312 are implemented in a respective application 133-1. In yet another embodiment, event recognizer global methods 312 are implemented as a stand-alone module, or as a part of another module stored in memory 111 (e.g., a contact/motion module (not depicted)).
Event monitor 311 receives event information from one or more sensors 116, touch-sensitive display 156, and/or one or more input devices 128. Event information includes information about an event (e.g., a user touch on touch-sensitive display 156, as part of a multi-touch gesture or a motion of device 102) and/or a sub-event (e.g., a movement of a touch across touch-sensitive display 156). For example, event information for a touch event includes one or more of: a location and a time stamp of the touch. Similarly, event information for a swipe event includes two or more of: a location, a time stamp, a direction, and a speed of the swipe. Sensors 116, touch-sensitive display 156, and input devices 128 send event information and sub-event information to event monitor 311 either directly, or through a peripherals interface that retrieves and stores event information. Sensors 116 include one or more of: a proximity sensor, accelerometer(s), a gyroscope, a microphone, and a video camera. In some embodiments, sensors 116 also include input devices 128 and/or touch-sensitive display 156.
In some embodiments, event monitor 311 sends requests to the sensors 116 and/or the peripherals interface at predetermined intervals. In response, the sensors 116 and/or the peripherals interface send event information. In other embodiments, the sensors 116 and/or the peripherals interface send event information only when there is a significant event (e.g., receiving an input beyond a predetermined noise threshold and/or for more than a predetermined duration).
Event monitor 311 receives event information and relays the event information to event dispatcher module 315. In some embodiments, event monitor 311 determines one or more respective applications (e.g., 133-1) to which the event information is to be delivered. In some embodiments, event monitor 311 also determines one or more respective application views 317 of the one or more respective applications to which the event information is to be delivered.
In some embodiments, event recognizer global methods 312 also include hit view determination module 314 and/or active event recognizer determination module 316.
Hit view determination module 314, if present, provides software procedures for determining where an event or sub-event has taken place within one or more views, when touch-sensitive display 156 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with a respective application (e.g., 133-1) is a set of views 317, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected may correspond to particular views within a view hierarchy of the application. For example, the lowest-level view in which a touch is detected may be called the hit view, and the set of events that are recognized as proper inputs may be determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 314 receives information related to events and/or sub-events. When an application has multiple views organized in a hierarchy, hit view determination module 314 identifies the hit view as the lowest view in the hierarchy that should handle the event or sub-event. In most circumstances, the hit view is the lowest-level view in which an initiating event or sub-event occurs (i.e., the first event or sub-event in the sequence of events and/or sub-events that form a gesture). Once the hit view is identified by the hit view determination module, the hit view typically receives all events and/or sub-events related to the same touch or input source for which it was identified as the hit view. However, the hit view is not always the sole view that receives all events and/or sub-events related to the same touch or input source for which it was identified as the hit view. Stated differently, in some embodiments, another application (e.g., 133-2) or another view of the same application also receives at least a subset of the events and/or sub-events related to the same touch or input source, regardless of whether a hit view has been determined for the touch or input source.
Active event recognizer determination module 316 determines which view or views within a view hierarchy should receive a particular sequence of events and/or sub-events. In some application contexts, active event recognizer determination module 316 determines that only the hit view should receive a particular sequence of events and/or sub-events. In other application contexts, active event recognizer determination module 316 determines that all views that include the physical location of an event or sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of events and/or sub-events. In other application contexts, even if touch events and/or sub-events are entirely confined to the area associated with one particular view, views higher in the hierarchy still remain actively involved views, and therefore the views higher in the hierarchy should receive the particular sequence of events and/or sub-events. Additionally or alternatively, active event recognizer determination module 316 determines which application or applications in a programmatic hierarchy should receive a particular sequence of events and/or sub-events. Thus, in some embodiments, active event recognizer determination module 316 determines that only a respective application in the programmatic hierarchy should receive the particular sequence of events and/or sub-events. In some embodiments, active event recognizer determination module 316 determines that a plurality of applications in the programmatic hierarchy should receive the particular sequence of events and/or sub-events.
Event dispatcher module 315 dispatches the event information to an event recognizer (also called herein a "gesture recognizer") (e.g., event recognizer 325-1). In embodiments including active event recognizer determination module 316, event dispatcher module 315 delivers the event information to the event recognizers determined by active event recognizer determination module 316. In some embodiments, event dispatcher module 315 stores the event information in an event queue, from which it is retrieved by a respective event recognizer 325 (or by an event receiver 3031 in a respective event recognizer 325).
In some embodiments, a respective application (e.g., 133-1) includes application internal state 321, which indicates the current application view(s) displayed on touch-sensitive display 156 when the application is active or executing. In some embodiments, device/global internal state 134 (Fig. 1C) is used by event recognizer global methods 312 to determine which application(s) is (are) currently active, and application internal state 321 is used by event recognizer global methods 312 to determine the application views 317 to which event information is to be delivered.
In some embodiments, application internal state 321 includes additional information, such as one or more of: resume information to be used when application 133-1 resumes execution; user interface state information indicating information being displayed, or ready for display, by application 133-1; a state queue for enabling the user to go back to a prior state or view of application 133-1; and a redo/undo queue of previous actions taken by the user. In some embodiments, application internal state 321 further includes contextual information/text and metadata 323.
In some embodiments, application 133-1 includes one or more application views 317, each of which has corresponding instructions for handling touch events that occur within a respective view of the application's user interface (e.g., a corresponding event handler 319). At least one application view 317 of application 133-1 includes one or more event recognizers 325. Typically, a respective application view 317 includes a plurality of event recognizers 325. In other embodiments, one or more of event recognizers 325 are part of a separate module, such as a user interface kit (not shown) or a higher-level object from which application 133-1 inherits methods and other properties. In some embodiments, a respective application view 317 also includes one or more of: a data updater, an object updater, a GUI updater, and/or event data received.
A respective application (e.g., 133-1) also includes one or more event handlers 319. Typically, a respective application (e.g., 133-1) includes a plurality of event handlers 319.
A respective event recognizer 325-1 receives event information from event dispatcher module 315 (directly or indirectly through application 133-1), and identifies an event from the event information. Event recognizer 325-1 includes event receiver 3031 and event comparator 3033.
The event information includes information about an event (e.g., a touch) or a sub-event (e.g., a touch movement). Depending on the event or sub-event, the event information also includes additional information, such as a location of the event or sub-event. When the event or sub-event concerns the motion of a touch, the event information may also include a speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called the device attitude) of the device.
Event comparator 3033 compares the event information with one or more predefined gesture definitions (also called herein "event definitions") and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 3033 includes one or more gesture definitions 3035 (as noted above, also called herein "event definitions"). Gesture definitions 3035 contain definitions of gestures (e.g., predefined sequences of events and/or sub-events), for example, gesture 1 (3037-1), gesture 2 (3037-2), and others. In some embodiments, sub-events in gesture definitions 3035 include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touches. In one example, the definition of gesture 1 (3037-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch on the displayed object for a predefined phase of the gesture (touch begin), a first lift-off for a next predefined phase of the gesture (touch end), a second touch on the displayed object for a subsequent predefined phase of the gesture (touch begin), and a second lift-off for a final predefined phase of the gesture (touch end). In another example, the definition of gesture 2 (3037-2) includes a drag on a displayed object. The drag, for example, comprises a touch (or contact) on the displayed object, a movement of the touch across touch-sensitive display 156, and a lift-off of the touch (touch end).
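As a concrete illustration of a gesture definition being a predefined sub-event sequence, here is a minimal double-tap matcher in the spirit of event comparator 3033, reusing the SubEvent enum from the earlier sketch; representing a definition as a literal array of sub-events is a simplifying assumption:

```swift
/// A gesture definition expressed as the exact sub-event sequence it
/// expects, cf. gesture 1 (3037-1): begin, end, begin, end (a double tap).
struct GestureDefinition {
    let name: String
    let sequence: [SubEvent]
}

let doubleTapDefinition = GestureDefinition(
    name: "double tap",
    sequence: [.touchBegan, .touchEnded, .touchBegan, .touchEnded])

/// Compares incoming sub-events against a definition, one at a time.
final class SequenceComparator {
    let definition: GestureDefinition
    private var index = 0
    init(definition: GestureDefinition) { self.definition = definition }

    /// Returns true once the full sequence has been matched.
    func consume(_ e: SubEvent) -> Bool {
        guard index < definition.sequence.count,
              definition.sequence[index] == e else {
            index = 0   // mismatch: the comparison fails for this gesture
            return false
        }
        index += 1
        return index == definition.sequence.count
    }
}
```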
In some embodiments, event recognizer 325-1 also includes information for event delivery 3039. Information for event delivery 3039 includes a reference to a corresponding event handler 319. Optionally, information for event delivery 3039 includes action-target pair(s). In some embodiments, in response to recognizing a gesture (or a part of a gesture), event information (e.g., action message(s)) is sent to the one or more targets identified by the action-target pair(s). In other embodiments, in response to recognizing a gesture (or a part of a gesture), the action-target pair(s) are activated.
In some embodiments, gesture definitions 3035 include a definition of a gesture for a respective user-interface object. In some embodiments, event comparator 3033 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display 156, when a touch is detected on touch-sensitive display 156, event comparator 3033 performs a hit test to determine which of the three user-interface objects, if any, is associated with the touch (event). If each displayed object is associated with a respective event handler 319, event comparator 3033 uses the result of the hit test to determine which event handler 319 should be activated. For example, event comparator 3033 selects the event handler 319 associated with the event and the object triggering the hit test.
In some embodiments, the respective gesture definition 3037 for a respective gesture also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of events and/or sub-events does or does not correspond to the event recognizer's event type.
When a respective event recognizer 325-1 determines that a series of events and/or sub-events does not match any of the events in gesture definitions 3035, the respective event recognizer 325-1 enters an event failed state, after which the respective event recognizer 325-1 disregards subsequent events and/or sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process events and/or sub-events of the ongoing touch-based gesture.
In some embodiments, when no event recognizer for the hit view remains, the event information is sent to one or more event recognizers in a higher view in the view hierarchy. Alternatively, when no event recognizer for the hit view remains, the event information is disregarded. In some embodiments, when no event recognizer for the views in the view hierarchy remains, the event information is sent to one or more event recognizers at a higher programmatic level in the programmatic hierarchy. Alternatively, when no event recognizer for the views in the view hierarchy remains, the event information is disregarded.
In some embodiments, a respective event recognizer 325-1 includes event recognizer state 334. Event recognizer state 334 includes a state of the respective event recognizer 325-1. Examples of event recognizer states are described in more detail below with reference to Figs. 4A-4B and 5A-5C.
In some embodiments, event recognizer state 334 includes recognizer metadata and properties 3043. In some embodiments, recognizer metadata and properties 3043 include one or more of the following: A) configurable properties, flags, and/or lists that indicate how the event delivery system should perform event and/or sub-event delivery to actively involved event recognizers; B) configurable properties, flags, and/or lists that indicate how event recognizers interact with one another; C) configurable properties, flags, and/or lists that indicate how event recognizers receive event information; D) configurable properties, flags, and/or lists that indicate how event recognizers may recognize a gesture; E) configurable properties, flags, and/or lists that indicate whether events and/or sub-events are delivered to varying levels in the view hierarchy; and F) references to corresponding event handlers 319.
In some embodiments, event recognizer state 334 includes event/touch metadata 3045. Event/touch metadata 3045 includes event/touch information about a respective event/touch that has been detected and that corresponds to a respective gesture definition 3037 of gesture definitions 3035. The event/touch information includes one or more of: a location, a time stamp, a speed, a direction, a distance, a scale (or change in scale), and an angle (or change in angle) of the respective event/touch.
In some embodiments, a respective event recognizer 325 activates the event handler 319 associated with the respective event recognizer 325 when one or more particular events and/or sub-events of a gesture are recognized. In some embodiments, the respective event recognizer 325 delivers event information associated with the event to event handler 319.
Event handler 319, when activated, performs one or more of: creating and/or updating data, creating and updating objects, and preparing display information and sending it for display on display 126 or touch-sensitive display 156.
In some embodiments, a respective application view 317-2 includes view metadata 341. As described above with respect to Fig. 3B, view metadata 341 includes data regarding a view. Optionally, view metadata 341 includes one or more of: stop property 342, skip property 343, no-hit skip property 344, and other view metadata 329.
In some embodiments, a first actively involved view within the view hierarchy may be configured to prevent delivery of a respective sub-event to the event recognizers associated with that first actively involved view. This behavior can implement the skip property 343. When the skip property is set for an application view, delivery of the respective sub-event is still performed for event recognizers associated with other actively involved views in the view hierarchy.
Alternatively, a first actively involved view within the view hierarchy may be configured to prevent delivery of a respective sub-event to event recognizers associated with that first actively involved view unless the first actively involved view is the hit view. This behavior can implement the conditional no-hit skip property 344.
In some embodiments, a second actively involved view within the view hierarchy is configured to prevent delivery of a respective sub-event to event recognizers associated with the second actively involved view and to event recognizers associated with ancestors of the second actively involved view. This behavior can implement the stop property 342.
Fig. 3E is a block diagram illustrating exemplary classes and instances of gesture recognizers (e.g., event handling components 390) in accordance with some embodiments.
A software application (e.g., application 133-1) has one or more event recognizers 3040. In some embodiments, a respective event recognizer (e.g., 3040-2) is an event recognizer class. The respective event recognizer (e.g., 3040-2) includes event recognizer-specific code 338 (e.g., a set of instructions defining the operation of event recognizers) and state machine 340.
In some embodiments, application state 321 of a software application (e.g., application 133-1) includes instances of event recognizers. Each instance of an event recognizer is an object having a state (e.g., event recognizer state 334). "Execution" of a respective event recognizer instance is performed by executing the corresponding event recognizer-specific code (e.g., 338) and updating or maintaining the state 334 of the event recognizer instance 3047. The state 334 of event recognizer instance 3047 includes the state 3038 of the event recognizer instance's state machine 340.
In some embodiments, application state 321 includes a plurality of event recognizer instances 3047. A respective event recognizer instance 3047 typically corresponds to an event recognizer that has been bound (also called "attached") to a view of the application. In some embodiments, one or more event recognizer instances 3047 are bound to a respective application in a programmatic hierarchy without reference to any particular view of the respective application. In some embodiments, application state 321 includes a plurality of instances (e.g., 3047-1 to 3047-L) of a respective event recognizer (e.g., 3040-2). In some embodiments, application state 321 includes instances 3047 of a plurality of event recognizers (e.g., 3040-1 to 3040-R).
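The class-versus-instance distinction can be sketched as follows, reusing the View, RecognizerState and SubEvent types from the earlier sketches; the attachment API is hypothetical:

```swift
/// The recognizer class supplies the shared recognizer-specific code
/// (cf. 338) and the state machine definition (cf. 340); each instance is
/// a separate object carrying its own state (cf. 334).
class GestureRecognizer {
    var state: RecognizerState = .possible  // state machine state 3038
    weak var attachedView: View?            // view hierarchy reference 336

    /// Recognizer-specific code: subclasses override to consume sub-events.
    func handle(_ subEvent: SubEvent) {}
}

final class DoubleTapRecognizer: GestureRecognizer {}

// Two instances of one class, bound to different views, each with its own
// independent state held in application state 321.
let mapView = View()
let panelView = View()
let instanceA = DoubleTapRecognizer()   // cf. instance 3047-1
let instanceB = DoubleTapRecognizer()   // cf. instance 3047-2
instanceA.attachedView = mapView
instanceB.attachedView = panelView
```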
In some embodiments, a respective instance 3047-2 of gesture recognizer 3040 includes event recognizer state 334. As discussed above, in some embodiments, event recognizer state 334 includes recognizer metadata and properties 3043 and event/touch metadata 3045. In some embodiments, event recognizer state 334 also includes view hierarchy reference(s) 336, indicating to which view the respective instance 3047-2 of gesture recognizer 3040-2 is attached.
In some embodiments, recognizer metadata and properties 3043 include the following, or a subset or superset thereof:
exclusivity flag 324;
exclusivity exception list 326;
wait-for list 327;
delay touch began flag 328;
delay touch ended flag 330; and
touch cancellation flag 332.
In some embodiments, one or more event recognizers may be adapted to delay delivering one or more sub-events of a sequence of sub-events until after the event recognizer recognizes the event. This behavior reflects a delayed event. For example, consider a single tap gesture in a view for which a multiple tap gesture is also possible. In that case, a tap event becomes a "tap + delay" recognizer. In essence, when an event recognizer implements this behavior, the event recognizer will delay event recognition until it is certain that the sequence of sub-events does in fact correspond to its event definition. This behavior may be appropriate when a receiving view is incapable of appropriately responding to cancelled events. In some embodiments, an event recognizer will delay updating its event recognition status to its respective actively involved view until the event recognizer is certain that the sequence of sub-events does not correspond to its event definition. The delay touch began flag 328, delay touch ended flag 330, and touch cancellation flag 332 are provided to tailor sub-event delivery techniques, as well as event recognizer and view status information updates, to specific needs.
In some embodiments, recognizer metadata and properties 3043 include the following, or a subset or superset thereof (a sketch of the action-target pair follows this list):
A state machine state/phase 3038, which indicates the state of a state machine (e.g., 340) for the respective event recognizer instance (e.g., 3047-2); state machine state/phase 3038 can have various state values, such as "event possible", "event recognized", "event failed", and others, as described below; alternatively or additionally, state machine state/phase 3038 can have various phase values, such as "touch phase began", which can indicate that the touch data structure defines a new touch that has not been referenced by previous touch data structures; a "touch phase moved" value can indicate that the touch being defined has moved from a prior position; a "touch phase stationary" value can indicate that the touch has stayed in the same position; a "touch phase ended" value can indicate that the touch has ended (e.g., the user has lifted his/her finger from the surface of a multi-touch display); a "touch phase cancelled" value can indicate that the touch has been cancelled by the device; a cancelled touch can be a touch that is not necessarily ended by the user, but which the device has determined to ignore; for example, the device can determine that the touch is being generated inadvertently (i.e., as a result of placing a portable multi-touch enabled device in one's pocket) and ignore the touch for that reason; each value of state machine state/phase 3038 can be an integer number (called herein a "gesture recognizer state value");
action-target pair(s) 3051, where each pair identifies a target to which the respective event recognizer instance sends the identified action message in response to recognizing an event or touch as a gesture, or as a part of a gesture;
delegate 3053, which is a reference to a corresponding delegate when a delegate is assigned to the respective event recognizer instance; when a delegate is not assigned to the respective event recognizer instance, delegate 3053 contains a null value; and
enabled property 3055, indicating whether the respective event recognizer instance is enabled; in some embodiments, when the respective event recognizer instance is not enabled (e.g., disabled), the respective event recognizer instance does not process events or touches.
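A short sketch of the action-target pair idea; the closure stands in for whatever action-message dispatch an implementation would actually use, and all names here are illustrative:

```swift
/// An action-target pair (cf. 3051): on recognizing an event or touch as a
/// gesture (or part of one), the recognizer sends the identified action
/// message to the identified target.
struct ActionTargetPair {
    let target: AnyObject
    let action: (AnyObject) -> Void   // stand-in for an action message
}

final class MapGestureHandler {
    func didDoubleTap() { print("zoom in") }
}

let handler = MapGestureHandler()
let pair = ActionTargetPair(target: handler) { target in
    (target as? MapGestureHandler)?.didDoubleTap()
}
// Fired by the recognizer instance upon recognition:
pair.action(pair.target)
```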
In some embodiments, exception list 326 can also be used by non-exclusive event recognizers. In particular, when a non-exclusive event recognizer recognizes an event or sub-event, subsequent events and/or sub-events are not delivered to the exclusive event recognizers associated with the currently active views, except for those exclusive event recognizers listed in the exception list 326 of the event recognizer that recognized the event or sub-event.
In some embodiments, event recognizers can be configured to utilize the touch cancellation flag 332 in conjunction with the delay touch ended flag 330 to prevent unwanted events and/or sub-events from being delivered to the hit view. For example, the definition of a single tap gesture and the first half of a double tap gesture are identical. Once a single tap event recognizer successfully recognizes a single tap, an undesired action could take place. If the delay touch ended flag is set, the single tap event recognizer is prevented from sending sub-events to the hit view until a single tap event is recognized. In addition, the wait-for list of the single tap event recognizer can identify the double tap event recognizer, thereby preventing the single tap event recognizer from recognizing a single tap until the double tap event recognizer has entered the event impossible state. The use of the wait-for list avoids the execution of actions associated with a single tap when a double tap gesture is performed. Instead, only actions associated with a double tap will be executed, in response to recognition of the double tap event.
Turning again to the forms of user touches on a touch-sensitive surface: as noted above, touches and user gestures may include acts that need not be instantaneous; e.g., a touch can include an act of moving or holding a finger against a display for a period of time. A touch data structure, however, defines the state of a touch (or, more generally, the state of any input source) at a particular time. Therefore, the values stored in a touch data structure may change over the course of a single touch, enabling the state of the single touch at different points in time to be conveyed to an application.
Each touch data structure can comprise various entries. In some embodiments, touch data structures may include data corresponding to at least the touch-specific entries in event/touch metadata 3045, such as the following, or a subset or superset thereof:
"first touch for view" entry 345;
"per touch info" entry 3051, including "time stamp" information indicating the particular time to which the touch data structure relates (e.g., the time of the touch); optionally, the "per touch info" entry 3051 includes other information, such as a location of the corresponding touch; and
optional "tap count" entry 348.
Thus, each touch data structure can define what is happening with a respective touch (or other input source) at a particular time (e.g., whether the touch is stationary, being moved, etc.), as well as other information associated with the touch (such as position). Accordingly, each touch data structure can define the state of a particular touch at a particular moment in time. One or more touch data structures referencing the same time can be added to a touch event data structure that can define the states of all touches a particular view is receiving at a moment in time (as noted above, some touch data structures may also reference touches that have ended and are no longer being received). Multiple touch event data structures can be sent to the software implementing a view as time passes, in order to provide the software with continuous information describing the touches that are happening in the view.
The ability to handle complex touch-based gestures, optionally including multi-touch gestures, can add complexity to the various software applications. In some cases, such additional complexity can be necessary to implement advanced and desirable interface features. For example, a game may require the ability to handle multiple simultaneous touches that occur in different views, as games often require the pressing of multiple buttons at the same time, or combining accelerometer data with touches on a touch-sensitive surface. However, some simpler applications and/or views need not require advanced interface features. For example, a simple soft button (i.e., a button that is displayed on a touch-sensitive display) may operate satisfactorily with single touches, rather than with multi-touch functionality. In these cases, the underlying OS may send unnecessary or excessive touch data (e.g., multi-touch data) to a software component associated with a view that is intended to be operable by single touches only (e.g., a single touch or tap on a soft button). Because the software component may need to process this data, it may need to feature all the complexity of a software application that handles multiple touches, even though it is associated with a view for which only single touches are relevant. This can increase the cost of development of software for the device, because software components that have traditionally been easy to program in a mouse interface environment (i.e., various buttons, etc.) may be much more complex in a multi-touch environment.
In order to reduce the complexity in recognizing complex touch-based gestures, delegates can be used to control the behavior of event recognizers in accordance with some embodiments. As described below, delegates can determine, for example, whether a corresponding event recognizer (or gesture recognizer) can receive the event (e.g., touch) information; whether the corresponding event recognizer (or gesture recognizer) can transition from an initial state (e.g., the event possible state) of the state machine to another state; and/or whether the corresponding event recognizer (or gesture recognizer) can simultaneously recognize the event (e.g., a touch) as a corresponding gesture, without blocking other event recognizer(s) (or gesture recognizer(s)) from recognizing the event and without being blocked by other event recognizer(s) (or gesture recognizer(s)) recognizing the event.
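Such a delegate might be sketched as a protocol along the following lines; the method names are invented for illustration (they parallel, but are not, any particular framework API), and GestureRecognizer and SubEvent come from the earlier sketches:

```swift
/// Hypothetical delegate (cf. 3053) controlling recognizer behavior.
protocol GestureRecognizerDelegate: AnyObject {
    /// May the recognizer receive this event (touch) information at all?
    func shouldReceive(_ subEvent: SubEvent,
                       recognizer: GestureRecognizer) -> Bool

    /// May the recognizer leave the initial event possible state?
    func shouldBegin(_ recognizer: GestureRecognizer) -> Bool

    /// May both recognizers recognize the same event simultaneously,
    /// neither blocking the other?
    func shouldRecognizeSimultaneously(_ recognizer: GestureRecognizer,
                                       with other: GestureRecognizer) -> Bool
}
```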
It shall be understood, however, that the foregoing discussion regarding the complexity of evaluating and processing user touches on touch-sensitive surfaces also applies to all forms of user input for operating electronic device 102 with input devices 128, not all of which are initiated on touch screens, e.g., coordinating mouse movements and mouse button presses with or without single or multiple keyboard presses or holds, device rotations or other movements, user movements on touch pads such as taps, drags, scrolls, etc., pen stylus inputs, oral instructions, detected eye movements, biometric inputs, detected physiological changes in a user, and/or any combination thereof, all of which may be utilized as inputs corresponding to events and/or sub-events that define an event to be recognized.
Turning to the flow of event information, Fig. 3F is a block diagram illustrating the flow of event information in accordance with some embodiments. Event dispatcher module 315 (e.g., in operating system 118 or application software 124) receives event information and sends the event information to one or more applications (e.g., 133-1 and 133-2). In some embodiments, application 133-1 includes a plurality of views (e.g., 508, 510, and 512, corresponding to views 317 in Fig. 3D) in view hierarchy 506, as well as a plurality of gesture recognizers (516-1 through 516-3) in the plurality of views. Application 133-1 also includes one or more gesture handlers 550 that correspond to target values in target-action pairs (e.g., 552-1 and 552-2). In some embodiments, event dispatcher module 315 receives hit view information from hit view determination module 314 and sends event information to the hit view (e.g., 512) or to event recognizers attached to the hit view (e.g., 516-1 and 516-2). Additionally or alternatively, event dispatcher module 315 receives hit level information from hit level determination module 352 and sends event information to applications in the hit level (e.g., 133-1 and 133-2) or to one or more event recognizers in the hit level applications (e.g., 516-4). In some embodiments, one of the applications receiving the event information is a default application (e.g., 133-2 may be a default application). In some embodiments, only a subset of the gesture recognizers in each receiving application is allowed to (or configured to) receive the event information. For instance, gesture recognizer 516-3 in application 133-1 does not receive the event information. Gesture recognizers that receive event information are called herein receiving gesture recognizers. In Fig. 3F, receiving gesture recognizers 516-1, 516-2, and 516-4 receive the event information, and compare the received event information with their respective gesture definitions 3037 (Fig. 3D). In Fig. 3F, gesture recognizers 516-1 and 516-4 have respective gesture definitions that match the received event information, and send respective action messages (e.g., 518-1 and 518-2) to corresponding gesture handlers (e.g., 552-1 and 552-3).
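The dispatch just described can be approximated with the following sketch. The names and the string-based event kinds are invented for illustration; the point is that only recognizers configured to receive event information compare it against their gesture definitions, and only matching recognizers emit action messages:

```swift
// A sketch of the Fig. 3F dispatch, with invented names.
struct EventInfo { let kind: String }

struct GestureRecognizer {
    let name: String
    let definition: String     // stands in for a full gesture definition
    var receivesEvents: Bool   // only a subset of recognizers receive events
    func matches(_ e: EventInfo) -> Bool { e.kind == definition }
}

func dispatch(_ e: EventInfo, to recognizers: [GestureRecognizer]) -> [String] {
    recognizers
        .filter { $0.receivesEvents && $0.matches(e) }
        .map { "\($0.name) -> action message to its gesture handler" }
}

let recognizers = [
    GestureRecognizer(name: "516-1", definition: "tap",   receivesEvents: true),
    GestureRecognizer(name: "516-2", definition: "swipe", receivesEvents: true),
    GestureRecognizer(name: "516-3", definition: "tap",   receivesEvents: false),
]
print(dispatch(EventInfo(kind: "tap"), to: recognizers))
// ["516-1 -> action message to its gesture handler"]
```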
Fig. 4A depicts an event recognizer state machine 400 containing four states. By managing state transitions in event recognizer state machine 400 based on received subevents, an event recognizer effectively expresses an event definition. For example, a tap gesture may be effectively defined by a sequence of two, or optionally three, subevents. First, a touch should be detected, and this will be subevent 1. For example, the touch subevent may be a user's finger touching a touch-sensitive surface in a view that includes the event recognizer having state machine 400. Second, an optionally measured delay, where the touch does not substantially move in any given direction (e.g., any movement of the touch position is less than a predefined threshold, which may be measured as a distance (e.g., 5 mm) or as a number of pixels (e.g., 5 pixels) on the display), and the delay is sufficiently short, will serve as subevent 2. Finally, termination of the touch (e.g., liftoff of the user's finger from the touch-sensitive surface) will serve as subevent 3. By coding event recognizer state machine 400 to transition between states based upon receiving these subevents, event recognizer state machine 400 effectively expresses a tap gesture event definition. It should be noted, however, that the states depicted in Fig. 4A are exemplary states, and event recognizer state machine 400 may contain more or fewer states, and/or each state in event recognizer state machine 400 may correspond to any one of the depicted states or any other state.
In some embodiments, regardless of event type, event recognizer state machine 400 begins in event recognition begins state 405, and may proceed to any of the remaining states depending on which subevents are received. To facilitate discussion of event recognizer state machine 400, the direct paths from event recognition begins state 405 to event recognized state 415, event possible state 410, and event impossible state 420 will be discussed, followed by a description of the paths leading from event possible state 410.
Starting from event recognition begins state 405, if a subevent is received that, by itself, comprises the event definition for an event, event recognizer state machine 400 will transition to event recognized state 415.

Starting from event recognition begins state 405, if a subevent is received that is not the first subevent in an event definition, event recognizer state machine 400 will transition to event impossible state 420.

Starting from event recognition begins state 405, if a subevent is received that is the first subevent in a given event definition but not the last, event recognizer state machine 400 will transition to event possible state 410. If the next subevent received is the second subevent, but not the last subevent, in the given event definition, event recognizer state machine 400 will remain in event possible state 410. Event recognizer state machine 400 can remain in event possible state 410 for as long as the sequence of received subevents continues to be part of the event definition. If, at any time while event recognizer state machine 400 is in event possible state 410, event recognizer state machine 400 receives a subevent that is not part of the event definition, it will transition to event impossible state 420, thereby determining that the current event (if any) is not the type of event that corresponds to this event recognizer (that is, the event recognizer corresponding to state machine 400). If, on the other hand, event recognizer state machine 400 is in event possible state 410 and receives the last subevent in the event definition, it will transition to event recognized state 415, completing a successful event recognition.
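These transitions can be illustrated with a small sketch of the tap definition discussed above. The state names mirror Fig. 4A; everything else (the enum cases, and treating the optional delay as a required subevent for brevity) is invented for this sketch:

```swift
// A sketch of the four-state machine of Fig. 4A driving a tap definition
// (touch down, short delay, liftoff).
enum RecognizerState { case begins, possible, recognized, impossible }
enum Subevent { case touchDown, shortDelay, bigMove, touchUp }

struct TapStateMachine {
    private let definition: [Subevent] = [.touchDown, .shortDelay, .touchUp]
    private var matched = 0
    private(set) var state: RecognizerState = .begins

    mutating func consume(_ s: Subevent) {
        guard state == .begins || state == .possible else { return }
        if matched < definition.count, s == definition[matched] {
            matched += 1
            // The last subevent of the definition completes recognition.
            state = (matched == definition.count) ? .recognized : .possible
        } else {
            state = .impossible   // subevent is not part of the definition
        }
    }
}

var tap = TapStateMachine()
[Subevent.touchDown, .shortDelay, .touchUp].forEach { tap.consume($0) }
print(tap.state)       // recognized

var aborted = TapStateMachine()
[Subevent.touchDown, .bigMove].forEach { aborted.consume($0) }
print(aborted.state)   // impossible
```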
Fig. 4B depicts an embodiment of input source handling process 440, which has a finite state machine representing how a view receives information about a respective input. It is noted that when there are multiple touches on the touch-sensitive surface of a device, each of the touches is a separate input source having its own finite state machine. In this embodiment, input source handling process 440 includes four states: input sequence begin 445, input sequence continues 450, input sequence ended 455, and input sequence cancelled 460. Input source handling process 440 may be used by a respective event recognizer, for example, when input is to be delivered to an application, but only after the completion of an input sequence is detected. Input source handling process 440 can be used with an application that is incapable of cancelling or undoing changes made in response to input sequences delivered to the application. It should be noted that the states depicted in Fig. 4B are exemplary states; input source handling process 440 may contain more or fewer states, and/or each state in input source handling process 440 may correspond to any one of the depicted states or any other state.
Starting from input sequence begin 445, if an input is received that, by itself, completes an input sequence, input source handling process 440 will transition to input sequence ended 455.

Starting from input sequence begin 445, if an input is received that indicates the input sequence terminated, input source handling process 440 will transition to input sequence cancelled 460.

Starting from input sequence begin 445, if an input is received that is the first input in an input sequence but not the last, input source handling process 440 will transition to input sequence continues state 450. If the next input received is the second input in the input sequence, input source handling process 440 will remain in input sequence continues state 450. Input source handling process 440 can remain in input sequence continues state 450 for as long as the sequence of subevents being delivered continues to be part of a given input sequence. If, at any time while input source handling process 440 is in input sequence continues state 450, input source handling process 440 receives an input that is not part of the input sequence, it will transition to input sequence cancelled state 460. If, on the other hand, input source handling process 440 is in input sequence continues state 450 and receives the last input in a given input definition, it will transition to input sequence ended 455, thereby successfully receiving a group of subevents.
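A corresponding sketch of input source handling process 440 (again with invented names, and with the classification of each input supplied by the caller) might look like this:

```swift
// A sketch of the four-state input source process of Fig. 4B. One instance
// would track one input source (e.g., one touch).
enum InputSourceState { case begin, continues, ended, cancelled }

struct InputSourceProcess {
    private(set) var state: InputSourceState = .begin

    mutating func consume(partOfSequence: Bool, isLast: Bool) {
        switch state {
        case .begin, .continues:
            if !partOfSequence { state = .cancelled }    // stray input cancels
            else if isLast     { state = .ended }        // sequence completed
            else               { state = .continues }    // keep accumulating
        case .ended, .cancelled:
            break                                        // terminal states
        }
    }
}

var touchSource = InputSourceProcess()
touchSource.consume(partOfSequence: true, isLast: false)
touchSource.consume(partOfSequence: true, isLast: true)
print(touchSource.state)   // ended
```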
In some embodiments, input source handling process 440 may be implemented for a particular view or programmatic level. In that case, certain sequences of subevents may result in a transition to input cancelled state 460.
As an example, consider Fig. 4C, which supposes an actively involved view represented only by actively involved view input source handler 480 (hereafter "view 480"). View 480 includes a vertical swipe event recognizer, represented by vertical swipe event recognizer 468 (hereafter "recognizer 468"), as only one of its event recognizers. In this case, recognizer 468 may require as part of its definition detecting: 1) a finger down 465-1; 2) an optional short delay 465-2; 3) a vertical swipe of at least N pixels 465-3; and 4) a finger liftoff 465-4.
For this example, recognizer 468 also has its delay touch began flag 328 and touch cancellation flag 332 set. Now consider delivery of the following sequence of subevents to recognizer 468, as well as to view 480:
Subevent sequence 465-1: detect finger down, which corresponds to recognizer 468's event definition
Subevent sequence 465-2: measure delay, which corresponds to recognizer 468's event definition
Subevent sequence 465-3: the finger performs a vertical swiping movement, which is compatible with vertical scrolling, but is less than N pixels, and therefore does not correspond to recognizer 468's event definition
Subevent sequence 465-4: detect finger liftoff, which corresponds to recognizer 468's event definition
Here, recognizer 468 would successfully recognize subevents 1 and 2 as part of its event definition, and accordingly would be in event possible state 472 immediately prior to the delivery of subevent 3. Since recognizer 468 has its delay touch began flag 328 set, the initial touch subevent is not sent to the hit view. Correspondingly, view 480's input source handling process 440 would still be in the input sequence begin state immediately prior to the delivery of subevent 3.
Once delivery of subevent 3 to recognizer 468 is complete, the state of recognizer 468 transitions to event impossible 476, and, importantly, recognizer 468 has now determined that the sequence of subevents does not correspond to its specific vertical swipe gesture event type (that is, it has decided the event is not a vertical swipe; in other words, recognition 474 as a vertical swipe does not occur in this example). Input source handling system 440 for view input source handler 480 will also update its state. In some embodiments, the state of view input source handler 480 would proceed from input sequence begin state 482 to input sequence continues state 484 when the event recognizer sends status information indicating that it has begun recognizing an event. When the touch or input ends without an event having been recognized, and the event recognizer's touch cancellation flag 332 has been set, view input source handler 480 proceeds to input sequence cancelled state 488. Alternatively, if the event recognizer's touch cancellation flag 332 has not been set, view input source handler 480 proceeds to input sequence ended state 486 when the touch or input ends.
Since event recognizer 468's touch cancellation flag 332 is set, when event recognizer 468 transitions to event impossible state 476, the recognizer will send a touch cancellation subevent or message to the hit view corresponding to the event recognizer. As a result, view input source handler 480 will transition to input sequence cancelled state 488.
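The combined effect of these two flags on what the hit view sees can be sketched as follows (names invented; the message list stands in for the subevents actually delivered to the view):

```swift
// A sketch of how the delay touch began and touch cancellation flags change
// the messages a hit view receives when recognition fails.
struct RecognizerFlags {
    var delaysTouchBegan = false   // hold the touch-down back from the view
    var cancelsTouches = false     // send a cancel if recognition fails
}

enum ViewMessage { case touchBegan, touchCancelled, touchEnded }

func messagesForFailedRecognition(_ flags: RecognizerFlags) -> [ViewMessage] {
    var out: [ViewMessage] = []
    // Without the delay flag, the view saw the initial touch immediately.
    if !flags.delaysTouchBegan { out.append(.touchBegan) }
    // On the transition to Event Impossible, a set cancellation flag yields a
    // cancel subevent, driving the view's input source process to Input
    // Sequence Cancelled; otherwise the view simply sees the touch end.
    out.append(flags.cancelsTouches ? .touchCancelled : .touchEnded)
    return out
}

print(messagesForFailedRecognition(
    RecognizerFlags(delaysTouchBegan: true, cancelsTouches: true)))
// a single touchCancelled message — view 480's outcome in this example
```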
In some embodiments, delivery of subevent 465-4 is not germane to any event recognition decisions made by recognizer 468, though view input source handler 480's other event recognizers, if any, may continue to analyze the sequence of subevents.
The following table presents, in summarized tabular format, the processing of this exemplary subevent sequence 465 in relation to the state of event recognizer 468 described above, along with the state of view input source handler 480. In this example, the state of view input source handler 480 proceeds from input sequence begin 445 to input sequence cancelled 488 because recognizer 468's touch cancellation flag 332 was set:
Subevent Sequence 465              | State: Recognizer 468        | State: View 480
before delivery starts             | Event Recognition Begins 470 |
detect finger down 465-1           | Event Possible 472           | Input Sequence Begin 482
measure delay 465-2                | Event Possible 472           | Input Sequence Continues 484
detect finger vertical swipe 465-3 | Event Impossible 476         | Input Sequence Continues 484
detect finger liftoff 465-4        | Event Impossible 476         | Input Sequence Cancelled 488
Turning to Fig. 5A, attention is directed to an example of subevent sequence 520, which is being received by a view that includes a plurality of event recognizers. For this example, two event recognizers are depicted in Fig. 5A: scrolling event recognizer 580 and tap event recognizer 590. For purposes of illustration, view search results panel 304 in Fig. 3A will be related to the reception of subevent sequence 520 and the state transitions in scrolling event recognizer 580 and tap event recognizer 590. Note that in this example, subevent sequence 520 defines a tap finger gesture on a touch-sensitive display or trackpad, but the same event recognition technique is applicable in a myriad of contexts (e.g., detecting mouse button presses) and/or in embodiments utilizing programmatic hierarchies of programmatic levels.
Before the first subevent is delivered to view search results panel 304, event recognizers 580 and 590 are in event recognition begins states 582 and 592, respectively. Following touch 301, which is delivered as detect finger down subevent 521-1 to the actively involved event recognizers for view search results panel 304 as touch subevent 301-2 (as well as to the actively involved event recognizers for map view 305 as touch subevent 301-3), scrolling event recognizer 580 transitions to event possible state 584, and similarly, tap event recognizer 590 transitions to event possible state 594. This is because the event definitions of both a tap and a scroll begin with a touch, such as detecting a finger down on a touch-sensitive surface.
Some definitions of tap and scroll gestures may optionally include a delay in the event definition between the initial touch and any next step. In all of the examples discussed here, the exemplary event definitions for both the tap and the scroll gestures recognize a delay subevent following the first touch subevent (detect finger down).
Accordingly, as measure delay subevent 521-2 is delivered to event recognizers 580 and 590, both remain in event possible states 584 and 594, respectively.
Finally, detect finger liftoff subevent 521-3 is delivered to event recognizers 580 and 590. In this case, the state transitions for event recognizers 580 and 590 differ, because the event definitions for tap and scroll are different. In the case of scrolling event recognizer 580, the next subevent that would keep it in the event possible state would be to detect movement. Since the subevent delivered is detect finger liftoff 521-3, however, scrolling event recognizer 580 transitions to event impossible state 588. A tap event definition, though, concludes with a finger liftoff subevent. Accordingly, tap event recognizer 590 transitions to event recognized state 596 after detect finger liftoff subevent 521-3 is delivered.
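The divergence of the two recognizers on sequence 520 can be reproduced with a sketch that feeds both state machines the same subevents. The definitions below are simplified for illustration (the scroll definition here requires a single movement subevent); the state names mirror the figures:

```swift
// Two recognizers consuming the same subevent sequence in parallel and
// diverging on the final subevent, as in Fig. 5A.
enum Sub { case touchDown, delay, move, touchUp }
enum State { case begins, possible, recognized, impossible }

func step(definition: [Sub], state: inout State, matched: inout Int, _ s: Sub) {
    guard state == .begins || state == .possible else { return }
    if matched < definition.count, s == definition[matched] {
        matched += 1
        state = (matched == definition.count) ? .recognized : .possible
    } else {
        state = .impossible
    }
}

let tapDef:    [Sub] = [.touchDown, .delay, .touchUp]
let scrollDef: [Sub] = [.touchDown, .delay, .move, .touchUp]
var tapState = State.begins, scrollState = State.begins
var tapMatched = 0, scrollMatched = 0

for s in [Sub.touchDown, .delay, .touchUp] {   // sequence 520: a tap
    step(definition: tapDef,    state: &tapState,    matched: &tapMatched,    s)
    step(definition: scrollDef, state: &scrollState, matched: &scrollMatched, s)
}
print(tapState, scrollState)   // recognized impossible
```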
Note that in some embodiments, as discussed above with respect to Figs. 4B and 4C, the input source handling process 440 discussed in Fig. 4B may be used at the view level for various purposes. The following table presents, in summarized tabular format, the delivery of subevent sequence 520 as related to event recognizers 580 and 590, along with input source handling process 440:
[Table not reproduced in this text: summary of the delivery of subevent sequence 520 and the resulting states of event recognizers 580 and 590 and input source handling process 440.]
Turning to Fig. 5B, attention is directed to another example subevent sequence 530, which is being received by a view that includes a plurality of event recognizers. For this example, two event recognizers are depicted in Fig. 5B: scrolling event recognizer 580 and tap event recognizer 590. For purposes of illustration, view search results panel 304 in Fig. 3A will be related to the reception of subevent sequence 530 and the state transitions in scrolling event recognizer 580 and tap event recognizer 590. Note that in this example, subevent sequence 530 defines a scroll finger gesture on a touch-sensitive display, but the same event definition technique is applicable in a myriad of contexts (e.g., detecting a mouse button press, mouse movement, and mouse button release) and/or in embodiments utilizing programmatic hierarchies of programmatic levels.
Before the first subevent is delivered to the actively involved event recognizers for view search results panel 304, event recognizers 580 and 590 are in event recognition begins states 582 and 592, respectively. Following delivery of subevents corresponding to touch 301 (as discussed above), scrolling event recognizer 580 transitions to event possible state 584, and similarly, tap event recognizer 590 transitions to event possible state 594.
As measure delay subevent 531-2 is delivered to event recognizers 580 and 590, both remain in event possible states 584 and 594, respectively.
Next, detect finger movement subevent 531-3 is delivered to event recognizers 580 and 590. In this case, the state transitions for event recognizers 580 and 590 differ because the event definitions for tap and scroll are different. In the case of scrolling event recognizer 580, the next subevent that keeps it in the event possible state is to detect movement, so scrolling event recognizer 580 remains in event possible state 584 when it receives detect finger movement subevent 531-3. As discussed above, however, the definition of a tap concludes with a finger liftoff subevent, so tap event recognizer 590 transitions to event impossible state 598.
Finally, detect finger liftoff subevent 531-4 is delivered to event recognizers 580 and 590. Tap event recognizer 590 is already in event impossible state 598, so no state transition occurs. Scrolling event recognizer 580's event definition concludes with detecting a finger liftoff. Since the subevent delivered is detect finger liftoff 531-4, scrolling event recognizer 580 transitions to event recognized state 586. It is noted that a finger movement on a touch-sensitive surface may generate multiple movement subevents, and therefore a scroll may be recognized before liftoff and may continue to be recognized until liftoff.
The following table presents, in summarized tabular format, the delivery of subevent sequence 530 as related to event recognizers 580 and 590, along with input source handling process 440:
[Table not reproduced in this text.]
Turning to Fig. 5C, attention is directed to another example subevent sequence 540, which is being received by a view that includes a plurality of event recognizers. For this example, two event recognizers are depicted in Fig. 5C: double tap event recognizer 570 and tap event recognizer 590. For purposes of illustration, map view 305 in Fig. 3A will be related to the reception of subevent sequence 540 and the state transitions in double tap event recognizer 570 and tap event recognizer 590. Note that in this example, subevent sequence 540 defines a double tap gesture on a touch-sensitive display, but the same event recognition technique is applicable in a myriad of contexts (e.g., detecting a mouse double click) and/or in embodiments utilizing programmatic hierarchies of programmatic levels.
Before the first subevent is delivered to the actively involved event recognizers for map view 305, event recognizers 570 and 590 are in event recognition begins states 572 and 592, respectively. Following delivery of subevents related to touch 301 to map view 305 (as described above), double tap event recognizer 570 and tap event recognizer 590 transition to event possible states 574 and 594, respectively. This is because the event definitions of both a tap and a double tap begin with a touch, such as detecting finger down 541-1 on a touch-sensitive surface.
As measure delay subevent 541-2 is delivered to event recognizers 570 and 590, both remain in event possible states 574 and 594, respectively.
Next, detect finger liftoff subevent 541-3 is delivered to event recognizers 570 and 590. In this case, the state transitions for event recognizers 570 and 590 differ because the exemplary event definitions for tap and double tap are different. In the case of tap event recognizer 590, the final subevent in the event definition is to detect finger liftoff, so tap event recognizer 590 transitions to event recognized state 596.
Double tap event recognizer 570, however, remains in event possible state 574, because a delay has begun, regardless of what the user may eventually do. The complete event recognition definition for a double tap, though, requires another delay, followed by a complete tap subevent sequence thereafter. This creates an ambiguity between tap event recognizer 590, which is already in event recognized state 596, and double tap event recognizer 570, which is still in event possible state 574.
Accordingly, in some embodiments, event recognizers may implement exclusivity flags and exclusivity exception lists, as discussed above with respect to Figs. 3B and 3C. Here, the exclusivity flag 324 for tap event recognizer 590 would be set, and in addition, the exclusivity exception list 326 for tap event recognizer 590 would be configured to continue permitting delivery of subevents to certain event recognizers (e.g., double tap event recognizer 570) after tap event recognizer 590 enters event recognized state 596.
While tap event recognizer 590 remains in event recognized state 596, subevent sequence 540 continues to be delivered to double tap event recognizer 570, where measure delay subevent 541-4, detect finger down subevent 541-5, and measure delay subevent 541-6 keep double tap event recognizer 570 in event possible state 574; delivery of the final subevent of sequence 540, detect finger liftoff 541-7, transitions double tap event recognizer 570 to event recognized state 576.
At this point, map view 305 takes the event double tap as recognized by event recognizer 570, rather than the single tap event recognized by tap event recognizer 590. The decision to take the double tap event is made in light of the combination of tap event recognizer 590's exclusivity flag 324 being set, tap event recognizer 590's exclusivity exception list 326 including a double tap event, and the fact that both tap event recognizer 590 and double tap event recognizer 570 successfully recognized their respective event types.
The following table presents, in summarized tabular format, the delivery of subevent sequence 540 as related to event recognizers 570 and 590, along with subevent handling process 440:
[Table not reproduced in this text: summary of the delivery of subevent sequence 540 and the resulting states of event recognizers 570 and 590 and subevent handling process 440.]
In another embodiment, in the event scenario of Fig. 5C, the single tap gesture is not recognized, because the single tap event recognizer has a wait-for list that identifies the double tap event recognizer. As a result, the single tap gesture cannot be recognized until (if ever) the double tap event recognizer enters the event impossible state. In this example, in which a double tap gesture is recognized, the single tap event recognizer would remain in the event possible state until the double tap gesture is recognized, at which point the single tap event recognizer would transition to the event impossible state.
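This wait-for behavior can be sketched as a recognizer that defers its own recognition until every recognizer on its wait-for list has failed, and that fails itself if one of them recognizes. All names below are invented for illustration:

```swift
// A sketch of the wait-for list: a single tap defers to a double tap.
final class Recognizer {
    enum State { case possible, recognized, impossible }
    var state: State = .possible
    var waitsForFailureOf: [Recognizer] = []
    private var pendingRecognition = false

    // Called when this recognizer's own definition has been matched.
    func definitionMatched() {
        pendingRecognition = true
        resolve()
    }

    func resolve() {
        guard pendingRecognition else { return }
        if waitsForFailureOf.contains(where: { $0.state == .recognized }) {
            state = .impossible           // a waited-on recognizer won instead
        } else if waitsForFailureOf.allSatisfy({ $0.state == .impossible }) {
            state = .recognized           // everything we waited on has failed
        }                                 // otherwise stay in Event Possible
    }
}

let doubleTap = Recognizer()
let singleTap = Recognizer()
singleTap.waitsForFailureOf = [doubleTap]

singleTap.definitionMatched()             // single tap matched...
print(singleTap.state)                    // possible — still waiting
doubleTap.state = .impossible             // ...double tap later fails
singleTap.resolve()
print(singleTap.state)                    // recognized
```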
Attention is now directed to Figs. 6A and 6B, which are flow diagrams illustrating an event recognition method in accordance with some embodiments. Method 600 is performed at an electronic device, which in some embodiments may be electronic device 102, as discussed above. In some embodiments, the electronic device may include a touch-sensitive surface configured to detect multi-touch gestures. Alternatively, the electronic device may include a touch screen configured to detect multi-touch gestures.
Method 600 is configured to execute software that includes a view hierarchy with a plurality of views. Method 600 displays (608) one or more views of the view hierarchy, and executes (610) one or more software elements. Each software element is associated with a particular view, and each particular view includes one or more event recognizers, such as those described as event recognizer structures 320 and 360 in Figs. 3B and 3C, respectively.
Each event recognizer generally includes an event definition based on one or more subevents, where the event definition may be implemented as a state machine; see, e.g., state machine 340 of Fig. 3B. Event recognizers generally also include an event handler, where the event handler specifies an action for a target and is configured to send the action to the target in response to the event recognizer detecting an event corresponding to the event definition.
In some embodiments, as indicated by step 612 of Fig. 6A, at least one of the plurality of event recognizers is a gesture recognizer having a gesture definition and a gesture handler.

In some embodiments, as indicated by step 614 of Fig. 6A, the event definition defines a user gesture.
Alternatively, the event recognizers have a set of event recognition states 616. These event recognition states may include at least an event possible state, an event impossible state, and an event recognized state.
In some embodiments, if the event recognizer enters the event possible state, the event handler begins (618) preparation of its corresponding action for delivery to the target. As discussed above with respect to the examples of Fig. 4A and Figs. 5A-5C, the state machines implemented for each event recognizer generally include an initial state, e.g., event recognition begins state 405. Receiving a subevent that forms the initial part of an event definition triggers a state change to event possible state 410. Accordingly, in some embodiments, as an event recognizer transitions from event recognition begins state 405 to event possible state 410, the event recognizer's event handler may begin preparing its particular action to deliver to the event recognizer's target upon successful recognition of the event.
On the other hand, in some embodiments, if the event recognizer enters event impossible state 420, the event handler may terminate (620) preparation of its corresponding action. In some embodiments, terminating the corresponding action includes cancelling any preparation of the event handler's corresponding action.
The example of Fig. 5B is informative for this embodiment, because tap event recognizer 590 may have begun (618) preparation of its action, but then, once detect finger movement subevent 531-3 is delivered to tap event recognizer 590, recognizer 590 transitions to event impossible state 598. At that point, tap event recognizer 590 may terminate (620) preparation of the action for which it had begun preparation (618).
In some embodiments, if the event recognizer enters the event recognized state, the event handler completes (622) preparation of its corresponding action for delivery to the target. The example of Fig. 5C illustrates this embodiment, because a double tap is recognized by the actively involved event recognizers for map view 305, which in some embodiments would be the event bound to selecting and/or executing the search result depicted by map view 305. Here, after double tap event recognizer 570 successfully recognizes the double tap event composed of subevent sequence 540, map view 305's event handler completes (622) preparation of its action, namely, indicating that it has received an activation command.
In some embodiments, the event handler delivers (624) its corresponding action to the target associated with the event recognizer. Continuing the example of Fig. 5C, the action prepared, i.e., the activation command of map view 305, would be delivered to the particular target associated with map view 305, which may be any suitable programmatic method or object.
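Steps 618 through 624 can be summarized in a sketch of this handler lifecycle. The names are invented, and the target is modeled as a simple callback:

```swift
// Preparation begins on Event Possible (618), is cancelled on Event
// Impossible (620), and is completed and delivered on Event Recognized
// (622-624).
enum RecognizerState { case begins, possible, recognized, impossible }

final class EventHandler {
    let action: String
    let target: (String) -> Void
    private var prepared = false

    init(action: String, target: @escaping (String) -> Void) {
        self.action = action
        self.target = target
    }

    // Called as the associated recognizer changes state.
    func recognizerDidTransition(to state: RecognizerState) {
        switch state {
        case .possible:   prepared = true     // step 618: begin preparation
        case .impossible: prepared = false    // step 620: cancel preparation
        case .recognized:                     // steps 622-624: complete and
            if prepared { target(action) }    // deliver the action
        case .begins: break
        }
    }
}

let handler = EventHandler(action: "activate") { print("target receives \($0)") }
handler.recognizerDidTransition(to: .possible)
handler.recognizerDidTransition(to: .recognized)   // target receives activate
```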
Alternatively, the plurality of event recognizers may independently process (626) the sequence of one or more subevents in parallel.
In some embodiments, one or more event recognizers may be configured as exclusive event recognizers (628), as discussed above with respect to exclusivity flags 324 and 364 of Figs. 3B and 3C, respectively. When an event recognizer is configured as an exclusive event recognizer, the event delivery system prevents any other event recognizers for actively involved views in the view hierarchy (except those listed in the exception list 326, 366 of the event recognizer that recognizes the event) from receiving subsequent subevents (of the same sequence of subevents) after the exclusive event recognizer recognizes an event. Furthermore, when a non-exclusive event recognizer recognizes an event, the event delivery system prevents any exclusive event recognizers for actively involved views in the view hierarchy from receiving subsequent subevents, except for those (if any) listed in the exception list 326, 366 of the event recognizer that recognizes the event.
In some embodiments, exclusive event recognizers may include (630) an event exception list, as discussed above with respect to exclusivity exception lists 326 and 366 of Figs. 3B and 3C, respectively. As noted in the discussion of Fig. 5C above, an event recognizer's exclusivity exception list can be used to permit event recognizers to continue with event recognition even when the sequences of subevents making up their respective event definitions overlap. Accordingly, in some embodiments, the event exception list includes events whose corresponding event definitions have repetitive subevents (632), such as the single tap/double tap event example of Fig. 5C.
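A sketch of this exclusivity rule, with invented names, shows that once an exclusive recognizer recognizes its event, only that recognizer and the entries on its exception list continue to receive subsequent subevents:

```swift
// Which recognizers keep receiving subevents after an exclusive
// recognizer has recognized its event.
final class Rec {
    let name: String
    var isExclusive = false
    var exceptionList: [String] = []
    var recognized = false
    init(name: String) { self.name = name }
}

func receivers(of recognizers: [Rec]) -> [Rec] {
    if let winner = recognizers.first(where: { $0.recognized && $0.isExclusive }) {
        return recognizers.filter {
            $0 === winner || winner.exceptionList.contains($0.name)
        }
    }
    return recognizers   // no exclusive recognition yet: everyone receives
}

let singleTap = Rec(name: "singleTap")
singleTap.isExclusive = true
singleTap.exceptionList = ["doubleTap"]   // keep feeding the double tap
let doubleTap = Rec(name: "doubleTap")

singleTap.recognized = true
print(receivers(of: [singleTap, doubleTap]).map(\.name))
// ["singleTap", "doubleTap"]
```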
Alternatively, the event definition may define a user input operation (634).
In some embodiments, one or more event recognizers may be adapted to delay delivering every subevent of the sequence of subevents until after the event is recognized.
Method 600 detects (636) a sequence of one or more subevents, and in some embodiments, the sequence of one or more subevents may include primitive touch events (638). Primitive touch events may include, without limitation, basic components of a touch-based gesture on a touch-sensitive surface, e.g., data related to an initial finger or stylus touch down, data related to the initiation of multi-finger or stylus movement across the touch-sensitive surface, dual movements of two fingers in opposing directions, stylus liftoff from the touch-sensitive surface, and so forth.
Subevents in the sequence of one or more subevents can take many forms, including, without limitation, key presses, key press holds, key press releases, button presses, button press holds, button press releases, joystick movements, mouse movements, mouse button presses, mouse button releases, pen stylus touches, pen stylus movements, pen stylus releases, oral instructions, detected eye movements, biometric inputs, detected physiological changes in a user, and others.
Method 600 identifies (640) one of the views of the view hierarchy as a hit view. The hit view establishes which views in the view hierarchy are actively involved views. An example is depicted in Fig. 3A, where the actively involved views 303 include search results panel 304 and map view 305, because touch subevent 301 contacted the area associated with map view 305.
In some embodiments, a first actively involved view within the view hierarchy may be configured (642) to prevent delivery of the respective subevent to event recognizers associated with that first actively involved view. This behavior can implement the skip property discussed above with respect to Figs. 3B and 3C (330 and 370, respectively). When the skip property is set for an event recognizer, delivery of the respective subevent is still performed for event recognizers associated with other actively involved views in the view hierarchy.
Alternatively, a first actively involved view within the view hierarchy may be configured (644) to prevent delivery of the respective subevent to event recognizers associated with that first actively involved view unless the first actively involved view is the hit view. This behavior can implement the conditional skip property discussed above with respect to Figs. 3B and 3C (332 and 372, respectively).
In some embodiments, a second actively involved view within the view hierarchy is configured (646) to prevent delivery of the respective subevent to event recognizers associated with the second actively involved view and to event recognizers associated with ancestors of the second actively involved view. This behavior can implement the stop property discussed above with respect to Figs. 3B and 3C (328 and 368, respectively).
Method 600 delivers (648) a respective subevent to event recognizers for each actively involved view within the view hierarchy. In some embodiments, event recognizers for actively involved views in the view hierarchy process the respective subevent prior to processing a next subevent in the sequence of subevents. Alternatively, event recognizers for the actively involved views in the view hierarchy make their subevent recognition decisions while processing the respective subevent.
In some embodiments, event recognizers for actively involved views in the view hierarchy may process the sequence of one or more subevents concurrently (650); alternatively, event recognizers for actively involved views in the view hierarchy may process the sequence of one or more subevents in parallel.
In some embodiments, one or more event recognizers may be adapted to delay delivering (652) one or more subevents of the sequence of subevents until after the event recognizer recognizes the event. This behavior reflects a delayed event. For example, consider a single tap gesture in a view for which multiple tap gestures are also possible. In that case, a tap event becomes a "tap + delay" recognizer. In essence, when an event recognizer implements this behavior, the event recognizer will delay event recognition until it is certain that the sequence of subevents does in fact correspond to its event definition. This behavior may be appropriate when a recipient view is incapable of appropriately responding to cancelled events. In some embodiments, an event recognizer will delay updating its event recognition status to its respective actively involved view until the event recognizer is certain that the sequence of subevents does not correspond to its event definition. As discussed above with respect to Figs. 3B and 3C, delay touch began flags 328, 368, delay touch end flags 330, 370, and touch cancellation flags 332, 372 are provided so that subevent delivery techniques, as well as event recognizer and view status information updates, can be tailored to specific needs.
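This delayed-delivery behavior can be sketched as a dispatcher that queues subevents until the recognizer reaches a terminal state. The names are invented, and a full implementation would also honor the individual delay and cancellation flags described above:

```swift
// Subevents are held back and only released to the view once the
// recognizer has decided whether its definition was matched.
struct DelayingDispatcher {
    private var queue: [String] = []   // pending subevents, e.g. "touchDown"

    mutating func deliver(_ subevent: String) {
        queue.append(subevent)         // held back while the recognizer decides
    }

    // Called once the recognizer reaches Recognized or Impossible.
    mutating func decide(matched: Bool, view: (String) -> Void) {
        if matched { queue.forEach(view) }       // replay the whole sequence
        else       { view("touchCancelled") }    // or cancel, per the flags
        queue.removeAll()
    }
}

var dispatcher = DelayingDispatcher()
dispatcher.deliver("touchDown")
dispatcher.deliver("touchUp")
dispatcher.decide(matched: true) { print("view receives \($0)") }
// view receives touchDown, then touchUp
```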
Figs. 7A-7S illustrate exemplary user interfaces and user inputs recognized by event recognizers for navigating through concurrently open applications in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figs. 8A-8B, Figs. 9A-9C, and Figs. 10A-10B.
Although many of the following examples will be given with reference to inputs on touch screen display 156 (where the touch-sensitive surface and the display are combined), in some embodiments the device detects inputs on a touch-sensitive surface (e.g., a touch pad or trackpad) that is separate from the display. In some embodiments, the touch-sensitive surface has a primary axis that corresponds to a primary axis on the display. In accordance with these embodiments, the device detects contacts with the touch-sensitive surface at locations that correspond to respective locations on the display. In this way, when the touch-sensitive surface is separate from the display, user inputs detected by the device on the touch-sensitive surface are used by the device to manipulate the user interface on the display of the electronic device. It should be understood that similar methods may be used for other user interfaces described herein.
Fig. 7A illustrates an exemplary user interface ("home screen" 708) on electronic device 102 in accordance with some embodiments. Similar user interfaces may be implemented on electronic device 102. In some embodiments, home screen 708 is displayed by an application launcher software application, sometimes called a springboard. In some embodiments, the user interface on touch screen 156 includes the following elements, or a subset or superset thereof:
Signal strength indicator(s) 702 for wireless communication(s), such as cellular and Wi-Fi signals;
Time 704; and
Battery status indicator 706.
The exemplary user interface includes a plurality of application icons 5002 (e.g., 5002-25 through 5002-38). From home screen 708, a finger gesture can be used to launch an application. For example, a tap finger gesture 701 at a location corresponding to application icon 5002-36 initiates launching of an email application.
In Fig. 7B, in response to detecting finger gesture 701 on application icon 5002-36, the email application is launched and email application view 712-1 is displayed on touch screen 156. A user may launch other applications in a similar manner. For example, the user may press home button 710 from any application view 712 to return to home screen 708 (Fig. 7A), and launch other applications with finger gestures on respective application icons 5002 on home screen 708.
Figs. 7C-7G illustrate that respective applications are sequentially launched in response to detecting respective finger gestures at locations corresponding to respective application icons 5002 on home screen 708, and that respective user interfaces (i.e., respective application views) are displayed in turn. In particular, Fig. 7C illustrates that media store application view 712-2 is displayed in response to a finger gesture on application icon 5002-32. In Fig. 7D, notes application view 712-3 is displayed in response to a finger gesture on application icon 5002-30. Fig. 7E illustrates that map application view 712-4 is displayed in response to a finger gesture on application icon 5002-27. In Fig. 7F, weather application view 712-5 is displayed in response to a finger gesture on application icon 5002-28. Fig. 7G illustrates that web browser application view 712-6 is displayed in response to a finger gesture on application icon 5002-37. In some embodiments, the sequence of open applications corresponds to the launching of the email application, the media store application, the notes application, the map application, the weather application, and the web browser application.
Fig. 7G also illustrates a finger gesture 703 (e.g., a tap gesture) on a user interface object (e.g., a bookmarks icon). In some embodiments, in response to detecting finger gesture 703 on the bookmarks icon, the web browser application displays a list of bookmarks on touch screen 156. Similarly, a user may interact with the displayed application (e.g., the web browser application) with other gestures (e.g., a tap gesture on an address user interface object, which typically allows the user to type in a new address or modify the displayed address with an on-screen keyboard; a tap gesture on any link in the displayed web page, which initiates navigating to a web page corresponding to the selected link; and so on).
In Fig. 7G, a first predefined input is detected (e.g., double click 705 on home button 710). Alternatively, a multi-finger swipe gesture is detected on touch screen 156 (e.g., a three-finger swipe-up gesture, as illustrated by the movements of finger contacts 707, 709, and 711).
Fig. 7H illustrates that, in response to detecting the first predefined input (e.g., double click 705, or the multi-finger swipe gesture including finger contacts 707, 709, and 711), a portion of web browser application view 712-6 and application icon area 716 are concurrently displayed. In some embodiments, in response to detecting the first predefined input, the device enters an application view selection mode for selecting one of the concurrently open applications, and the portion of web browser application view 712-6 and application icon area 716 are concurrently displayed as part of the application view selection mode. Application icon area 716 includes a group of open application icons that correspond to at least some of the plurality of concurrently open applications. In this example, the portable electronic device has a plurality of applications that are concurrently open (e.g., the email application, the media store application, the notes application, the map application, the weather application, and the web browser application), although they are not all simultaneously displayed. As illustrated in Fig. 7H, application icon area 716 includes application icons (e.g., 5004-2, 5004-4, 5004-6, and 5004-8) for the weather application, the map application, the notes application, and the media store application (i.e., the four applications that immediately follow the currently displayed application, the web browser application, in the sequence of open applications). In some embodiments, the sequence or order of the open application icons displayed in application icon area 716 corresponds to the sequence of open applications in a predetermined sequence (e.g., weather, map, notes, and media store applications).
Fig. 7H also illustrates that gesture 713 (e.g., a tap gesture) is detected on open application icon 5004-8. In some embodiments, in response to detecting gesture 713, a corresponding application view is displayed (e.g., media store application view 712-2, Fig. 7C).
Fig. 7H illustrates that left-swipe gesture 715 is detected at a location corresponding to application icon area 716. In Fig. 7I, in response to detecting left-swipe gesture 715, the application icons (e.g., 5004-2, 5004-4, 5004-6, and 5004-8) in application icon area 716 are scrolled. As a result of the scrolling, application icon 5004-12 for the email application is displayed in application icon area 716 in place of the previously displayed application icons (e.g., 5004-2, 5004-4, 5004-6, and 5004-8).
In Fig. 7J, a gesture of a first type (e.g., a multi-finger left-swipe gesture including the movements of finger contacts 717, 719, and 721) is detected on web browser application view 712-6. Fig. 7K illustrates that, in response to detecting the gesture of the first type, weather application view 712-5 is displayed on touch screen 156. It should be noted that the weather application is next to the web browser application in the sequence of open applications.
Fig. 7K also illustrates that a second gesture of the first type (e.g., a multi-finger left-swipe gesture including the movements of finger contacts 723, 725, and 727) is detected on weather application view 712-5. Fig. 7L illustrates that, in response to detecting the second gesture of the first type, map application view 712-4 is displayed on touch screen 156. It should be noted that the map application is next to the weather application in the sequence of open applications.
Fig. 7L also illustrates that a third gesture of the first type (e.g., a multi-finger left-swipe gesture including the movements of finger contacts 729, 731, and 733) is detected on map application view 712-4. Fig. 7M illustrates that, in response to detecting the third gesture of the first type, notes application view 712-3 is displayed on touch screen 156. It should be noted that the notes application is next to the map application in the sequence of open applications.
Fig. 7M also illustrates that a fourth gesture of the first type (e.g., a multi-finger left-swipe gesture including the movements of finger contacts 735, 737, and 739) is detected on notes application view 712-3. Fig. 7N illustrates that, in response to detecting the fourth gesture of the first type, media store application view 712-2 is displayed on touch screen 156. It should be noted that the media store application is next to the notes application in the sequence of open applications.
Fig. 7N also illustrates that a fifth gesture of the first type (e.g., a multi-finger left-swipe gesture including the movements of finger contacts 741, 743, and 745) is detected on media store application view 712-2. Fig. 7O illustrates that, in response to detecting the fifth gesture of the first type, email application view 712-1 is displayed on touch screen 156. It should be noted that the email application is next to the media store application in the sequence of open applications.
Fig. 7O also illustrates that a sixth gesture of the first type (e.g., a multi-finger left-swipe gesture including the movements of finger contacts 747, 749, and 751) is detected on email application view 712-1. Fig. 7P depicts that, in response to detecting the sixth gesture of the first type, web browser application view 712-6 is displayed on touch screen 156. It should be noted that the web browser application is at one end of the sequence of open applications, and the email application is at the other end of the sequence of open applications.
Fig. 7P also illustrates that a gesture of a second type (e.g., a multi-finger right-swipe gesture including the movements of finger contacts 753, 755, and 757) is detected on web browser application view 712-6. Fig. 7Q illustrates that, in some embodiments, in response to detecting the gesture of the second type, email application view 712-1 is displayed on touch screen 156.
Referring to Fig. 7R, a multi-finger gesture (e.g., a five-finger pinch gesture including the movements of finger contacts 759, 761, 763, 765, and 767) is detected on web browser application view 712-6. Fig. 7S illustrates that, while the multi-finger gesture is detected on touch screen 156, web browser application view 712-6 and at least a portion of home screen 708 are concurrently displayed. As illustrated, web browser application view 712-6 is displayed at a reduced scale. The reduced scale is adjusted in accordance with the multi-finger gesture while the multi-finger gesture is detected on touch screen 156. For example, the reduced scale decreases with further pinching of finger contacts 759, 761, 763, 765, and 767 (i.e., web browser application view 712-6 is displayed at a smaller scale). Alternatively, the reduced scale increases with depinching of finger contacts 759, 761, 763, 765, and 767 (i.e., web browser application view 712-6 is displayed at a larger scale than before).
In some embodiments, when the multi-finger gesture ceases to be detected, web browser application view 712-6 ceases to be displayed, and the entire home screen 708 is displayed. Alternatively, when the multi-finger gesture ceases to be detected, it is determined whether the entire home screen 708 or web browser application view 712-6 at a full-screen scale is to be displayed. In some embodiments, the determination is made based on the reduced scale when the multi-finger gesture ceases to be detected (e.g., if the application view is displayed at a scale smaller than a predefined threshold when the multi-finger gesture ceases to be detected, the entire home screen 708 is displayed; if the application view is displayed at a scale larger than the predefined threshold when the multi-finger gesture ceases to be detected, the application view is displayed at a full-screen scale without displaying home screen 708). In some embodiments, the determination is also made based on the speed of the multi-finger gesture.
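This end-of-gesture determination reduces to a threshold test on the final reduced scale; a minimal sketch follows, where the 0.5 threshold value is invented for illustration, and some embodiments would also factor in the gesture speed:

```swift
// Decide which screen to show when the five-finger pinch ends.
func screenAfterPinch(finalScale: Double, threshold: Double = 0.5) -> String {
    // Below the threshold the application view has shrunk far enough that
    // the home screen is shown; above it, the view returns to full screen.
    finalScale < threshold ? "home screen 708" : "application view 712-6"
}

print(screenAfterPinch(finalScale: 0.3))   // home screen 708
print(screenAfterPinch(finalScale: 0.8))   // application view 712-6
```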
Figs. 8A and 8B are flow diagrams illustrating event recognition method 800 in accordance with some embodiments. Method 800 is performed (802) at an electronic device with a touch-sensitive display (e.g., device 102, Fig. 1B). The electronic device is configured to execute at least a first software application and a second software application. The first software application includes a first set of one or more gesture recognizers, and the second software application includes one or more views and a second set of one or more gesture recognizers (e.g., application 133-2 has gesture recognizer 516-4, and application 133-1 has gesture recognizers 516-1 through 516-3 and views 508, 510, and 512, Fig. 3F). Respective gesture recognizers have corresponding gesture handlers (e.g., gesture handler 552-1 corresponds to gesture recognizer 516-1, and gesture handler 552-3 corresponds to gesture recognizer 516-4). The first set of one or more gesture recognizers is typically different from the second set of one or more gesture recognizers.
Method 800 allows the user to control with gestures a hidden open application that is not currently displayed on the display of the electronic device (e.g., the first software application), such as a background application, a suspended application, or a hibernated application. Thus, the user can perform operations that are not provided by the application currently displayed on the display of the electronic device (e.g., the second software application) but are provided by one of the currently open applications, for example, using gestures for the hidden application launcher software application to display a home screen or to switch to a next software application.
In some embodiments, the first software application is (804) an application launcher (e.g., a springboard). For example, as shown in Fig. 7A, the application launcher displays a plurality of application icons 5002 that correspond to a plurality of applications. The application launcher receives a user selection of an application icon 5002 (e.g., based on a finger gesture on touch screen 156), and in response to receiving the user selection, launches the application corresponding to the selected application icon 5002.
The second software application is typically a software application launched by the application launcher. As illustrated in Figs. 7A and 7B, the application launcher receives information about tap gesture 701 on email application icon 5002-36 and launches the email application. In response, the email application displays email application view 712-1 on touch screen 156. The second software application may be any application corresponding to application icons 5002 (Fig. 7A), or any other application that may be launched by the application launcher (e.g., the media store application, Fig. 7C; the notes application, Fig. 7D; the map application, Fig. 7E; the weather application, Fig. 7F; the web browser application, Fig. 7G; etc.). In the following description of method 800, an application launcher is used as the exemplary first software application, and a web browser application is used as the exemplary second software application.
In certain embodiments, electronic equipment has only two software applications in the program layer aggregated(particle) structure: applied program ignitor and other software application software application of the one or more views on the touch-screen that is presented at electronic equipment 102 156 (normally corresponding to).
In certain embodiments, first software application (806) is the operating system application program.The employed operating system application program of this paper relates to the application program of operating system 118 integrated (Figure 1A-1C).The operating system application program resides in the core os layer 208 or operating system API software 206 among Fig. 2 usually.The operating system application program can not be removed by the user usually, yet other application programs can or remove by user installation usually.In certain embodiments, the operating system application program comprises applied program ignitor.In certain embodiments, the operating system application program comprises application program (for example, being used to show/application program of one or more values of the setting of modification system or equipment/overall internal state 134 Fig. 1 C) is set.In certain embodiments, the operating system application program comprises supplementary module 127.In certain embodiments, electronic equipment has only three software applications in the program layer aggregated(particle) structure: applied program ignitor, application program and other application program software application of the one or more views that on the touch-screen 156 of electronic equipment 102, show (normally corresponding to) are set.
The electronic device displays (808) at least a subset of the one or more views of the second software application (e.g., web browser application view 712-6, Figure 7G).
In some embodiments, the displaying includes (810) displaying at least the subset of the one or more views of the second software application without displaying any view of the first software application. For example, in Figure 7G, no view of the application launcher (e.g., home screen 708) is displayed.
In some embodiments, the displaying includes (812) displaying at least the subset of the one or more views of the second software application without displaying a view of any other application. For example, in Figure 7G, only one or more views of the web browser application are displayed.
While displaying at least the subset of the one or more views of the second software application, the electronic device detects (814) a sequence of touch inputs on the touch-sensitive display (e.g., gesture 703, which includes a touch-down event and a touch-up event; or another gesture, which includes touch-downs of finger contacts 707, 709, and 711, movements of finger contacts 707, 709, and 711 across touch screen 156, and lift-offs of finger contacts 707, 709, and 711). The sequence of touch inputs includes a first portion of one or more touch inputs and a second portion of one or more touch inputs subsequent to the first portion. As used herein, the term "sequence" refers to the order in which one or more touch events occur. For example, in the sequence of touch inputs that includes finger contacts 707, 709, and 711, the first portion may include the touch-downs of finger contacts 707, 709, and 711, and the second portion may include the movements of finger contacts 707, 709, and 711 and the lift-offs of finger contacts 707, 709, and 711.
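The two-portion structure of such a sequence can be modeled directly. The following Swift sketch uses hypothetical types (TouchEvent, TouchSequence, and their members are assumptions, not the modules of Figures 1-3) to show touch-downs forming a first portion and the subsequent moves and lift-offs forming a second portion:

```swift
// Hypothetical model of a touch input sequence split into a first portion
// (e.g., touch-downs) and a second, subsequent portion (moves and lift-offs).
enum TouchPhase { case touchDown, touchMoved, liftOff }

struct TouchEvent {
    var contactID: Int      // e.g., finger contacts 707, 709, 711
    var phase: TouchPhase
    var timestamp: Double
}

struct TouchSequence {
    var events: [TouchEvent]

    // The first portion here is the run of touch-downs before any movement.
    var firstPortion: [TouchEvent] {
        Array(events.prefix(while: { $0.phase == .touchDown }))
    }
    // Everything after the initial touch-downs is the second portion.
    var secondPortion: [TouchEvent] {
        Array(events.drop(while: { $0.phase == .touchDown }))
    }
}
```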
In some embodiments, the detecting occurs (816) while touch inputs in the first portion of the one or more touch inputs at least partially overlap at least one of the displayed views of the second software application. In some embodiments, the first software application receives the first portion of the one or more touch inputs even though the touch inputs at least partially overlap at least one of the displayed views of the second software application. For example, the application launcher receives the first portion of the touch inputs on the displayed view of the web browser (Figure 7G), even though the application launcher is not displayed.
During a first phase of detecting the sequence of touch inputs (818), the electronic device delivers (820) the first portion of the one or more touch inputs to the first software application and the second software application (e.g., using event dispatcher module 315, Figure 3D), identifies (822), from the gesture recognizers in the first set, one or more matching gesture recognizers that recognize the first portion of the one or more touch inputs (e.g., using event comparator 3033 in each gesture recognizer (typically, each receiving gesture recognizer) in the first set, Figure 3D), and processes (824) the first portion of the one or more touch inputs with one or more gesture handlers corresponding to the one or more matching gesture recognizers (e.g., activating corresponding event handlers 319, Figure 3D).
In some embodiments, the first phase of detecting the sequence of touch inputs is a phase of detecting the first portion of the one or more touch inputs.
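As a hedged illustration of operations 820-824, the first phase might be dispatched as in the sketch below, which continues the hypothetical TouchEvent type above; GestureRecognizer and dispatchFirstPhase are assumed names standing in for the components of Figure 3D, not the actual implementation:

```swift
// Hypothetical gesture recognizer: a matcher over a portion of the touch
// sequence plus a handler to run on a match.
struct GestureRecognizer {
    var name: String
    var matches: ([TouchEvent]) -> Bool   // plays the role of an event comparator
    var handler: ([TouchEvent]) -> Void   // plays the role of a gesture handler
}

// First phase: deliver the first portion to both applications' recognizer
// sets, identify the matching recognizers in the first set, and run the
// corresponding handlers. The matches are retained for the second phase.
func dispatchFirstPhase(firstPortion: [TouchEvent],
                        firstSet: [GestureRecognizer],
                        secondSet: [GestureRecognizer]) -> [GestureRecognizer] {
    let matching = firstSet.filter { $0.matches(firstPortion) }
    matching.forEach { $0.handler(firstPortion) }
    // The second software application also receives the first portion.
    secondSet.filter { $0.matches(firstPortion) }
             .forEach { $0.handler(firstPortion) }
    return matching
}
```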
With respect to the delivering operation (820), in some embodiments, the first software application, after receiving the first portion of the one or more touch inputs, delivers the first portion to at least a subset of the gesture recognizers in the first set, and the second software application, after receiving the first portion, delivers the first portion to at least a subset of the gesture recognizers in the second set. In some embodiments, the electronic device, or an event dispatcher module in the electronic device (e.g., 315, Figure 3D), delivers the first portion of the one or more touch inputs to at least a subset of the gesture recognizers in the first set and the second set (e.g., event dispatcher module 315 delivers the first portion of the one or more touch inputs to gesture recognizers 516-1, 516-2, and 516-4, Figure 3F).
For example, when the finger gesture including finger contacts 707, 709, and 711 (Figure 7G) is detected on touch screen 156, the touch-down events are delivered to one or more gesture recognizers of the application launcher and to one or more gesture recognizers of the web browser application. In another example, the touch-down event of tap gesture 703 (Figure 7G) is delivered to one or more gesture recognizers of the application launcher and to one or more gesture recognizers of the web browser application.
In some embodiments, when no gesture recognizer in the first set recognizes the first portion of the one or more touch inputs (e.g., because the detected events do not match any gesture definition, or because the gesture is not completed), processing the first portion of the one or more touch inputs includes performing a null operation (e.g., the device does not update the displayed user interface).
In some embodiments, the electronic device identifies, from the gesture recognizers in the second set, one or more matching gesture recognizers that recognize the first portion of the one or more touch inputs, and processes the first portion of the one or more touch inputs with one or more gesture handlers corresponding to those matching gesture recognizers. For example, in response to tap gesture 703 (Figure 7G) delivered to one or more gesture recognizers of the web browser application, a matching gesture recognizer in the web browser application (e.g., a gesture recognizer that recognizes a tap gesture on the bookmark icon, Figure 7G) processes tap gesture 703 by displaying a list of bookmarks on touch screen 156.
In some embodiments, after the first phase, during a second phase of detecting the sequence of touch inputs, the electronic device delivers (826; Figure 8B) the second portion of the one or more touch inputs to the first software application without delivering the second portion of the one or more touch inputs to the second software application (e.g., using event dispatcher module 315, Figure 3D), identifies, from the one or more matching gesture recognizers, a second matching gesture recognizer that recognizes the sequence of touch inputs (e.g., using event comparator 3033 in each matching gesture recognizer, Figure 3D), and processes the sequence of touch inputs with a gesture handler corresponding to the respective matching gesture recognizer. In some embodiments, the second phase of detecting the sequence of touch inputs is a phase of detecting the second portion of the one or more touch inputs.
For example, when the finger gesture including finger contacts 707, 709, and 711 (Figure 7G) is detected on touch screen 156, the touch-moved and lift-off events are delivered to one or more gesture recognizers of the application launcher without delivering those touch events to the web browser application. The electronic device identifies a matching gesture recognizer of the application launcher (e.g., a three-finger swipe gesture recognizer), and processes the sequence of touch inputs with a gesture handler corresponding to the three-finger swipe gesture recognizer.
During the second phase, the second software application does not receive the second portion of the one or more touch inputs, typically because the first software application has priority over the second software application (e.g., in the programmatic hierarchy). Thus, in some embodiments, when a gesture recognizer in the first software application recognizes the first portion of the one or more touch inputs, the one or more gesture recognizers in the first software application exclusively receive the second, subsequent portion of the one or more touch inputs. In addition, during the second phase, the second software application may not receive the second portion of the one or more touch inputs because no gesture recognizer in the second software application matched the first portion of the one or more touch inputs.
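Continuing the hypothetical types above, the second phase can be sketched as an exclusive delivery to the first application's matching recognizers (again, an illustrative assumption rather than the actual event dispatcher of Figure 3D):

```swift
// Second phase: the second portion is delivered only to the first software
// application's matching recognizers; the second software application
// receives nothing here, whether by priority or because none of its
// recognizers matched the first portion.
func dispatchSecondPhase(firstPortion: [TouchEvent],
                         secondPortion: [TouchEvent],
                         matchingFromFirstSet: [GestureRecognizer]) {
    let fullSequence = firstPortion + secondPortion
    // Identify a second matching recognizer that recognizes the whole
    // sequence, and process the sequence with its handler.
    if let winner = matchingFromFirstSet.first(where: { $0.matches(fullSequence) }) {
        winner.handler(fullSequence)   // e.g., a three-finger swipe handler
    }
}
```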
In some embodiments, processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer includes (834) displaying, in a first predefined area of the touch-sensitive display, a group of open application icons corresponding to at least some of a plurality of concurrently open applications, and concurrently displaying at least a subset of the one or more views of the second software application. For example, in Figure 7H, the open application icons 5004 in predefined area 716 correspond to concurrently open applications of the electronic device. In some embodiments, the open application icons 5004 in predefined area 716 are displayed in accordance with the sequence of the open applications. In Figure 7H, the electronic device concurrently displays predefined area 716 and a subset of web browser application view 712-6.
In some embodiments, processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer includes (828) displaying one or more views of the first software application. For example, in response to a multi-finger pinch gesture (Figure 7R), the electronic device displays home screen 708 (Figure 7A). In some embodiments, displaying the one or more views of the first software application includes displaying the one or more views of the first software application without concurrently displaying a view corresponding to any other software application (e.g., Figure 7A).
In some embodiments, processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer includes (830) replacing the display of the one or more views of the second software application with the display of one or more views of the first software application (e.g., displaying home screen 708, Figure 7A). Thus, after the one or more views of the first software application are displayed, display of the one or more views of the second software application ceases. In some embodiments, replacing the display of the one or more views of the second software application with the display of the one or more views of the first software application includes displaying the one or more views of the first software application without concurrently displaying a view corresponding to any other software application (Figure 7A).
In some embodiments, the electronic device concurrently executes (832) the first software application, the second software application, and a third software application. In some embodiments, processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer includes replacing the one or more displayed views of the second software application with one or more views of the third software application. For example, in response to a multi-finger swipe gesture, the electronic device replaces the display of web browser application view 712-6 with a display of weather application view 712-5 (Figures 7J-7K). In some embodiments, replacing the one or more displayed views of the second software application with the one or more views of the third software application includes displaying the one or more views of the third software application without concurrently displaying a view corresponding to any other software application. In some embodiments, the third software application is next to the second software application in the sequence of open applications.
In some embodiments, processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer includes launching a settings application. For example, in response to a ten-finger tap gesture, the electronic device launches the settings application.
Note that the details of the processes described above with respect to method 800 also apply in an analogous manner to method 900 described below. For brevity, these details are not repeated below.
Figures 9A-9C are flow diagrams illustrating event recognition method 900 in accordance with some embodiments. Method 900 is performed (902) at an electronic device with a touch-sensitive display. The electronic device is configured to execute at least a first software application and a second software application. The first software application includes a first set of one or more gesture recognizers, and the second software application includes one or more views and a second set of one or more gesture recognizers. Respective gesture recognizers have corresponding gesture handlers. In some embodiments, the first set of one or more gesture recognizers is different from the second set of one or more gesture recognizers.
Method 900 allows the user to control with a gesture a hidden open application (e.g., the first software application) that is not currently displayed on the display of the electronic device, such as a background application, a suspended application, or a hibernated application. Thus, the user can perform operations that are not provided by the application currently displayed on the display of the electronic device (e.g., the second software application) but are provided by one of the currently open applications (e.g., using gestures for a hidden application launcher software application to display the home screen or to switch to a next software application).
In some embodiments, the first software application is (904) an application launcher (e.g., a springboard). In some embodiments, the first software application is (906) an operating system application. In the following description of method 900, an application launcher is used as an exemplary first software application, and a web browser application is used as an exemplary second software application.
The electronic device displays (908) a first set of one or more views (e.g., web browser application view 712-6, Figure 7G). The first set of one or more views includes at least a subset of the one or more views of the second software application. For example, the second software application may have a plurality of application views (e.g., application views 317 of application 133-1, Figure 3D), and the electronic device displays at least one view of the plurality of application views. In some embodiments, the subset includes the entirety of the one or more views of the second software application.
In some embodiments, displaying the first set of one or more views includes (910) displaying the first set of one or more views without displaying any view of the first software application (e.g., web browser application view 712-6, Figure 7G).
In some embodiments, displaying the first set of one or more views includes (912) displaying the first set of one or more views without displaying a view of any other software application. For example, in Figure 7G, only one or more views of the web browser application are displayed.
While displaying the first set of one or more views, the electronic device detects (914) a sequence of touch inputs on the touch-sensitive display, and determines (920) whether at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the first portion of the one or more touch inputs. For example, while displaying web browser application view 712-6 (Figure 7G), the device determines whether any gesture recognizer for the application launcher recognizes the first portion of the touch inputs. The sequence of touch inputs includes a first portion of one or more touch inputs and a second portion of one or more touch inputs subsequent to the first portion (i.e., the second portion comes after the first portion).
In some embodiments, the sequence of touch inputs at least partially overlaps (916) at least one of the one or more displayed views of the second software application. For example, the application launcher receives the first portion of the touch inputs on web browser application view 712-6 (Figure 7G), even though the application launcher is not displayed.
In some embodiments, prior to the determination that at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the first portion of the one or more touch inputs, the electronic device concurrently delivers (918) the first portion of the one or more touch inputs to the first software application and the second software application. For example, both the application launcher and the web browser application receive the touch-down events of finger contacts 707, 709, and 711 (Figure 7G) before it is determined that at least one gesture recognizer in the application launcher recognizes the touch-down events.
In accordance with a determination (922; Figure 9B) that at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the first portion of the one or more touch inputs, the electronic device delivers (924) the sequence of touch inputs to the first software application without delivering the sequence of touch inputs to the second software application, determines (926) whether at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the sequence of touch inputs, and, in accordance with a determination that at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the sequence of touch inputs, processes (928) the sequence of touch inputs with the at least one gesture recognizer in the first set of one or more gesture recognizers that recognizes the sequence of touch inputs.
For example, when the touch-downs and touch movements of three finger contacts 707, 709, and 711 are detected on touch screen 156 (Figure 7G), the electronic device identifies that at least a three-finger swipe gesture recognizer of the application launcher recognizes the touch inputs. Thereafter, the electronic device delivers subsequent touch events (e.g., the lift-offs of finger contacts 707, 709, and 711) to the application launcher without delivering them to the web browser application. The electronic device further identifies that the three-finger swipe gesture recognizer recognizes the sequence of touch inputs, and processes the sequence of touch inputs with a gesture handler corresponding to the three-finger swipe gesture recognizer.
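A condensed sketch of this conditional routing (operations 922-944, including the no-match branch described further below), under the same hypothetical TouchEvent and GestureRecognizer types introduced above:

```swift
// Hypothetical routing for method 900: if any recognizer in the first set
// recognizes the first portion, the whole sequence is routed to the first
// software application; otherwise it is routed to the second one.
func routeSequence(sequence: [TouchEvent],
                   firstPortion: [TouchEvent],
                   firstSet: [GestureRecognizer],
                   secondSet: [GestureRecognizer]) {
    if firstSet.contains(where: { $0.matches(firstPortion) }) {
        // Operations 924-928: deliver only to the first software application.
        firstSet.first(where: { $0.matches(sequence) })?.handler(sequence)
    } else {
        // Operations 940-944: deliver only to the second software application.
        secondSet.first(where: { $0.matches(sequence) })?.handler(sequence)
    }
}
```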
In some embodiments, processing the sequence of touch inputs with the at least one gesture recognizer in the first set of one or more gesture recognizers includes (930) displaying one or more views of the first software application. For example, in response to detecting a multi-finger pinch gesture (Figure 7R), the electronic device displays home screen 708 (Figure 7A).
In some embodiments, processing the sequence of touch inputs with the at least one gesture recognizer in the first set of one or more gesture recognizers includes (932) replacing the display of the first set of one or more views with the display of one or more views of the first software application (e.g., displaying home screen 708, Figure 7A; home screen 708 is part of the application launcher software application).
In some embodiments, the electronic device concurrently executes the first software application, the second software application, and a third software application; and processing the sequence of touch inputs with the at least one gesture recognizer in the first set of one or more gesture recognizers includes (934) replacing the first set of one or more views with one or more views of the third software application. In some embodiments, replacing the first set of one or more views with the one or more views of the third software application includes displaying the one or more views of the third software application without concurrently displaying a view corresponding to any other software application. For example, in response to a multi-finger swipe gesture, the electronic device replaces the display of web browser application view 712-6 with a display of weather application view 712-5 (Figures 7J-7K).
In some embodiments, processing the sequence of touch inputs with the at least one gesture recognizer in the first set of one or more gesture recognizers includes (936) displaying, in a first predefined area of the touch-sensitive display, a group of open application icons corresponding to at least some of a plurality of concurrently open applications, and concurrently displaying at least a subset of the first set of one or more views. For example, in Figure 7H, the open application icons 5004 in predefined area 716 correspond to concurrently open applications of the electronic device. In some embodiments, the open application icons 5004 in predefined area 716 are displayed in accordance with the sequence of the open applications. In Figure 7H, the electronic device concurrently displays predefined area 716 and a subset of web browser application view 712-6.
In accordance with a determination (938; Figure 9C) that no gesture recognizer in the first set of one or more gesture recognizers recognizes the first portion of the one or more touch inputs, the electronic device delivers (940) the sequence of touch inputs to the second software application, determines (942) whether at least one gesture recognizer in the second set of one or more gesture recognizers recognizes the sequence of touch inputs, and, in accordance with a determination that at least one gesture recognizer in the second set of one or more gesture recognizers recognizes the sequence of touch inputs, processes (944) the sequence of touch inputs with the at least one gesture recognizer in the second set of one or more gesture recognizers that recognizes the sequence of touch inputs.
For example, when the first portion of the one or more touch inputs is a tap gesture (e.g., 703, Figure 7G) and no gesture recognizer in the application launcher recognizes the tap gesture, the electronic device delivers the tap gesture to the web browser application and determines whether at least one gesture recognizer of the web browser application recognizes the tap gesture. When the web browser application (or a gesture recognizer of the web browser application) recognizes tap gesture 703 on the bookmark icon, the electronic device processes tap gesture 703 with a corresponding gesture handler.
Figures 10A-10B are flow diagrams illustrating an event recognition method in accordance with some embodiments. Note that the details of the processes described above with respect to methods 600, 800, and 900 also apply in an analogous manner to method 1000 described below. For brevity, these details are not repeated below.
Method 1000 is performed (1002) at an electronic device with an internal state (e.g., device/global internal state 134, Figure 1C). The electronic device is configured to execute software that includes a view hierarchy with a plurality of views.
In method 1000, at least one gesture recognizer has a plurality of gesture definitions. This helps the gesture recognizer work in distinct operating modes. For example, the device may have a normal operation mode and an accessibility operation mode. In the normal operation mode, a next-application gesture is used to move between applications, and the next-application gesture is defined as a three-finger left-swipe gesture. In the accessibility operation mode, the three-finger left-swipe gesture is used to perform a different function. Thus, a gesture distinct from the three-finger left swipe is needed in the accessibility operation mode to correspond to the next-application gesture (e.g., a four-finger left-swipe gesture in the accessibility operation mode). By having a plurality of gesture definitions associated with the next-application gesture, the device can select one of the gesture definitions for the next-application gesture based on the current operating mode. This provides flexibility in using the gesture recognizer in distinct operating modes. In some embodiments, a plurality of gesture recognizers with a plurality of gesture definitions is adjusted based on the operating mode (e.g., gestures performed with three fingers in the normal operation mode are performed with four fingers in the accessibility operation mode).
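This mode-dependent selection can be illustrated with a small Swift sketch. The names below (OperatingMode, NextApplicationRecognizer) and the finger counts follow the example above but are hypothetical, not the patent's implementation:

```swift
// Hypothetical selection of a gesture definition based on the operating
// mode recorded in the device's internal state.
enum OperatingMode { case normal, accessibility }

struct GestureDefinition {
    var fingerCount: Int
    var kind: String          // e.g., "left-swipe"
}

struct NextApplicationRecognizer {
    // Two definitions for the same logical "next application" gesture.
    let normalDefinition = GestureDefinition(fingerCount: 3, kind: "left-swipe")
    let accessibilityDefinition = GestureDefinition(fingerCount: 4, kind: "left-swipe")

    func activeDefinition(for mode: OperatingMode) -> GestureDefinition {
        // In accessibility mode the three-finger left swipe is reserved for
        // a different function, so the four-finger variant is selected.
        mode == .accessibility ? accessibilityDefinition : normalDefinition
    }
}
```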
In some embodiments, the internal state includes (1016) one or more settings for the accessibility operation mode (e.g., the internal state indicates whether the device is operating in the accessibility operation mode).
In some embodiments, the software is (1018), or includes, an application launcher (e.g., a springboard).
In some embodiments, the software is (1020), or includes, an operating system application (e.g., an application integrated with the operating system of the device).
The electronic device displays (1004) one or more views of the view hierarchy.
The electronic device executes (1006) one or more software elements. Each software element is associated with a particular view (e.g., application 133-1 has one or more application views 317, Figure 3D), and each particular view includes one or more event recognizers (e.g., event recognizers 325, Figure 3D). Each event recognizer has one or more event definitions based on one or more sub-events, and an event handler (e.g., gesture definitions 3035 and a reference to a corresponding event handler in event delivery information 3039, Figure 3D). The event handler specifies an action for a target, and is configured to send the action to the target in response to the event recognizer detecting an event corresponding to a particular event definition of the one or more event definitions (e.g., an event definition selected from the one or more event definitions when the event recognizer has a plurality of event definitions, or the sole event definition when the event recognizer has only one event definition).
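A minimal sketch of this recognizer/handler relationship follows; SubEvent, EventDefinition, EventRecognizer, and Target are assumed names standing in for event recognizers 325, gesture definitions 3035, and the target/action references of Figure 3D:

```swift
// Hypothetical event recognizer with one or more event definitions and an
// event handler that sends an action to a target on a match.
struct SubEvent { var description: String }

struct EventDefinition {
    var name: String
    var matches: ([SubEvent]) -> Bool
}

final class Target {
    func perform(_ action: String) { print("target performs \(action)") }
}

struct EventRecognizer {
    var definitions: [EventDefinition]   // one or many (cf. gesture definitions 3035)
    var action: String
    var target: Target

    // Send the action when the detected sub-events correspond to the
    // selected (or sole) event definition.
    func process(_ subEvents: [SubEvent], selected: Int = 0) {
        if definitions.indices.contains(selected),
           definitions[selected].matches(subEvents) {
            target.perform(action)
        }
    }
}
```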
The electronic device detects (1008) a sequence of one or more sub-events.
The electronic device identifies (1010) one of the views of the view hierarchy as a hit view. The hit view establishes which views in the view hierarchy are actively involved views.
The electronic device delivers (1012) a respective sub-event to event recognizers for each actively involved view of the view hierarchy. In some embodiments, the one or more actively involved views in the view hierarchy include the hit view. In some embodiments, the one or more actively involved views in the view hierarchy include a default view (e.g., home screen 708 of the application launcher).
At least one event recognizer for the actively involved views of the view hierarchy has (1014) a plurality of event definitions, and one of the plurality of event definitions is selected in accordance with the internal state of the electronic device. For example, event recognizer 325-1 has a plurality of gesture definitions (e.g., 3037-1 and 3037-2, Figure 3D). In some embodiments, event recognizer 325-1 selects one of the plurality of gesture definitions in event recognizer 325-1 based on one or more values in device/global internal state 134 (Figure 1C). The at least one event recognizer then processes the respective sub-event, in accordance with the selected event definition, prior to processing the next sub-event in the sequence of sub-events. In some embodiments, each of two or more event recognizers for the actively involved views of the view hierarchy has a plurality of event definitions, and one of the plurality of event definitions is selected in accordance with the internal state of the electronic device. In such embodiments, at least one of the two or more event recognizers processes the respective sub-event, in accordance with the selected event definition, prior to processing the next sub-event in the sequence of sub-events.
For example, Figures 7J-7K illustrate a next-application gesture initiating display of an application view of a next application. In some embodiments, the application launcher includes a next-application gesture recognizer, which includes a gesture definition matching a three-finger left-swipe gesture. For purposes of this example, assume that the next-application gesture recognizer also includes a gesture definition matching a four-finger left-swipe gesture. When one or more values in device/global internal state 134 are set to their default values, the next-application gesture recognizer uses the three-finger left-swipe gesture definition and does not use the four-finger left-swipe gesture definition. When the one or more values in device/global internal state 134 are modified (e.g., by using accessibility module 127, Figure 1C), the next-application gesture recognizer uses the four-finger left-swipe gesture definition and does not use the three-finger left-swipe gesture definition. Thus, in this example, when the one or more values in device/global internal state 134 are modified, a four-finger left-swipe gesture initiates display of the application view of the next application.
Similarly, Figures 7R-7S illustrate a home screen gesture initiating display of web browser application view 712-6 at a reduced scale and display of at least a portion of home screen 708 in response to detecting a five-finger pinch gesture. Depending on device/global internal state 134 and the gesture definitions in the home screen gesture recognizer, a four-finger pinch gesture, a three-finger pinch gesture, or any other suitable gesture may be used to initiate displaying web browser application view 712-6 at a reduced scale and displaying at least a portion of home screen 708.
In some embodiments, the plurality of event definitions includes (1020) a first event definition corresponding to a first swipe gesture with a first number of fingers and a second event definition corresponding to a second swipe gesture with a second number of fingers distinct from the first number of fingers. For example, the plurality of event definitions of a respective gesture recognizer may include a three-finger swipe gesture and a four-finger swipe gesture.
In some embodiments, the plurality of event definitions includes a first event definition corresponding to a first gesture of a first kind with a first number of fingers and a second event definition corresponding to a second gesture of the first kind with a second number of fingers distinct from the first number of fingers (e.g., a one-finger tap gesture and a two-finger tap gesture, a two-finger pinch gesture and a three-finger pinch gesture, etc.).
In some embodiments, the plurality of event definitions includes a first event definition corresponding to a first gesture and a second event definition corresponding to a second gesture distinct from the first gesture (e.g., a swipe gesture and a pinch gesture, a swipe gesture and a tap gesture, etc.).
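These variants can be captured in a small value type. The following sketch is purely illustrative; the constants at the end are assumptions chosen to mirror the examples above:

```swift
// Hypothetical gesture variants: the same kind of gesture with different
// finger counts (e.g., a three- vs. a four-finger swipe), or entirely
// different kinds (e.g., swipe vs. pinch).
enum GestureKind { case swipe, tap, pinch, depinch }

struct EventVariant: Equatable {
    var kind: GestureKind
    var fingerCount: Int
}

let normalNextApp = EventVariant(kind: .swipe, fingerCount: 3)
let accessibleNextApp = EventVariant(kind: .swipe, fingerCount: 4)  // same kind, more fingers
let homeGesture = EventVariant(kind: .pinch, fingerCount: 5)        // a different kind
```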
In some embodiments, a respective definition of the plurality of event definitions is selected (1022) for a respective event recognizer in accordance with the internal state of the electronic device and a determination (made by the electronic device) that the respective event definition does not correspond to an event definition of any event recognizer for the actively involved views other than the respective event recognizer.
For example, a respective gesture recognizer may have two event definitions: a first event definition corresponding to a three-finger left-swipe gesture typically used in the normal operation mode, and a second event definition corresponding to a four-finger left-swipe gesture typically used in the accessibility operation mode. When the internal state of the electronic device is set in such a way that the electronic device operates in the accessibility mode, the electronic device determines whether the four-finger left-swipe gesture for the second event definition is used by any other event recognizer for the actively involved views. If the four-finger left-swipe gesture is not used by any other event recognizer for the actively involved views, the four-finger left-swipe gesture is selected for the respective gesture recognizer in the accessibility operation mode. On the other hand, if the four-finger left-swipe gesture is used by another event recognizer for the actively involved views, the three-finger left-swipe gesture is used for the respective gesture recognizer even in the accessibility operation mode. This prevents two or more gesture recognizers from undesirably responding to the same gesture.
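The conflict check just described might be sketched as follows, building on the hypothetical EventVariant type above (the function and its arguments are assumptions for illustration):

```swift
// Hypothetical conflict-avoiding selection: prefer the accessibility-mode
// variant unless another event recognizer for the actively involved views
// already uses it, in which case fall back to the normal-mode variant.
func selectVariant(preferred: EventVariant,
                   fallback: EventVariant,
                   takenByOtherRecognizers: [EventVariant]) -> EventVariant {
    takenByOtherRecognizers.contains(preferred) ? fallback : preferred
}

// Example: in accessibility mode, the four-finger left swipe is selected
// only if no other recognizer has claimed it; otherwise the three-finger
// definition remains in effect.
let chosen = selectVariant(preferred: accessibleNextApp,
                           fallback: normalNextApp,
                           takenByOtherRecognizers: [])
// chosen == accessibleNextApp
```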
In some embodiments, a respective event definition of the plurality of event definitions is selected for a respective event recognizer in accordance with the internal state of the electronic device and a determination (made by the electronic device) that the respective event definition does not correspond to an event definition of any event recognizer other than the respective event recognizer (including event recognizers for the actively involved views and for any other views).
In some embodiments, each of two or more event recognizers for the actively involved views of the view hierarchy has (1024) a respective plurality of event definitions, and a respective event definition of the respective plurality of event definitions is selected for a respective event recognizer with two or more event definitions in accordance with the internal state of the electronic device and a determination (made by the electronic device) that the respective event definition does not correspond to any event definition selected for any event recognizer, other than the respective event recognizer, with two or more event definitions.
For example, the actively involved views may have a first gesture recognizer and a second gesture recognizer. In this example, the first gesture recognizer has a first event definition corresponding to a three-finger left-swipe gesture typically used in the normal operation mode, and a second event definition corresponding to a four-finger left-swipe gesture typically used in the accessibility operation mode. The second gesture recognizer has a third event definition corresponding to a two-finger left-swipe gesture typically used in the normal operation mode, and a fourth event definition corresponding to a four-finger left-swipe gesture typically used in the accessibility operation mode. When the internal state of the electronic device is set in such a way that the electronic device operates in the accessibility mode, the electronic device determines whether the four-finger left-swipe gesture satisfying the second event definition has been selected for any other event recognizer with two or more event definitions (e.g., the second gesture recognizer). If the four-finger left-swipe gesture has not been selected for any other event recognizer with two or more event definitions, the four-finger left-swipe gesture is selected for the first gesture recognizer in the accessibility operation mode. As a result, the four-finger left-swipe gesture is not selected for the second gesture recognizer, because it has already been selected for the first gesture recognizer. Instead, the two-finger left-swipe gesture is selected for the second gesture recognizer, because the two-finger left-swipe gesture has not been selected for any other gesture recognizer with two or more event definitions, including the first gesture recognizer. In another example, the actively involved views have the first gesture recognizer and a third gesture recognizer, without the second gesture recognizer. The third gesture recognizer has the third event definition, typically used in the normal operation mode (corresponding to the two-finger left-swipe gesture), and a fifth event definition corresponding to a three-finger left-swipe gesture typically used in the accessibility operation mode. In the accessibility operation mode, the three-finger left-swipe gesture can be selected for the third gesture recognizer, because the three-finger left-swipe gesture is not selected for any other gesture recognizer with two or more event definitions.
Although the examples above are described with respect to multi-finger left-swipe gestures, the methods described above apply to swipe gestures in any direction (e.g., a right-swipe gesture, an up-swipe gesture, a down-swipe gesture, and/or any diagonal swipe gesture) or to gestures of any other kind (e.g., a tap gesture, a pinch gesture, a depinch gesture, etc.).
In some embodiments, processing the respective sub-event in accordance with the selected event definition includes (1026) displaying one or more views of a first software application distinct from the software that includes the view hierarchy (e.g., concurrently displaying at least a portion of user interface 712-6, which includes one or more views of the software, and a portion of home screen 708, Figure 7S).
In some embodiments, the at least one event recognizer processes (1028) the respective sub-event by replacing the display of the one or more views of the view hierarchy with the display of one or more views of a first software application distinct from the software that includes the view hierarchy (e.g., home screen 708, Figure 7A).
In some embodiments, the at least one event recognizer processes (1030) the respective sub-event by: displaying, in a first predefined area of the display of the electronic device, a group of open application icons corresponding to at least some of a plurality of concurrently open applications; and concurrently displaying at least a subset of the one or more views of the view hierarchy (e.g., open application icons 5004 and at least a portion of user interface 712-6, Figure 7H). For example, in response to a three-finger swipe-up gesture in the normal operation mode, or a four-finger swipe-up gesture in the accessibility operation mode, the electronic device concurrently displays the group of open application icons and at least the subset of the one or more views of the view hierarchy.
In accordance with some embodiments, Figure 11 shows a functional block diagram of an electronic device 1100 configured in accordance with the principles of the invention as described above. The functional blocks of the device may be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the invention. It is understood by persons of skill in the art that the functional blocks described in Figure 11 may be combined or separated into sub-blocks to implement the principles of the invention as described above. Therefore, the description herein may support any possible combination or separation or further definition of the functional blocks described herein.
As shown in Figure 11, electronic device 1100 includes a touch-sensitive display unit 1102 configured to receive touch inputs, and a processing unit 1106 coupled to touch-sensitive display unit 1102. In some embodiments, processing unit 1106 includes an executing unit 1108, a display enabling unit 1110, a detecting unit 1112, a delivering unit 1114, an identifying unit 1116, and a touch input processing unit 1118.
Processing unit 1106 is configured to execute at least a first software application and a second software application (e.g., with executing unit 1108). The first software application includes a first set of one or more gesture recognizers, and the second software application includes one or more views and a second set of one or more gesture recognizers. Respective gesture recognizers have corresponding gesture handlers. Processing unit 1106 is configured to enable display of at least a subset of the one or more views of the second software application (e.g., with display enabling unit 1110, on touch-sensitive display unit 1102). Processing unit 1106 is configured to, while displaying at least the subset of the one or more views of the second software application, detect a sequence of touch inputs on touch-sensitive display unit 1102 (e.g., with detecting unit 1112). The sequence of touch inputs includes a first portion of one or more touch inputs and a second portion of one or more touch inputs subsequent to the first portion. Processing unit 1106 is configured to, during a first phase of detecting the sequence of touch inputs: deliver the first portion of the one or more touch inputs to the first software application and the second software application (e.g., with delivering unit 1114); identify, from the gesture recognizers in the first set, one or more matching gesture recognizers that recognize the first portion of the one or more touch inputs (e.g., with identifying unit 1116); and process the first portion of the one or more touch inputs with one or more gesture handlers corresponding to the one or more matching gesture recognizers (e.g., with touch input processing unit 1118).
In some embodiments, processing unit 1106 is configured to detect the sequence of touch inputs while touch inputs in the first portion of the one or more touch inputs at least partially overlap at least one of the displayed views of the second software application (e.g., with detecting unit 1112).
In some embodiments, processing unit 1106 is configured to enable display of at least the subset of the one or more views of the second software application without displaying any view of the first software application (e.g., with display enabling unit 1110, on touch-sensitive display unit 1102).
In some embodiments, processing unit 1106 is configured to enable display of at least the subset of the one or more views of the second software application without displaying a view of any other application (e.g., with display enabling unit 1110, on touch-sensitive display unit 1102).
In some embodiments, processing unit 1106 is configured to, after the first phase, during a second phase of detecting the sequence of touch inputs: deliver the second portion of the one or more touch inputs to the first software application without delivering the second portion of the one or more touch inputs to the second software application (e.g., with delivering unit 1114); identify, from the one or more matching gesture recognizers, a second matching gesture recognizer that recognizes the sequence of touch inputs (e.g., with identifying unit 1116); and process the sequence of touch inputs with a gesture handler corresponding to the respective matching gesture recognizer (e.g., with touch input processing unit 1118).
In some embodiments, processing unit 1106 is configured to process the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer by enabling display of one or more views of the first software application (e.g., with display enabling unit 1110, on touch-sensitive display unit 1102).
In some embodiments, processing unit 1106 is configured to process the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer by replacing the display of the one or more views of the second software application with the display of one or more views of the first software application (e.g., with display enabling unit 1110, on touch-sensitive display unit 1102).
In some embodiments, processing unit 1106 is configured to: concurrently execute the first software application, the second software application, and a third software application (e.g., with executing unit 1108); and process the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer by replacing the one or more displayed views of the second software application with one or more views of the third software application (e.g., with display enabling unit 1110, on touch-sensitive display unit 1102).
In some embodiments, processing unit 1106 is configured to: enable display, in a first predefined area of touch-sensitive display unit 1102, of a group of open application icons corresponding to at least some of a plurality of concurrently open applications (e.g., with display enabling unit 1110); and enable concurrent display of at least a subset of the one or more views of the second software application (e.g., with display enabling unit 1110).
In some embodiments, the first software application is an application launcher.
In some embodiments, the first software application is an operating system application.
In accordance with some embodiments, Figure 12 shows a functional block diagram of an electronic device 1200 configured in accordance with the principles of the invention as described above. The functional blocks of the device may be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the invention. It is understood by persons of skill in the art that the functional blocks described in Figure 12 may be combined or separated into sub-blocks to implement the principles of the invention as described above. Therefore, the description herein may support any possible combination or separation or further definition of the functional blocks described herein.
As shown in Figure 12, electronic device 1200 includes a touch-sensitive display unit 1202 configured to receive touch inputs, and a processing unit 1206 coupled to touch-sensitive display unit 1202. In some embodiments, processing unit 1206 includes an executing unit 1208, a display enabling unit 1210, a detecting unit 1212, a determining unit 1214, a delivering unit 1216, and a touch input processing unit 1218.
Processing unit 1206 is configured to execute at least a first software application and a second software application (e.g., with executing unit 1208). The first software application includes a first set of one or more gesture recognizers, and the second software application includes one or more views and a second set of one or more gesture recognizers. Respective gesture recognizers have corresponding gesture handlers. Processing unit 1206 is configured to enable display of a first set of one or more views (e.g., with display enabling unit 1210). The first set of one or more views includes at least a subset of the one or more views of the second software application. Processing unit 1206 is configured to, while displaying the first set of one or more views, detect a sequence of touch inputs on the touch-sensitive display unit (e.g., with detecting unit 1212). The sequence of touch inputs includes a first portion of one or more touch inputs and a second portion of one or more touch inputs subsequent to the first portion. Processing unit 1206 is configured to determine whether at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the first portion of the one or more touch inputs (e.g., with determining unit 1214). Processing unit 1206 is configured to, in accordance with a determination that at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the first portion of the one or more touch inputs: deliver the sequence of touch inputs to the first software application without delivering the sequence of touch inputs to the second software application (e.g., with delivering unit 1216), and determine whether at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the sequence of touch inputs (e.g., with determining unit 1214). Processing unit 1206 is configured to, in accordance with a determination that at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the sequence of touch inputs, process the sequence of touch inputs with the at least one gesture recognizer in the first set of one or more gesture recognizers that recognizes the sequence of touch inputs (e.g., with touch input processing unit 1218). Processing unit 1206 is configured to, in accordance with a determination that no gesture recognizer in the first set of one or more gesture recognizers recognizes the first portion of the one or more touch inputs: deliver the sequence of touch inputs to the second software application (e.g., with delivering unit 1216), and determine whether at least one gesture recognizer in the second set of one or more gesture recognizers recognizes the sequence of touch inputs (e.g., with determining unit 1214). Processing unit 1206 is configured to, in accordance with a determination that at least one gesture recognizer in the second set of one or more gesture recognizers recognizes the sequence of touch inputs, process the sequence of touch inputs with the at least one gesture recognizer in the second set of one or more gesture recognizers that recognizes the sequence of touch inputs (e.g., with touch input processing unit 1218).
In some embodiments, the sequence of touch inputs at least partially overlaps at least one of the one or more displayed views of the second software application.
In some embodiments, processing unit 1206 is configured to enable display of the first set of one or more views without displaying any view of the first software application (e.g., with display enabling unit 1210, on touch-sensitive display unit 1202).
In some embodiments, processing unit 1206 is configured to enable display of the first set of one or more views without displaying a view of any other software application (e.g., with display enabling unit 1210, on touch-sensitive display unit 1202).
In some embodiments, prior to the determination that at least one gesture recognizer in the first set of one or more gesture recognizers recognizes the first portion of the one or more touch inputs, processing unit 1206 is configured to concurrently deliver the first portion of the one or more touch inputs to the first software application and the second software application (e.g., with delivering unit 1216).
In some embodiments, the first software application is an application launcher.
In some embodiments, the first software application is an operating system application.
In some embodiments, processing unit 1206 is configured to process the sequence of touch inputs with the at least one gesture recognizer in the first set of one or more gesture recognizers by enabling display of one or more views of the first software application (e.g., with display enabling unit 1210, on touch-sensitive display unit 1202).
In some embodiments, processing unit 1206 is configured to process the sequence of touch inputs with the at least one gesture recognizer in the first set of one or more gesture recognizers by replacing the display of the first set of one or more views with the display of one or more views of the first software application (e.g., with display enabling unit 1210, on touch-sensitive display unit 1202).
In some embodiments, processing unit 1206 is configured to concurrently execute the first software application, the second software application, and a third software application (e.g., with executing unit 1208). Processing unit 1206 is configured to process the sequence of touch inputs with the at least one gesture recognizer in the first set of one or more gesture recognizers by replacing the first set of one or more views with one or more views of the third software application (e.g., with display enabling unit 1210, on touch-sensitive display unit 1202).
In some embodiments, processing unit 1206 is configured to: enable display, in a first predefined area of touch-sensitive display unit 1202, of a group of open application icons corresponding to at least some of a plurality of concurrently open applications (e.g., with display enabling unit 1210); and enable concurrent display of at least a subset of the first set of one or more views (e.g., with display enabling unit 1210).
In accordance with some embodiments, Figure 13 shows a functional block diagram of an electronic device 1300 configured in accordance with the principles of the invention as described above. The functional blocks of the device may be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the invention. It is understood by persons of skill in the art that the functional blocks described in Figure 13 may be combined or separated into sub-blocks to implement the principles of the invention as described above. Therefore, the description herein may support any possible combination or separation or further definition of the functional blocks described herein.
As shown in Figure 13, an electronic device 1300 includes a display unit 1302 configured to display one or more views, a memory unit 1304 configured to store an internal state, and a processing unit 1306 coupled to the display unit 1302 and the memory unit 1304. In some embodiments, the processing unit 1306 includes an executing unit 1308, a display enabling unit 1310, a detecting unit 1312, an identifying unit 1314, a delivering unit 1316, and an event/sub-event processing unit 1318. In some embodiments, the processing unit 1306 includes the memory unit 1304.
The processing unit 1306 is configured to: execute software that includes a view hierarchy with a plurality of views (e.g., with the executing unit 1308); enable display of one or more views of the view hierarchy (e.g., with the display enabling unit 1310, on the display unit 1302); and execute one or more software elements (e.g., with the executing unit 1308). Each software element is associated with a particular view, and each particular view includes one or more event recognizers. Each event recognizer has one or more event definitions based on one or more sub-events, and an event handler; the event handler specifies an action for a target, and is configured to send the action to the target in response to the event recognizer detecting an event corresponding to a particular event definition of the one or more event definitions. The processing unit 1306 is configured to: detect a sequence of one or more sub-events (e.g., with the detecting unit 1312); and identify one of the views of the view hierarchy as a hit view (e.g., with the identifying unit 1314). The hit view establishes which views in the view hierarchy are actively involved views. The processing unit 1306 is configured to deliver a respective sub-event to event recognizers for each actively involved view within the view hierarchy (e.g., with the delivering unit 1316). At least one event recognizer for the actively involved views in the view hierarchy has a plurality of event definitions, one of which is selected in accordance with the internal state of the electronic device, and the at least one event recognizer processes the respective sub-event, prior to processing a next sub-event in the sequence of sub-events, in accordance with the selected event definition (e.g., with the event/sub-event processing unit 1318).
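As an informal sketch of this arrangement (again with invented names, and with the internal state reduced to a single accessibility flag), an event recognizer can hold several event definitions and commit to one of them based on the device's internal state before the next sub-event in the sequence is processed:

```swift
// Sketch of an event recognizer that carries several event definitions and
// selects one from the device's internal state. All names are invented.
struct DeviceInternalState {
    var accessibilityModeEnabled: Bool  // e.g. a setting for an accessibility operation mode
}

struct EventDefinition {
    let name: String
    let fingerCount: Int  // a swipe with this many fingers matches this definition
}

final class EventRecognizer {
    let definitions: [EventDefinition]
    private(set) var selected: EventDefinition?

    init(definitions: [EventDefinition]) {
        self.definitions = definitions
    }

    // Select one of the event definitions from the internal state. The
    // three-versus-four-finger fallback is an assumption of the example:
    // with the accessibility mode on, the three-finger swipe is taken to be
    // reserved, so the recognizer switches to its four-finger definition.
    func selectDefinition(for state: DeviceInternalState) {
        let wantedFingerCount = state.accessibilityModeEnabled ? 4 : 3
        selected = definitions.first { $0.fingerCount == wantedFingerCount }
    }

    // Process a sub-event (reduced here to a finger count) against the
    // currently selected definition.
    func process(subEventFingerCount: Int) -> Bool {
        guard let selected = selected else { return false }
        return selected.fingerCount == subEventFingerCount
    }
}

let recognizer = EventRecognizer(definitions: [
    EventDefinition(name: "three-finger swipe", fingerCount: 3),
    EventDefinition(name: "four-finger swipe", fingerCount: 4),
])
recognizer.selectDefinition(for: DeviceInternalState(accessibilityModeEnabled: true))
print(recognizer.process(subEventFingerCount: 4))  // prints "true"
```

The finger-count variants correspond to the embodiment described next.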
In some embodiments, the plurality of event definitions includes a first event definition corresponding to a first swipe gesture with a first number of fingers and a second event definition corresponding to a second swipe gesture with a second number of fingers distinct from the first number of fingers.
In some embodiments, the internal state includes one or more settings for an accessibility operation mode.
In some embodiments, a respective event definition of the plurality of event definitions is selected for a respective event recognizer in accordance with the internal state of the electronic device and a determination that the respective event definition does not correspond to an event definition of any event recognizer for the actively involved views other than the respective event recognizer.
In some embodiments, each of two or more event recognizers for the actively involved views in the view hierarchy has a respective plurality of event definitions, and a respective event definition of the respective plurality of event definitions is selected for a respective event recognizer with two or more event definitions in accordance with the internal state of the electronic device and a determination that the respective event definition does not correspond to any event definition selected for any event recognizer other than the respective event recognizer.
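The selection rule of the preceding two paragraphs (each recognizer settles on a definition that no other event recognizer for the actively involved views has already claimed) might be sketched as follows. The finger-count encoding and the preference order are assumptions of the example, not the disclosed design:

```swift
// Self-contained sketch of the conflict-avoiding selection rule: each
// recognizer with several definitions picks one that no other recognizer
// has already selected. All names are invented for the example.
struct SwipeDefinition: Hashable {
    let fingerCount: Int
}

final class MultiDefinitionRecognizer {
    let definitions: [SwipeDefinition]
    var selected: SwipeDefinition?

    init(_ definitions: [SwipeDefinition]) {
        self.definitions = definitions
    }
}

// Assigns to each recognizer a definition not claimed by any other
// recognizer; preferredFingerCount stands in for the device's internal
// state (for instance, an accessibility setting).
func assignDefinitions(_ recognizers: [MultiDefinitionRecognizer],
                       preferredFingerCount: Int) {
    var taken = Set<SwipeDefinition>()
    for recognizer in recognizers {
        // Try the state-preferred definition first, then the others.
        let ordered = recognizer.definitions.sorted {
            ($0.fingerCount == preferredFingerCount ? 0 : 1)
                < ($1.fingerCount == preferredFingerCount ? 0 : 1)
        }
        recognizer.selected = ordered.first { !taken.contains($0) }
        if let chosen = recognizer.selected {
            taken.insert(chosen)
        }
    }
}

let a = MultiDefinitionRecognizer([SwipeDefinition(fingerCount: 3),
                                   SwipeDefinition(fingerCount: 4)])
let b = MultiDefinitionRecognizer([SwipeDefinition(fingerCount: 3),
                                   SwipeDefinition(fingerCount: 4)])
assignDefinitions([a, b], preferredFingerCount: 3)
print(a.selected?.fingerCount ?? 0, b.selected?.fingerCount ?? 0)  // prints "3 4"
```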
In some embodiments, the processing unit 1306 is configured to process the respective sub-event in accordance with the selected event definition by enabling display of one or more views of a first software application distinct from the software that includes the view hierarchy (e.g., with the display enabling unit 1310, on the display unit 1302).
In some embodiments, the processing unit 1306 is configured to process the respective sub-event by replacing the display of the one or more views of the view hierarchy with the display of one or more views of a first software application distinct from the software that includes the view hierarchy (e.g., with the display enabling unit 1310, on the display unit 1302).
In some embodiments, the processing unit 1306 is configured to process the respective sub-event by: enabling display, in a first predefined area of the display unit 1302, of a group of open application icons corresponding to at least some of a plurality of concurrently open applications (e.g., with the display enabling unit 1310); and enabling concurrent display of at least a subset of the one or more views of the view hierarchy (e.g., with the display enabling unit 1310).
In some embodiments, the software is an application launcher.
In some embodiments, the software is an operating system application.
The foregoing description, for purposes of explanation, has been made with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (44)

1. An electronic device, comprising:
a touch-sensitive display configured to receive touch inputs; and
a processor coupled to the touch-sensitive display and configured to execute at least a first software application and a second software application, the first software application including a first set of one or more gesture recognizers, and the second software application including one or more views and a second set of one or more gesture recognizers, wherein respective gesture recognizers have corresponding gesture handlers, the processor being further configured to:
enable display of at least a subset of the one or more views of the second software application; and
while displaying at least the subset of the one or more views of the second software application:
detect a sequence of touch inputs on the touch-sensitive display, the sequence of touch inputs including a first portion of one or more touch inputs and a second portion of one or more touch inputs subsequent to the first portion; and
during a first phase of detecting the sequence of touch inputs:
deliver the first portion of one or more touch inputs to the first software application and the second software application;
identify, from gesture recognizers in the first set, one or more matching gesture recognizers that recognize the first portion of one or more touch inputs; and
process the first portion of one or more touch inputs with one or more gesture handlers corresponding to the one or more matching gesture recognizers.
2. The electronic device of claim 1, wherein the processor is configured to detect the sequence of touch inputs while a touch input in the first portion of one or more touch inputs at least partially overlaps at least one of the displayed views of the second software application.
3. The electronic device of claim 1, wherein the processor is configured to enable display of at least the subset of the one or more views of the second software application without displaying any view of the first software application.
4. The electronic device of claim 1, wherein the processor is configured to enable display of at least the subset of the one or more views of the second software application without displaying a view of any other application.
5. The electronic device of claim 1, wherein the processor is further configured to:
after the first phase, during a second phase of detecting the sequence of touch inputs:
deliver the second portion of one or more touch inputs to the first software application without delivering the second portion of one or more touch inputs to the second software application;
identify, from the one or more matching gesture recognizers, a second matching gesture recognizer that recognizes the sequence of touch inputs; and
process the sequence of touch inputs with a gesture handler corresponding to the respective matching gesture recognizer.
6. The electronic device of claim 5, wherein the processor is configured to process the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer by enabling display of one or more views of the first software application.
7. The electronic device of claim 5, wherein the processor is configured to process the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer by replacing the display of the one or more views of the second software application with the display of one or more views of the first software application.
8. The electronic device of claim 5, wherein the processor is configured to concurrently execute the first software application, the second software application, and a third software application, and to process the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer by replacing the one or more displayed views of the second software application with one or more views of the third software application.
9. The electronic device of claim 5, wherein the processor is configured to:
enable display, in a first predefined area of the touch-sensitive display, of a group of open application icons corresponding to at least some of a plurality of concurrently open applications; and
enable concurrent display of at least a subset of the one or more views of the second software application.
10. The electronic device of claim 1, wherein the first software application is an application launcher.
11. The electronic device of claim 1, wherein the first software application is an operating system application.
12. An electronic device, comprising:
a touch-sensitive display configured to receive touch inputs;
a display enabling unit for enabling display of at least a subset of one or more views of a second software application, the second software application including a second set of one or more gesture recognizers, wherein respective gesture recognizers have corresponding gesture handlers;
a detecting unit for detecting, while at least the subset of the one or more views of the second software application is displayed, a sequence of touch inputs on the touch-sensitive display, the sequence of touch inputs including a first portion of one or more touch inputs and a second portion of one or more touch inputs subsequent to the first portion;
a delivering unit for delivering, during a first phase of detecting the sequence of touch inputs, the first portion of one or more touch inputs to a first software application and the second software application, the first software application including a first set of one or more gesture recognizers;
an identifying unit for identifying, during the first phase of detecting the sequence of touch inputs, from gesture recognizers in the first set, one or more matching gesture recognizers that recognize the first portion of one or more touch inputs; and
a touch input processing unit for processing, during the first phase of detecting the sequence of touch inputs, the first portion of one or more touch inputs with one or more gesture handlers corresponding to the one or more matching gesture recognizers.
13. The electronic device of claim 12, wherein the detecting unit includes means for detecting the sequence of touch inputs while a touch input in the first portion of one or more touch inputs at least partially overlaps at least one of the displayed views of the second software application.
14. The electronic device of claim 12, wherein the display enabling unit includes means for enabling display of at least the subset of the one or more views of the second software application without displaying any view of the first software application.
15. The electronic device of claim 12, comprising:
means for delivering, after the first phase and during a second phase of detecting the sequence of touch inputs, the second portion of one or more touch inputs to the first software application without delivering the second portion of one or more touch inputs to the second software application;
means for identifying, after the first phase and during the second phase of detecting the sequence of touch inputs, from the one or more matching gesture recognizers, a second matching gesture recognizer that recognizes the sequence of touch inputs; and
means for processing, after the first phase and during the second phase of detecting the sequence of touch inputs, the sequence of touch inputs with a gesture handler corresponding to the respective matching gesture recognizer.
16. The electronic device of claim 12, wherein the touch input processing unit includes:
means for enabling display, in a first predefined area of the touch-sensitive display, of a group of open application icons corresponding to at least some of a plurality of concurrently open applications; and
means for enabling concurrent display of at least a subset of the one or more views of the second software application.
17. The electronic device of claim 12, wherein the display enabling unit includes means for enabling display of at least the subset of the one or more views of the second software application without displaying a view of any other application.
18. The electronic device of claim 15, wherein the means for processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer, after the first phase and during the second phase of detecting the sequence of touch inputs, includes means for enabling display of one or more views of the first software application.
19. The electronic device of claim 15, wherein the means for processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer, after the first phase and during the second phase of detecting the sequence of touch inputs, includes means for replacing the display of the one or more views of the second software application with the display of one or more views of the first software application.
20. The electronic device of claim 15, wherein the first software application, the second software application, and a third software application are executed concurrently; and wherein the means for processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer, after the first phase and during the second phase of detecting the sequence of touch inputs, includes means for replacing the one or more displayed views of the second software application with one or more views of the third software application.
21. The electronic device of claim 12, wherein the first software application is an application launcher.
22. The electronic device of claim 12, wherein the first software application is an operating system application.
23. An information processing apparatus for use in an electronic device with a touch-sensitive display, comprising:
a display enabling unit for enabling display of at least a subset of one or more views of a second software application, the second software application including a second set of one or more gesture recognizers, wherein respective gesture recognizers have corresponding gesture handlers;
a detecting unit for detecting, while at least the subset of the one or more views of the second software application is displayed, a sequence of touch inputs on the touch-sensitive display, the sequence of touch inputs including a first portion of one or more touch inputs and a second portion of one or more touch inputs subsequent to the first portion;
a delivering unit for delivering, during a first phase of detecting the sequence of touch inputs, the first portion of one or more touch inputs to a first software application and the second software application, the first software application including a first set of one or more gesture recognizers;
an identifying unit for identifying, during the first phase of detecting the sequence of touch inputs, from gesture recognizers in the first set, one or more matching gesture recognizers that recognize the first portion of one or more touch inputs; and
a touch input processing unit for processing, during the first phase of detecting the sequence of touch inputs, the first portion of one or more touch inputs with one or more gesture handlers corresponding to the one or more matching gesture recognizers.
24. The information processing apparatus of claim 23, wherein the detecting unit includes means for detecting the sequence of touch inputs while a touch input in the first portion of one or more touch inputs at least partially overlaps at least one of the displayed views of the second software application.
25. The information processing apparatus of claim 23, wherein the display enabling unit includes means for enabling display of at least the subset of the one or more views of the second software application without displaying any view of the first software application.
26. The information processing apparatus of claim 23, comprising:
means for delivering, after the first phase and during a second phase of detecting the sequence of touch inputs, the second portion of one or more touch inputs to the first software application without delivering the second portion of one or more touch inputs to the second software application;
means for identifying, after the first phase and during the second phase of detecting the sequence of touch inputs, from the one or more matching gesture recognizers, a second matching gesture recognizer that recognizes the sequence of touch inputs; and
means for processing, after the first phase and during the second phase of detecting the sequence of touch inputs, the sequence of touch inputs with a gesture handler corresponding to the respective matching gesture recognizer.
27. The information processing apparatus of claim 23, wherein the touch input processing unit includes:
means for enabling display, in a first predefined area of the touch-sensitive display, of a group of open application icons corresponding to at least some of a plurality of concurrently open applications; and
means for enabling concurrent display of at least a subset of the one or more views of the second software application.
28. The information processing apparatus of claim 23, wherein the display enabling unit includes means for enabling display of at least the subset of the one or more views of the second software application without displaying a view of any other application.
29. The information processing apparatus of claim 26, wherein the means for processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer, after the first phase and during the second phase of detecting the sequence of touch inputs, includes means for enabling display of one or more views of the first software application.
30. The information processing apparatus of claim 26, wherein the means for processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer, after the first phase and during the second phase of detecting the sequence of touch inputs, includes means for replacing the display of the one or more views of the second software application with the display of one or more views of the first software application.
31. The information processing apparatus of claim 26, wherein the first software application, the second software application, and a third software application are executed concurrently; and wherein the means for processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer, after the first phase and during the second phase of detecting the sequence of touch inputs, includes means for replacing the one or more displayed views of the second software application with one or more views of the third software application.
32. The information processing apparatus of claim 23, wherein the first software application is an application launcher.
33. The information processing apparatus of claim 23, wherein the first software application is an operating system application.
34. A method performed at an electronic device with a touch-sensitive display, the electronic device configured to execute at least a first software application and a second software application, the first software application including a first set of one or more gesture recognizers, the second software application including one or more views and a second set of one or more gesture recognizers, wherein respective gesture recognizers have corresponding gesture handlers, the method comprising:
displaying at least a subset of the one or more views of the second software application; and
while displaying at least the subset of the one or more views of the second software application:
detecting a sequence of touch inputs on the touch-sensitive display, the sequence of touch inputs including a first portion of one or more touch inputs and a second portion of one or more touch inputs subsequent to the first portion; and
during a first phase of detecting the sequence of touch inputs:
delivering the first portion of one or more touch inputs to the first software application and the second software application;
identifying, from gesture recognizers in the first set, one or more matching gesture recognizers that recognize the first portion of one or more touch inputs; and
processing the first portion of one or more touch inputs with one or more gesture handlers corresponding to the one or more matching gesture recognizers.
35. The method of claim 34, wherein the detecting occurs while a touch input in the first portion of one or more touch inputs at least partially overlaps at least one of the displayed views of the second software application.
36. The method of claim 34, wherein the displaying includes displaying at least the subset of the one or more views of the second software application without displaying any view of the first software application.
37. The method of claim 34, wherein the displaying includes displaying at least the subset of the one or more views of the second software application without displaying a view of any other application.
38. The method of claim 34, including:
after the first phase, during a second phase of detecting the sequence of touch inputs:
delivering the second portion of one or more touch inputs to the first software application without delivering the second portion of one or more touch inputs to the second software application;
identifying, from the one or more matching gesture recognizers, a second matching gesture recognizer that recognizes the sequence of touch inputs; and
processing the sequence of touch inputs with a gesture handler corresponding to the respective matching gesture recognizer.
39. The method of claim 38, wherein processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer includes displaying one or more views of the first software application.
40. The method of claim 38, wherein processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer includes replacing the display of the one or more views of the second software application with the display of one or more views of the first software application.
41. The method of claim 38, wherein the electronic device concurrently executes the first software application, the second software application, and a third software application, and processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer includes replacing the one or more displayed views of the second software application with one or more views of the third software application.
42. The method of claim 38, wherein processing the sequence of touch inputs with the gesture handler corresponding to the respective matching gesture recognizer includes:
displaying, in a first predefined area of the touch-sensitive display, a group of open application icons corresponding to at least some of a plurality of concurrently open applications; and
concurrently displaying at least a subset of the one or more views of the second software application.
43. The method of claim 34, wherein the first software application is an application launcher.
44. The method of claim 34, wherein the first software application is an operating system application.
CN201110463262.8A 2010-12-20 2011-12-20 Identification of events Active CN102768608B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610383388.7A CN106095418B (en) 2010-12-20 2011-12-20 Event recognition

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US201061425222P 2010-12-20 2010-12-20
US61/425,222 2010-12-20
US13/077,927 US8566045B2 (en) 2009-03-16 2011-03-31 Event recognition
US13/077,927 2011-03-31
US13/077,931 2011-03-31
US13/077,524 2011-03-31
US13/077,931 US9311112B2 (en) 2009-03-16 2011-03-31 Event recognition
US13/077,524 US9244606B2 (en) 2010-12-20 2011-03-31 Device, method, and graphical user interface for navigation of concurrently open software applications

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201610383388.7A Division CN106095418B (en) 2010-12-20 2011-12-20 Event recognition

Publications (2)

Publication Number Publication Date
CN102768608A true CN102768608A (en) 2012-11-07
CN102768608B CN102768608B (en) 2016-05-04

Family

ID=47096020

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201110463262.8A Active CN102768608B (en) 2010-12-20 2011-12-20 Identification of events
CN2011205800185U Expired - Lifetime CN203287883U (en) 2010-12-20 2011-12-20 Electronic equipment and information processing device thereof
CN201610383388.7A Active CN106095418B (en) 2010-12-20 2011-12-20 Event recognition

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN2011205800185U Expired - Lifetime CN203287883U (en) 2010-12-20 2011-12-20 Electronic equipment and information processing device thereof
CN201610383388.7A Active CN106095418B (en) 2010-12-20 2011-12-20 Event recognition

Country Status (2)

Country Link
CN (3) CN102768608B (en)
HK (1) HK1177519A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105700784A (en) * 2014-11-28 2016-06-22 神讯电脑(昆山)有限公司 Touch input method and electronic apparatus
JP2017149225A (en) * 2016-02-23 2017-08-31 京セラ株式会社 Control unit for vehicle
CN107566879A (en) * 2017-08-08 2018-01-09 武汉斗鱼网络科技有限公司 A kind of management method, device and the electronic equipment of application view frame
CN108388393B (en) * 2018-01-02 2020-08-28 阿里巴巴集团控股有限公司 Identification method and device for mobile terminal click event
CN110196743A (en) * 2018-12-17 2019-09-03 腾讯科技(深圳)有限公司 Method, apparatus, storage medium and the electronic device of event triggering
CN113326352B (en) * 2021-06-18 2022-05-24 哈尔滨工业大学 Sub-event relation identification method based on heterogeneous event graph

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9292111B2 (en) * 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US20020171675A1 (en) * 2001-05-15 2002-11-21 International Business Machines Corporation Method and system for graphical user interface (GUI) widget having user-selectable mass
US8261190B2 (en) * 2008-04-24 2012-09-04 Burlington Education Ltd. Displaying help sensitive areas of a computer application
CN101853105A (en) * 2010-06-02 2010-10-06 友达光电股份有限公司 Computer with touch screen and operating method thereof

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060077183A1 (en) * 2004-10-08 2006-04-13 Studt Peter C Methods and systems for converting touchscreen events into application formatted data
CN1967458A (en) * 2005-11-16 2007-05-23 联发科技股份有限公司 Method for controlling a touch screen user interface and device thereof
US20070177803A1 (en) * 2006-01-30 2007-08-02 Apple Computer, Inc Multi-touch gesture dictionary
CN101410781A (en) * 2006-01-30 2009-04-15 苹果公司 Gesturing with a multipoint sensing device
CN101526880A (en) * 2008-03-04 2009-09-09 苹果公司 Touch event model
US20100235118A1 (en) * 2009-03-16 2010-09-16 Bradford Allen Moore Event Recognition

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10613741B2 (en) 2007-01-07 2020-04-07 Apple Inc. Application programming interface for gesture operations
US10175876B2 (en) 2007-01-07 2019-01-08 Apple Inc. Application programming interfaces for gesture operations
US11954322B2 (en) 2007-01-07 2024-04-09 Apple Inc. Application programming interface for gesture operations
US11449217B2 (en) 2007-01-07 2022-09-20 Apple Inc. Application programming interfaces for gesture operations
US10963142B2 (en) 2007-01-07 2021-03-30 Apple Inc. Application programming interfaces for scrolling
US10521109B2 (en) 2008-03-04 2019-12-31 Apple Inc. Touch event model
US10936190B2 (en) 2008-03-04 2021-03-02 Apple Inc. Devices, methods, and user interfaces for processing touch events
US11740725B2 (en) 2008-03-04 2023-08-29 Apple Inc. Devices, methods, and user interfaces for processing touch events
US11755196B2 (en) 2009-03-16 2023-09-12 Apple Inc. Event recognition
US10719225B2 (en) 2009-03-16 2020-07-21 Apple Inc. Event recognition
US11163440B2 (en) 2009-03-16 2021-11-02 Apple Inc. Event recognition
US10732997B2 (en) 2010-01-26 2020-08-04 Apple Inc. Gesture recognizers with delegates for controlling and modifying gesture recognition
US10216408B2 (en) 2010-06-14 2019-02-26 Apple Inc. Devices and methods for identifying user interface objects based on view hierarchy
US11372538B2 (en) 2012-06-22 2022-06-28 Sony Corporation Detection device and detection method
US11429190B2 (en) 2013-06-09 2022-08-30 Apple Inc. Proxy gesture recognizer
CN105339900A (en) * 2013-06-09 2016-02-17 苹果公司 Proxy gesture recognizer
CN110362414A (en) * 2013-06-09 2019-10-22 苹果公司 Act on behalf of gesture recognition
CN110362414B (en) * 2013-06-09 2024-02-20 苹果公司 Proxy gesture recognizer
CN105339900B (en) * 2013-06-09 2019-06-14 苹果公司 Act on behalf of gesture recognition

Also Published As

Publication number Publication date
CN106095418A (en) 2016-11-09
CN102768608B (en) 2016-05-04
CN106095418B (en) 2019-09-13
CN203287883U (en) 2013-11-13
HK1177519A1 (en) 2013-08-23

Similar Documents

Publication Publication Date Title
CN203287883U (en) Electronic equipment and information processing device thereof
CN102422264B (en) Event recognition
CN105339900B (en) Act on behalf of gesture recognition
CN103558983B (en) Method, equipment and electronic equipment for gesture identification
KR101523866B1 (en) Event recognition
JP2015018572A (en) Control selection approximation
CN107111415A (en) Equipment, method and graphic user interface for Mobile solution interface element
TWI493434B (en) Electrcal device and adjustment method of application interface

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 1177519

Country of ref document: HK

C14 Grant of patent or utility model
GR01 Patent grant
REG Reference to a national code

Ref country code: HK

Ref legal event code: GR

Ref document number: 1177519

Country of ref document: HK