CN101611373B - Gestures for controlling, manipulating, and editing media files using touch-sensitive devices - Google Patents

Gestures for controlling, manipulating, and editing media files using touch-sensitive devices

Info

Publication number
CN101611373B
CN101611373B
Authority
CN
China
Prior art keywords
touch
progress bar
playing progress
gesture
finger
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN200780051755.2A
Other languages
Chinese (zh)
Other versions
CN101611373A (en)
Inventor
G. Christie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Computer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 11/818,342 (granted as US7956847B2)
Application filed by Apple Computer Inc
Priority to CN201310719094.3A (published as CN103631496B)
Publication of CN101611373A
Application granted
Publication of CN101611373B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Abstract

Embodiments of the invention are directed to a system, method, and software for implementing gestures with touch sensitive devices (such as a touch sensitive display) for managing and editing media files on a computing device or system. Specifically, gestural inputs of a human hand over a touch/proximity sensitive device can be used to control, edit, and manipulate files, such as media files including without limitation graphical files, photo files and video files.

Description

Gestures for Controlling, Manipulating, and Editing Media Files Using Touch-Sensitive Devices
Cross-Reference to Related Applications
This application claims the benefit under 35 U.S.C. 119(e) of U.S. Provisional Patent Application Serial No. 60/878,754, filed January 5, 2007, the contents of which are incorporated herein by reference.
Technical Field
The present invention relates to systems and methods for managing, manipulating, and editing media objects, such as graphical objects on a display, by using hand gestures on a touch-sensitive device.
Background
There exist today many types of input devices for performing operations in a computer system. The operations generally correspond to moving a cursor and making selections on a display screen. The operations may also include paging, scrolling, panning, zooming, and so forth. By way of example, the input devices may include buttons, switches, keyboards, mice, trackballs, touch pads, joysticks, touch screens, and the like. Each of these devices has advantages and disadvantages that are taken into account when designing a computer system.
Buttons and switches are generally mechanical in nature and provide limited control with respect to moving the cursor and making selections. For example, they are generally dedicated to moving the cursor in a specific direction (e.g., arrow keys) or to making specific selections (e.g., enter, delete, number, etc.).
When using a mouse, the movement of the input pointer on the display generally corresponds to the relative movements of the mouse as the user moves the mouse along a surface. When using a trackball, the movement of the input pointer generally corresponds to the relative movements of the ball as the user rotates the ball within its housing. Mouse and trackball devices typically also include one or more buttons for making selections. A mouse may further include a scroll wheel that allows a user to scroll the displayed content by rolling the wheel forward or backward.
With touch pad devices, such as touch pads on a personal laptop computer, the movement of the input pointer on the display generally corresponds to the relative movements of the user's finger (or stylus) as the finger moves along the surface of the touch pad. Touch screens, on the other hand, are a type of display screen that typically includes a touch-sensitive transparent panel (or "skin") overlaying the display screen. When using a touch screen, a user typically makes a selection on the display screen by pointing directly at objects (such as GUI objects) displayed on the screen, usually with a stylus or a finger.
To provide additional functionality, hand gestures have been implemented with some of these input devices. By way of example, on a touch pad, selections may be made when one or more taps are detected on the surface of the touch pad. In some cases, any portion of the touch pad may be tapped, while in other cases a dedicated portion of the touch pad may be tapped. In addition to selections, scrolling may be initiated by using finger motion at the edge of the touch pad.
U.S. Patent Nos. 5,612,719 and 5,590,219, assigned to Apple Computer, Inc., describe some other uses of gesturing. U.S. Patent No. 5,612,719 discloses an onscreen button that is responsive to at least two different button gestures made on the screen, on or near the button. U.S. Patent No. 5,590,219 discloses a method for recognizing an ellipse-type gesture input on a display screen of a computer system.
In recent times, more advanced gestures have been implemented. For example, scrolling may be initiated by placing four fingers on the touch pad so that the scrolling gesture is recognized, and thereafter moving these fingers on the touch pad to perform scrolling events. The methods for implementing these advanced gestures, however, can be limited and in many instances counterintuitive. In certain applications, especially applications involving managing or editing media files using a computer system, hand gestures on a touch screen can allow a user to effect the intended operations more efficiently and more accurately.
Based on the above, there is a need for improvements in the way gestures can be performed on touch-sensitive devices, especially with respect to managing and editing media files.
Summary of the Invention
The present invention relates to a system, method, and software for implementing gestures with touch-sensitive devices (such as a touch-sensitive display) for managing and editing media files on a computer system. Specifically, gestural inputs of a human hand over a touch/proximity-sensitive device can be used to control, edit, and manipulate files, such as media files including, without limitation, photo files and video files.
According to one embodiment, gestural inputs over a touch-sensitive computer desktop application display are used to effect the conventional mouse/trackball actions, such as targeting, selecting, right-clicking, scrolling, and the like.
According to another embodiment, gestural inputs over a touch-sensitive display can be used to effect editing commands for editing an image file, such as a photo file. The gestural inputs can be recognized via a user interface ("UI") element, such as a slider bar. A gestural input via a UI element can be varied by changing the number of touch-down points on the UI element.
According to another embodiment, a gestural input invokes the activation of a UI element, after which gestural interactions with the invoked UI element can effect further functions.
Brief Description of the Drawings
Fig. 1 is a block diagram of a computer system according to one exemplary embodiment of this invention.
Fig. 2 illustrates another computer system according to another exemplary embodiment of this invention.
Fig. 3 illustrates a multipoint processing method according to one exemplary embodiment of this invention.
Figs. 4A and 4B illustrate a detected touch image, in accordance with one embodiment of this invention.
Fig. 5 illustrates a group of features, in accordance with one embodiment of this invention.
Fig. 6 illustrates a parameter calculation method, in accordance with one embodiment of this invention.
Figs. 7A-7E and 7I-7K illustrate various gestures for performing targeting and/or selecting tasks, in accordance with one embodiment of this invention.
Figs. 7F-7H are diagrams of methods for recognizing and implementing the gestural inputs of Figs. 7A-7E.
Figs. 8A-8G illustrate a rotate gesture, in accordance with one embodiment of this invention.
Fig. 9 is a diagram of a touch-based method, in accordance with one embodiment of this invention.
Fig. 10 is a diagram of a touch-based method, in accordance with one embodiment of this invention.
Fig. 11 is a diagram of a touch-based method, in accordance with one embodiment of this invention.
Fig. 12 is a diagram of a zoom gesture method, in accordance with one embodiment of this invention.
Figs. 13A-13H illustrate a zooming sequence, in accordance with one embodiment of this invention.
Fig. 14 is a diagram of a pan method, in accordance with one embodiment of this invention.
Figs. 15A-15D illustrate a panning sequence, in accordance with one embodiment of this invention.
Fig. 16 is a diagram of a rotate method, in accordance with one embodiment of this invention.
Figs. 17A-17C illustrate a rotating sequence, in accordance with one embodiment of this invention.
Figs. 17D and 17E illustrate a method for rotating a selectable target, in accordance with one embodiment of this invention.
Figs. 18A and 18B illustrate gestural inputs for editing a photo document, in accordance with one embodiment of this invention.
Fig. 18C is a diagram of a method for recognizing and implementing the gestural inputs of Figs. 18A and 18B.
Figs. 18D and 18E illustrate gestural inputs for zooming in and out of a photo file within a photo application, in accordance with one embodiment of this invention.
Figs. 19A-19D illustrate gestural inputs for scrolling across sequential files during playback, in accordance with one embodiment of this invention.
Figs. 19E and 19F illustrate gestural inputs for scrolling across photo files during playback on a digital camera display, in accordance with one embodiment of this invention.
Fig. 19G illustrates a gestural input for marking or deleting a photo file during playback, in accordance with one embodiment of this invention.
Fig. 19H illustrates an alternative gestural input for marking or deleting a photo file during playback, in accordance with another embodiment of this invention.
Fig. 20 is an overview diagram of a method for implementing the methods of Figs. 18A-19F, in accordance with one embodiment of this invention.
Figs. 21A-21D illustrate gestural inputs for controlling and/or editing video using a video application, in accordance with one embodiment of this invention.
Figs. 22A and 22B are diagrams of methods for implementing the gestural inputs of Figs. 21A-21D.
Fig. 23 illustrates gestural inputs for controlling and/or editing audio using an audio application, in accordance with one embodiment of this invention.
Detailed Description of Preferred Embodiments
In the following description of preferred embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which are shown by way of illustration specific embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the preferred embodiments of the present invention.
Fig. 1 is a block diagram of an exemplary computer system 50, in accordance with one embodiment of the present invention. The computer system 50 may correspond to a personal computer system, such as a desktop, laptop, tablet, or handheld computer. The computer system may also correspond to a computing device, such as a cell phone, PDA, dedicated media player, consumer electronics device, and the like.
The exemplary computer system 50 shown in Fig. 1 includes a processor 56 configured to execute instructions and to carry out operations associated with the computer system 50. For example, using instructions retrieved from memory, the processor 56 may control the reception and manipulation of input and output data between components of the computing system 50. The processor 56 can be implemented on a single chip, multiple chips, or multiple electrical components. For example, various architectures can be used for the processor 56, including a dedicated or embedded processor, a single-purpose processor, a controller, an ASIC, and so forth.
In most cases, the processor 56, together with an operating system, operates to execute computer code and to produce and use data. Operating systems are generally well known and will not be described in greater detail here. By way of example, the operating system may correspond to OS/2, DOS, Unix, Linux, Palm OS, and the like. The operating system can also be a special-purpose operating system, such as may be used for limited-purpose appliance-type computing devices. The operating system, other computer code, and data may reside within a memory block 58 that is operatively coupled to the processor 56. Memory block 58 generally provides a place to store computer code and data that are used by the computer system 50. By way of example, the memory block 58 may include read-only memory (ROM), random-access memory (RAM), a hard disk drive, and the like. Information can also reside on a removable storage medium and be loaded or installed onto the computer system 50 when needed. Removable storage media include, for example, CD-ROM, PC-CARD, memory card, floppy disk, magnetic tape, and network components.
The computer system 50 may also include a display device 68 that is operatively coupled to the processor 56. The display device 68 may be a liquid crystal display (LCD) (e.g., active matrix, passive matrix, and the like). Alternatively, the display device 68 may be a monitor such as a monochrome display, a color graphics adapter (CGA) display, an enhanced graphics adapter (EGA) display, a variable graphics array (VGA) display, a super VGA display, a cathode ray tube (CRT), and the like. The display device may also correspond to a plasma display or a display implemented with electronic ink.
The display device 68 may generally be configured to display a graphical user interface (GUI) 69 that provides an easy-to-use interface between a user of the computer system and the operating system or the application running thereon. Generally speaking, the GUI 69 represents programs, files, and operational options with graphical images, objects, or vector representations. The graphical images may include windows, fields, dialog boxes, menus, icons, buttons, cursors, scroll bars, and so forth. Such images may be arranged in predefined layouts, or may be created dynamically to serve the specific actions being taken by a user. During operation, the user can select and/or activate various graphical images in order to initiate functions and tasks associated therewith. By way of example, a user may select a button that opens, closes, minimizes, or maximizes a window, or an icon that launches a particular program. The GUI 69 can additionally or alternatively display information, such as non-interactive text and graphics, for the user on the display device 68.
The computer system 50 may also include an input device 70 that is operatively coupled to the processor 56. The input device 70 may be configured to transfer data from the outside world into the computer system 50. The input device 70 may, for example, be used to perform tracking and to make selections with respect to the GUI 69 on the display 68. The input device 70 may also be used to issue commands in the computer system 50. The input device 70 may include a touch-sensing device configured to receive input from a user's touch and to send this information to the processor 56. By way of example, the touch-sensing device may correspond to a touch pad or a touch screen. In many cases, the touch-sensing device recognizes touches, as well as the position and magnitude of touches on a touch-sensitive surface. The touch-sensing device detects and reports the touches to the processor 56, and the processor 56 interprets the touches in accordance with its programming. For example, the processor 56 may initiate a task in accordance with a particular touch. A dedicated processor can be used to process touches locally and to reduce demand on the main processor of the computer system.
The touch-sensing device may be based on sensing technologies including but not limited to capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, and/or the like. Furthermore, the touch-sensing means may be based on single-point sensing or multipoint sensing. Single-point sensing is capable of distinguishing only a single touch, while multipoint sensing may be capable of distinguishing multiple touches that occur at the same time.
As discussed above, the input device 70 may be a touch screen positioned over or in front of the display 68, integrated with the display device 68, or it may be a separate component, such as a touch pad.
The computer system 50 also preferably includes capabilities for coupling to one or more I/O devices 80. By way of example, the I/O devices 80 may correspond to keyboards, printers, scanners, cameras, microphones, speakers, and/or the like. The I/O devices 80 may be integrated with the computer system 50, or they may be separate components (e.g., peripheral devices). In some cases, the I/O devices 80 may be connected to the computer system 50 through wired connections (e.g., cables/ports). In other cases, the I/O devices 80 may be connected to the computer system 50 through wireless connections. By way of example, the data link may correspond to PS/2, USB, IR, Firewire, RF, Bluetooth, or the like.
In accordance with one embodiment of the present invention, the computer system 50 is designed to recognize gestures 85 applied to the input device 70 and to control aspects of the computer system 50 based on the gestures 85. In some cases, a gesture may be defined as a stylized interaction with an input device that is mapped to one or more specific computing operations. The gestures 85 may be made through various hand motions and, more particularly, finger motions. Alternatively or additionally, the gestures may be made with a stylus. In all of these cases, the input device 70 receives the gestures 85, and the processor 56 executes instructions to carry out operations associated with the gestures 85. In addition, the memory block 58 may include a gesture operational program 88, which may be part of the operating system or a separate application. The gesture operational program 88 generally includes a set of instructions that recognizes the occurrence of gestures 85 and informs one or more software agents of the gestures 85 and/or of what action(s) to take in response to the gestures 85. Additional details regarding the various gestures that can be used as input commands are discussed further below.
In accordance with a preferred embodiment, once a user has performed one or more gestures, the input device 70 relays the gesture information to the processor 56. Using instructions from memory 58 and, more particularly, the gesture operational program 88, the processor 56 interprets the gestures 85 and controls different components of the computer system 50, such as the memory 58, the display 68, and the I/O devices 80, based on the gestures 85. The gestures 85 may be identified as commands for performing actions in applications stored in the memory 58, modifying image objects shown on the display 68, modifying data stored in the memory 58, and/or performing actions in the I/O devices 80.
Again, although Fig. 1 illustrates the input device 70 and the display 68 as two separate boxes for illustration purposes, the two boxes may be realized on one device.
Fig. 2 illustrates an exemplary computing system 10 that uses a multi-touch panel 24 as an input device for gestures; the multi-touch panel 24 can at the same time be a display panel. The computing system 10 can include one or more multi-touch panel processors 12 dedicated to the multi-touch subsystem 27. Alternatively, the multi-touch panel processor functionality can be implemented by dedicated logic, such as a state machine. Peripherals 11 can include, but are not limited to, random access memory (RAM) or other types of memory or storage, watchdog timers, and the like. The multi-touch subsystem 27 can include, but is not limited to, one or more analog channels 17, channel scan logic 18, and driver logic 19. The channel scan logic 18 can access RAM 16, autonomously read data from the analog channels, and provide control for the analog channels. This control can include multiplexing columns of the multi-touch panel 24 to the analog channels 17. In addition, the channel scan logic 18 can control the driver logic and the stimulation signals being selectively applied to rows of the multi-touch panel 24. In some embodiments, the multi-touch subsystem 27, the multi-touch panel processor 12, and the peripherals 11 can be integrated into a single application-specific integrated circuit (ASIC).
The driver logic 19 can provide multiple multi-touch subsystem outputs 20 and can present a proprietary interface that drives a high-voltage driver, which preferably includes a decoder 21 and a subsequent level shifter and driver stage 22, although the level-shifting functions could be performed before the decoder functions. The level shifter and driver 22 can provide level shifting from a low-voltage level (e.g., CMOS levels) to a higher voltage level, providing a better signal-to-noise (S/N) ratio for noise reduction purposes. The decoder 21 can decode the drive interface signals to one out of N outputs, where N may be the maximum number of rows in the panel. The decoder 21 can be used to reduce the number of drive lines needed between the high-voltage driver and the multi-touch panel 24. Each multi-touch panel row input 23 can drive one or more rows in the multi-touch panel 24. It should be noted that the driver 22 and the decoder 21 can also be integrated into a single ASIC, be integrated into the driver logic 19, or in some instances be unnecessary.
The multi-touch panel 24 can include a capacitive sensing medium having a plurality of row traces or driving lines and a plurality of column traces or sensing lines, although other sensing media can also be used. The row and column traces can be formed from a transparent conductive medium, such as indium tin oxide (ITO) or antimony tin oxide (ATO), although other transparent and non-transparent materials, such as copper, can also be used. In some embodiments, the row and column traces can be formed on opposite sides of a dielectric material and can be perpendicular to each other, although in other embodiments other non-Cartesian orientations are possible. For example, in a polar coordinate system, the sensing lines can be concentric circles and the driving lines can be radially extending lines (or vice versa). It should be understood, therefore, that the terms "row" and "column", "first dimension" and "second dimension", or "first axis" and "second axis" as used herein are intended to encompass not only orthogonal grids, but also intersecting traces of other geometric configurations having first and second dimensions (e.g., the concentric and radial lines of a polar-coordinate arrangement). The rows and columns can be formed on a single side of a substrate, or can be formed on two separate substrates separated by a dielectric material. In some instances, an additional dielectric cover layer can be placed over the row or column traces to strengthen the structure and protect the entire assembly from damage.
" intersection point " at the trace of multi-touch panel 24 located, wherein trace is not in above and below each other by (intersection) (but directly electrically contacting each other), and trace forms in fact two electrodes (intersecting but also may have more than two traces).Each intersection point of row traces and row trace can represent a capacitive sensing node, and can be regarded as pictorial element (pixel) 26, and this may be particularly useful when multi-touch panel 24 is used as catching " image " touching.(in other words, at multi-touch subsystem 27, determined whether that each the touch sensor place in multi-touch panel has detected after touch event, the pattern (pattern) that the touch sensor at touch event place occurs in multi-touch panel can be regarded as touching " image " (for example pattern of finger touch panel).) electric capacity between row and column electrode shows as the stray capacitance listing all, and when this given row is encouraged by AC signal, show as mutual capacitance Csig when this given row remains on DC.Can be by detecting finger or other object proximity or be positioned at the existence on multi-touch panel measuring to the change of Csig.The row of multi-touch panel 24 can drive the one or more analog channels 17 (here also referred to as event detection and demodulator circuit) in multi-touch subsystem 27.In some embodiments, each row can be couple to a specialized simulation passage 17.But in other embodiments, each row can be couple to via analog switch the analog channel 17 of lesser amt.
The computing system 10 can also include a host processor 14 for receiving outputs from the multi-touch panel processor 12 and performing actions based on the outputs, which can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device connected to the host device, and the like. The host processor 14, which can be a personal computer CPU, can also perform additional functions that may not be related to multi-touch panel processing, and can be coupled to program storage 15 and a display device 13, such as an LCD display, for providing a user interface (UI) to a user of the device.
It should be noted that, while Fig. 2 illustrates a dedicated MT panel processor 12, the multi-touch subsystem may be controlled directly by the host processor 14. Additionally, it should also be noted that the multi-touch panel 24 and the display device 13 can be integrated into a single touch-screen display device. Further details of multi-touch sensor detection, including proximity detection by a touch panel, are described in commonly assigned co-pending applications, including application No. 10/840,862, published as U.S. Patent Publication No. US2006/0097991; application No. 11/428,522, published as U.S. Patent Publication No. US2006/0238522; and the application entitled "Proximity and Multi-Touch Sensor Detection and Demodulation," filed January 3, 2007, the entirety of all of which is hereby incorporated by reference.
Fig. 3 illustrates a multipoint processing method 100, in accordance with one embodiment of the present invention. The multipoint processing method 100 may, for example, be performed in the system shown in Fig. 1 or Fig. 2. The multipoint processing method 100 generally begins at block 102, where images can be read from a multipoint input device, and more particularly from a multipoint touch screen. Although the term "image" is used, it should be noted that the data may come in other forms. In most cases, the image read from the touch screen provides magnitude (Z) as a function of position (x and y) for each sensing point or pixel of the touch screen. The magnitude may, for example, reflect the capacitance measured at each point.
Following block 102, the multipoint processing method 100 proceeds to block 104, where the image can be converted into a collection or list of features. Each feature represents a distinct input, such as a touch. In most cases, each feature can include its own unique identifier (ID), x coordinate, y coordinate, Z magnitude, angle Θ, area A, and the like. By way of example, Figs. 4A and 4B illustrate a particular image 120 in time. In image 120, there may be two features 122 based on two distinct touches. The touches may, for example, be formed from a pair of fingers touching the touch screen. As shown, each feature 122 can include a unique identifier (ID), x coordinate, y coordinate, Z magnitude, angle Θ, and area A. More particularly, the first feature 122A may be represented by ID1, x1, y1, Z1, Θ1, A1 and the second feature 122B may be represented by ID2, x2, y2, Z2, Θ2, A2. This data may, for example, be output using a multi-touch protocol.
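As a concrete illustration of the feature data just described, the following Python sketch models one such feature record; the field names are assumptions chosen for illustration, not part of any disclosed protocol.

```python
from dataclasses import dataclass

@dataclass
class TouchFeature:
    """One distinct touch extracted from a sensor image."""
    feature_id: int  # unique identifier (ID)
    x: float         # x coordinate of the touch
    y: float         # y coordinate of the touch
    z: float         # Z magnitude, e.g. measured capacitance
    theta: float     # angle of the contact patch
    area: float      # contact area A
```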
The conversion from data or images to features may be accomplished using methods described in co-pending U.S. Patent Application No. 10/840,862, published as U.S. Patent Publication No. US2006/0097991, which is again incorporated herein by reference. As disclosed therein, the raw data is typically received in a digitized form and can include values for each node of the touch screen. The values may be between 0 and 256, where 0 equates to no touch pressure and 256 equates to full touch pressure. Thereafter, the raw data can be filtered to reduce noise. Once filtered, gradient data, which indicates the topology of each group of connected points, can be generated. Thereafter, the boundaries of the touch regions can be calculated based on the gradient data (i.e., a determination can be made as to which points can be grouped together to form each touch region). By way of example, a watershed algorithm may be used. Once the boundaries are determined, the data for each of the touch regions can be calculated (e.g., X, Y, Z, Θ, A).
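The passage above outlines a pipeline of filtering, gradient computation, and boundary segmentation. A much-simplified sketch of the segmentation step follows, using a plain threshold and flood fill in place of the disclosed watershed algorithm; the threshold value, 4-connectivity, and function name are assumptions.

```python
def extract_touch_regions(raw, threshold=32):
    """Group adjacent above-threshold nodes into touch regions by flood fill.
    (A simplified stand-in for the filter/gradient/watershed pipeline.)"""
    rows, cols = len(raw), len(raw[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if raw[r][c] < threshold or seen[r][c]:
                continue
            stack, region = [(r, c)], []
            seen[r][c] = True
            while stack:
                i, j = stack.pop()
                region.append((i, j, raw[i][j]))   # node position and value
                for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                    if (0 <= ni < rows and 0 <= nj < cols
                            and raw[ni][nj] >= threshold and not seen[ni][nj]):
                        seen[ni][nj] = True
                        stack.append((ni, nj))
            regions.append(region)
    return regions
```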
Following block 104, the multipoint processing method 100 proceeds to block 106, where feature classification and groupings can be performed. During classification, the identity of each feature can be determined. For example, the features may be classified as a particular finger, thumb, palm, or other object. Once classified, the features can be grouped. The manner in which the groups are formed can vary widely. In most cases, the features can be grouped based on some criteria (e.g., they carry similar attributes). For example, the two features shown in Figs. 4A and 4B may be grouped together because each of these features is located in proximity to the other, or because they are from the same hand. The grouping may include some level of filtering to filter out features that are not part of the touch event. In filtering, one or more features may be rejected because they either meet, or fail to meet, some predefined criteria. By way of example, one of the features may be classified as a thumb located at the edge of a tablet PC. Because the thumb is being used to hold the device rather than to perform a task, the feature generated therefrom is rejected, i.e., it is not considered part of the touch event being processed.
Following block 106, the multipoint processing method 100 proceeds to block 108, where key parameters for the feature groups can be calculated. The key parameters may include the distance between features, the x/y centroid of all features, feature rotation, the total pressure of the group (e.g., pressure at the centroid), and the like. As shown in Fig. 5, the calculation may include finding the centroid C, drawing a virtual line 130 from the centroid C to each feature, defining the distance D1 and D2 of each virtual line, and then averaging the distances D1 and D2. Once the parameters are calculated, the parameter values can be reported. The parameter values are typically reported with a group identifier (GID) and the number of features within each group (three in this case). In most cases, both initial and current parameter values are reported. The initial parameter values may be based on set-down, i.e., when the user sets their fingers on the touch screen, and the current values may be based on any point within a stroke occurring after set-down.
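A minimal sketch of the Fig. 5 calculation follows, computing the centroid of a feature group and the average feature-to-centroid distance; it assumes the TouchFeature record sketched earlier, and the dictionary keys are invented for illustration.

```python
from math import hypot

def group_parameters(features):
    """Centroid, mean distance to centroid, and total pressure of a group."""
    n = len(features)
    cx = sum(f.x for f in features) / n
    cy = sum(f.y for f in features) / n
    avg_dist = sum(hypot(f.x - cx, f.y - cy) for f in features) / n
    return {
        "centroid": (cx, cy),
        "avg_distance": avg_dist,                 # mean of D1, D2, ... in Fig. 5
        "total_pressure": sum(f.z for f in features),
        "count": n,
    }
```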
As should be appreciated, blocks 102-108 may be performed repetitively during a user stroke, thereby generating a plurality of sequentially configured signals. The initial and current parameters can be compared in later steps to perform actions in the system.
Following block 108, the process flow proceeds to block 110, where the group is, or can be, associated with a user interface (UI) element. UI elements may be button boxes, lists, sliders, wheels, knobs, and the like. Each UI element represents a component or control of the user interface. The application behind the UI element(s) can have access to the parameter data calculated in block 108. In one implementation, the application ranks the relevance of the touch data to the corresponding UI elements. The ranking may be based on some predetermined criteria. The ranking may include producing a figure of merit and giving whichever UI element has the highest figure of merit sole access to the group. There may even be some degree of hysteresis (once one of the UI elements claims control of the group, the group sticks with that UI element until another UI element has a much higher ranking). By way of example, the ranking may include determining the proximity of the centroid (or features) to the image object associated with the UI element.
Following block 110, the multipoint processing method 100 proceeds to blocks 112 and 114. Blocks 112 and 114 can be performed approximately at the same time. In one embodiment, from the user's perspective, blocks 112 and 114 appear to be performed concurrently. At block 112, one or more actions can be performed based on differences between the initial and current parameter values, and may also be based on the UI element to which they are associated, if any. At block 114, user feedback pertaining to the one or more actions being performed can be provided. By way of example, user feedback may include display, audio, and/or tactile feedback, among others.
Fig. 6 illustrates a parameter calculation method 150, in accordance with one embodiment of the present invention. The parameter calculation method 150 may, for example, correspond to block 108 shown in Fig. 3. The parameter calculation method 150 generally begins at block 152, where a group of features can be received. Following block 152, the parameter calculation method 150 proceeds to block 154, where a determination can be made as to whether the number of features in the group has changed. For example, the number of features may have changed because the user picked up or placed an additional finger. Different fingers may be needed to perform different controls (e.g., tracking, gesturing). If the number of features has changed, the parameter calculation method 150 proceeds to block 156, where the initial parameter values can be calculated. If the number stays the same, the parameter calculation method 150 proceeds to block 158, where the current parameter values can be calculated. Thereafter, the parameter calculation method 150 proceeds to a reporting step, where the initial and current parameter values can be reported. By way of example, the initial parameter values may contain the average initial distance between points (or Distance (AVG) initial), and the current parameter values may contain the average current distance between points (or Distance (AVG) current). These may be compared in subsequent steps in order to control various aspects of a computer system.
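The Fig. 6 flow might be sketched as follows: the baseline is re-established whenever the finger count changes, and otherwise the current values are refreshed. It builds on the group_parameters sketch above; the mutable state dictionary is an illustrative assumption.

```python
def calculate_parameters(features, state):
    """Fig. 6 sketch: block 156 resets the baseline on a count change,
    block 158 refreshes the current values, then both are reported."""
    params = group_parameters(features)
    if state.get("count") != params["count"]:
        state["initial"] = params      # number of features changed: new baseline
    state["current"] = params
    state["count"] = params["count"]
    return state["initial"], state["current"]
```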
The above methods and techniques can be used to implement any number of GUI interface objects and actions. For example, gestures can be created to detect and effect user commands to resize a window, scroll a display, rotate an object, zoom in or out of a displayed view, delete or insert text or other objects, and the like.
A basic category of gestures should allow a user to input the normal commands that can be inputted through the use of a conventional mouse or trackball instrument. Fig. 7F shows a flow chart for processing the detection of mouse-click actions. Starting with block 710, one or two finger touches can be detected. If the detected touch can be determined 711 to be one finger, then a determination 712 can be made as to whether the touch is within a predetermined proximity of a displayed image object associated with a selectable file object; if so, a selection action is made 714. If a double tap action associated with a selectable object is detected 716, a double-click action can be invoked 718. A double tap action can be determined by detecting a finger leaving the touch screen and immediately retouching the touch screen. In accordance with an alternative embodiment, a double-click action can also be invoked if a touch of a finger on a selected object is detected to remain for more than a predetermined period of time.
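A hedged sketch of the one-finger branch of Fig. 7F follows; the timing constants and the object/touch attributes are assumptions chosen for illustration.

```python
def one_finger_action(touch, obj, now, last_tap_time,
                      double_tap_window=0.3, hold_time=1.0):
    """Classify a single-finger touch per the Fig. 7F flow (sketch)."""
    if obj is None or not getattr(obj, "selectable", False):
        return None
    if last_tap_time is not None and now - last_tap_time <= double_tap_window:
        return "double_click"   # blocks 716/718: two taps in quick succession
    if touch.duration >= hold_time:
        return "double_click"   # alternative: touch held past a preset period
    return "select"             # block 714: a single tap selects the object
```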
As shown in Fig. 7G, if the detected finger touch is not associated with a selectable file object but is determined 720 to be associated with a network address hyperlink, a single-click action can be invoked whereby the hyperlink is activated. If the hyperlink is touched within a non-browser environment, a browser application is also launched.
If a two-finger touch is detected 711, then, if at least one of the touch-down points is associated 713 with a selectable file object, that object is selected 715. If one or more taps by one of the fingers on the touch-sensitive display are detected 717 while the touch-down point is maintained, a right-click mouse action can be invoked.
In accordance with a preferred embodiment, if the detected touch or touches are not associated with any selectable file object or hyperlink, then, as shown in Fig. 7H, a determination 722 can be made as to whether the touch-down point(s) can be associated with a scrollable area, such as a text editing application window, a file listing window, or an Internet web page.
Scrolling generally pertains to moving displayed data or images across a viewing area on a display screen so that a new set of data can be brought into view in the viewing area. In most cases, once the viewing area is full, each new set of data appears at the edge of the viewing area, and all other sets of data move over one position. That is, a new set of data appears for each set of data that moves out of the viewing area. In essence, these functions allow a user to view consecutive sets of data currently outside the viewing area. In most cases, the user is able to accelerate the traversal through the data sets by moving their finger at greater speeds. Examples of scrolling through lists can be found in U.S. Patent Publications Nos. 2003/0076303A1, 2003/0076301A1, and 2003/0095096A1, which are incorporated herein by reference.
If the touch-down point(s) is/are located within, or may be within, a scrollable area, a scrolling action can be invoked 723, similar to the pressing down of a scroll wheel on a conventional mouse instrument. If the scrollable area is scrollable in only one direction (e.g., up and down), the scrolling action invoked will be unidirectional. If the scrollable area is scrollable in two dimensions, the scrolling action invoked will be omnidirectional.
In a unidirectional scrolling action where the scrolling may be restricted to the vertical direction (i.e., the Y axis), only the vertical vector component of the tracked touch movement will be used as the input for effecting vertical scrolling. Similarly, in a unidirectional scrolling action where the scrolling may be restricted to the horizontal direction (i.e., the X axis), only the horizontal vector component of the tracked touch movement will be used as the input for effecting horizontal scrolling. If the scrolling action is omnidirectional, the scrolling action effected will track the movement of the tracked touch.
In accordance with a preferred embodiment, if the detected touch is a one-finger touch, the scrolling action can be readied 724 at a normal, or 1X, speed. If and once the contacting finger begins to move on the touch screen, the scrolling action can be performed by tracking the movement of the touch-down point on the touch screen. If the detected touch is a two-finger touch, the scrolling action can be performed 725 at a double, or 2X, speed. Additional fingers can be added to perform even faster scrolling, and a detection of a four-finger touch within a multi-page document window may be interpreted as a "pg up" or "pg dn" command.
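The finger-count-to-speed mapping described above could be sketched as follows; the treatment of three fingers is an assumption, since the text specifies only the 1X, 2X, and four-finger paging cases.

```python
def scroll_command(num_fingers, delta_y, multi_page=False):
    """Map finger count to a scroll behaviour per the text above (sketch)."""
    if num_fingers >= 4 and multi_page:
        return ("pg up" if delta_y > 0 else "pg dn", None)
    speed = {1: 1.0, 2: 2.0}.get(num_fingers, float(num_fingers))
    return ("scroll", delta_y * speed)
```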
In accordance with another embodiment, the displayed data continues to move even when the finger is removed from the touch screen. The continuous motion is based at least in part on the previous motion. For example, the scrolling can be continued in the same direction and at the same speed. In some cases, the scrolling slows down over time, i.e., the speed of traversal through the media items gets slower and slower until the scrolling eventually stops, thereby leaving a static list. By way of example, each new media item brought into the viewing area may incrementally decrease the speed. Alternatively or additionally, the displayed data stops moving when the finger is placed back on the touch screen. That is, placing the finger back on the touch screen can implement braking, which stops or slows down the continuous-acting motion.
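A small sketch of this momentum behaviour: motion persists after lift-off, decays over time, and a renewed touch brakes it. The decay constant and frame step are assumptions.

```python
def step_scroll(position, velocity, finger_down, decay=0.95, dt=1.0 / 60):
    """Advance one frame of inertial scrolling (sketch)."""
    if finger_down:
        return position, 0.0          # braking: touching again stops the motion
    position += velocity * dt
    velocity *= decay                 # traversal slows until the list rests
    if abs(velocity) < 1e-3:
        velocity = 0.0
    return position, velocity
```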
To illustrate the gestural actions discussed above, as shown in Fig. 7A, using a touch screen (such as the multi-touch screen 24 shown in Fig. 2), a single finger tap with a finger 501 on an image object (e.g., a file listing 500) may be translated into the equivalent of a single mouse click, which in this instance may indicate a selection, typically indicated by highlighting the selected file or image object. A detected double tap on an image object may be translated into the equivalent of a double click of a mouse, which may invoke the launching of an application associated with the tapped image object. For instance, double tapping a file listing on the screen, such as a photo file, may cause a photo viewer application to be launched and that photo file to be opened.
A drag-and-drop function can be invoked by touching, with at least one finger, the image associated with the object to be dropped and graphically dragging the object to the desired drop location by maintaining the touch, as shown in Fig. 7B, which illustrates the file listing 500 being dragged and dropped from folder window 502 into folder window 503.
Certain mouse functionalities may require two touches to complete. For instance, as shown in Fig. 7C, a "right-click" gesture can be invoked with two fingers, one finger being a touch-down finger 506 and a second finger 507 tapping the screen at least once to indicate a right-click action. Fig. 7D illustrates that, after a right-click action may have been performed, an action window 504 can be invoked, after which the first finger can move over to the invoked window 504 to select and tap on an action item 505 with a single finger 506. In accordance with one embodiment of this invention, a right-click action may be effected only when the detected tap is in close proximity to the detected touch-down, and only when the detected tap is located to the left of the touch-down finger (which, from the user's vantage point, appears to the right of the touch-down finger).
Other file selection functions that would normally require a combination of mouse and keyboard actions can be performed using only touch actions. For instance, in the Microsoft Windows environment, in order to select multiple files within the file window 502, a user typically needs to drag the mouse icon over the series of files to be selected while holding down the shift button. Without holding down the shift button, the drag of the mouse icon may be interpreted as a drag-and-drop action. As shown in Fig. 7E, in accordance with one embodiment of this invention, a detection of two closely associated touches dragging across a file listing can be read as a multi-select action for selecting a group of files 508. In order to avoid misinterpreting the two-touch action as another command, such as a rotation action, the two-touch multi-select function is preferably invoked only when the two detected touches are in relatively close proximity to each other.
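The proximity test that disambiguates the two-touch multi-select from, say, a rotation could look like the following sketch; the pixel threshold is an assumption.

```python
from math import hypot

def is_multiselect_drag(touch_a, touch_b, max_separation=80.0):
    """Two dragging touches count as a multi-select only when close together."""
    return hypot(touch_a.x - touch_b.x, touch_a.y - touch_b.y) <= max_separation
```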
Referring back to the scrolling actions described with respect to Fig. 7H, and as shown in Figs. 7I and 7J, a one- or two-finger touch-down within a scrollable window can cause the displayed content of the window to scroll at different speeds. Specifically, once the scrolling action is invoked 723, the scrolling takes place at 1X speed 724 if it is determined that only one finger (or one touch-down point) is detected on the touch-sensitive display, and at 2X speed if two fingers (or two touch-down points) are detected. In accordance with a preferred embodiment, during the scrolling action, the scroll bars 727 and 728 move in accordance with the direction of the scrolling.
Finally, using a multi-touch display capable of proximity detection, such as the panels described in the aforementioned commonly assigned, co-pending, and incorporated-by-reference application No. 10/840,862 (published as U.S. Patent Publication No. US2006/0097991) and the application entitled "Proximity and Multi-Touch Sensor Detection and Demodulation," filed January 3, 2007, gestures of a finger can also be used to invoke a hovering action that can be the equivalent of hovering a mouse icon over an image object.
By way of example, referring to Fig. 7K, on a desktop 729, a proximity detection of the user's finger 501 over an application icon 731 may be interpreted as a hovering action, which invokes the rolling popup of the hovered application icon 730. If the user touches the popped-up icon, a double-click action can thereby be invoked, whereby the application can be launched. Similar concepts can be applied to application-specific situations, for example, when photo files are displayed in a thumbnail format within photo management software, where a proximity detection of a finger over a thumbnail invokes a hovering action whereby the size of the hovered photo thumbnail is enlarged (but not selected).
Gestures can also be used to invoke and manipulate virtual control interfaces, such as volume knobs, switches, sliders, keyboards, and other virtual interfaces that may be created to facilitate human interaction with a computing system or a consumer electronic item. As an example of invoking a virtual control interface via a gesture, and referring to Figs. 8A-8H, a rotate gesture for controlling a virtual volume knob 170 on a GUI interface 172 of a display 174 of a tablet PC 175 will be described. In order to actuate the knob 170, the user places their fingers 176 on a multipoint touch screen 178. The virtual control knob may already be displayed, or the particular number, orientation, or profile of the fingers at set-down, or the movement of the fingers immediately thereafter, or some combination of these and other characteristics of the user's interaction, may invoke the virtual control knob to be displayed. In either case, the computing system associates the finger group with the virtual control knob and makes a determination that the user intends to use the virtual volume knob.
This association can also be based in part on the mode or current state of the computing device at the time of the input. For example, the same gesture may be interpreted alternatively as a volume knob gesture if a song is currently playing on the computing device, or as a rotate command if an object editing application is being executed. Other user feedback may be provided, including, for example, audible or tactile feedback.
Once the knob 170 is displayed, as shown in Fig. 8A, the user's fingers 176 can be positioned around the knob 170 as if it were an actual knob or dial, and thereafter can be rotated around the knob 170 in order to simulate turning the knob 170. Again, audible feedback, for example in the form of a clicking sound, or tactile feedback, in the form of vibration, may be provided as the knob 170 is "rotated." The user may also use their other hand to hold the tablet PC 175.
As shown in Fig. 8B, the multipoint touch screen 178 detects at least a pair of images. In particular, a first image 180 is created at set-down, and at least one other image 182 can be created when the fingers 176 are rotated. Although only two images are shown, in most cases there would be many more images that occur incrementally between these two. Each image represents a profile of the fingers in contact with the touch screen at a particular instant in time. These images can also be referred to as touch images. It will be understood that the term "image" does not mean that the profile is displayed on the screen 178 (but rather imaged by the touch-sensing device). It should also be noted that although the term "image" is used, the data may be in other forms that are representative of the touch plane at various times.
As shown in Fig. 8C, each of the images 180 and 182 can be converted to a collection of features 184. Each feature 184 can be associated with a particular touch, for example from the tips of each of the fingers 176 surrounding the knob 170, as well as the thumb of the other hand 177 used to hold the tablet PC 175.
As shown in Fig. 8D, the features 184 are classified, i.e., each finger/thumb is identified, and grouped for each of the images 180 and 182. In this particular case, the features 184A associated with the knob 170 can be grouped together to form group 188, and the feature 184B associated with the thumb can be filtered out. In alternative arrangements, the thumb feature 184B can be treated as a separate feature by itself (or placed in another group), for example, to alter the input or operational mode of the system, or to implement another gesture, such as a slider gesture associated with an equalizer slider displayed on the screen in the area of the thumb (or another finger).
As shown in Fig. 8E, the key parameters of the feature group 188 can be calculated for each image 180 and 182. The key parameters associated with the first image 180 represent the initial state, and the key parameters of the second image 182 represent the current state.
Also as shown in Fig. 8E, the knob 170 is the UI element associated with the feature group 188 because of the group's proximity to the knob 170. Thereafter, as shown in Fig. 8F, the key parameter values of the feature group 188 from each image 180 and 182 can be compared to determine the rotation vector, i.e., the feature group rotated five (5) degrees clockwise from the initial to the current state. In Fig. 8F, the initial feature group (image 180) is shown in dashed lines, while the current feature group (image 182) is shown in solid lines.
As shown in Fig. 8G, based on the rotation vector, the speaker 192 of the tablet PC 175 increases (or decreases) its output in accordance with the amount of rotation of the fingers 176, i.e., increases the volume by 5% for a rotation of 5 degrees. The display 174 of the tablet PC can also adjust the rotation of the knob 170 in accordance with the amount of rotation of the fingers 176, i.e., the position of the knob 170 rotates five (5) degrees. In most cases, the rotation of the knob occurs simultaneously with the rotation of the fingers, i.e., for each degree of finger rotation the knob rotates a degree. In essence, the virtual control knob follows the gesture occurring on the screen. Still further, the audio unit 194 of the tablet PC may provide a clicking sound for each unit of rotation, e.g., five clicks for a rotation of five degrees. Still yet further, the haptics unit 196 of the tablet PC 175 may provide a certain amount of vibration or other tactile feedback for each click, thereby simulating an actual knob.
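The Fig. 8F comparison can be sketched as below: each touch's angle about the group centroid is compared between the initial and current images, and the mean change gives the rotation in degrees. Matching features by index and the wrap-around handling are assumptions.

```python
from math import atan2, degrees

def knob_rotation_degrees(initial_group, current_group):
    """Mean change in angle about the initial centroid, in degrees (sketch)."""
    cx = sum(f.x for f in initial_group) / len(initial_group)
    cy = sum(f.y for f in initial_group) / len(initial_group)
    deltas = []
    for f0, f1 in zip(initial_group, current_group):  # matched by identifier
        d = degrees(atan2(f1.y - cy, f1.x - cx) - atan2(f0.y - cy, f0.x - cx))
        deltas.append((d + 180.0) % 360.0 - 180.0)    # wrap into (-180, 180]
    return sum(deltas) / len(deltas)

# e.g. a 5-degree reading might then map to a 5% change in speaker volume
```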
It should be noted that additional gestures can be performed simultaneously with the virtual control knob gesture. For example, more than one virtual control knob can be controlled at the same time using both hands, i.e., one hand for each virtual control knob. Alternatively or additionally, one or more slider bars can be controlled at the same time as the virtual control knob, i.e., one hand operates the virtual control knob while at least one finger (and optionally more than one finger) of the opposite hand operates at least one slider bar (and optionally more than one slider bar), e.g., a slider bar for each finger.
It should also be noted that although this embodiment is described using a virtual control knob, in another embodiment the UI element can be a virtual scroll wheel. As an example, the virtual scroll wheel can mimic an actual scroll wheel, such as those described in U.S. Patent Publications Nos. US2003/0076303A1, US2003/0076301A1, and US2003/0095096A1, all of which are incorporated herein by reference.
Fig. 9 illustrates a touch-based method 200, in accordance with one embodiment of the present invention. The method generally begins at block 202, where a user input occurring over a multipoint sensing device can be detected. The user input may include one or more touch inputs, each touch input having a unique identifier. Following block 202, the touch-based method 200 proceeds to block 204, where the user input can be classified as a tracking or selecting input when it includes a single unique identifier (one touch input), or as a gesture input when it includes at least two unique identifiers (more than one touch input). If the user input is classified as a tracking input, the touch-based method 200 proceeds to block 206, where tracking corresponding to the user input can be performed.
If the user input is classified as a gesture input, the touch-based method 200 proceeds to block 208, where one or more gesture control actions corresponding to the user input can be performed. The gesture control actions may be based at least in part on changes that occur with, or between, the at least two unique identifiers.
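The dispatch of Fig. 9 reduces to counting unique identifiers, roughly as in this minimal sketch:

```python
def classify_user_input(touches):
    """Block 204: one identifier means tracking/selecting; two or more, a gesture."""
    return "tracking" if len(touches) <= 1 else "gesture"
```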
Fig. 10 is a diagram of a touch-based method 250 in accordance with one embodiment of the present invention. The touch-based method 250 generally begins at block 252, where an initial image can be captured during an input stroke on a touch sensitive surface. Following block 252, the touch-based method 250 proceeds to block 254, where the touch mode can be determined based on the initial image. For example, if the initial image comprises a single unique identifier, the touch mode can correspond to a tracking or selection mode.
On the other hand, if the initial image comprises more than one unique identifier, the touch mode can correspond to a gesture mode.
Following block 254, the touch-based method 250 proceeds to block 256, where the next image can be captured during the input stroke on the touch sensitive surface. Images are typically captured sequentially during the stroke, so there may be a plurality of images associated with the stroke.
Following block 256, the touch-based method 250 proceeds to block 258, where it can be determined whether the touch mode changed between capturing the initial image and capturing the next image. If the touch mode changed, the touch-based method 250 proceeds to block 260, where the next image can be set as the initial image, and the touch mode is thereafter determined again at block 254 based on the new initial image. If the touch mode stayed the same, the touch-based method 250 proceeds to block 262, where the initial image and the next image can be compared and one or more control signals can be generated based on the comparison.
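The following sketch models that loop, reducing each captured "image" to the set of unique touch identifiers it contains; the choice to advance the comparison baseline one image at a time is an assumption made for illustration, as the text leaves that detail open:

    # Sketch of blocks 252-262: determine the touch mode from the initial image,
    # restart whenever the mode changes, otherwise compare successive images.
    def touch_mode(image_ids):
        return "track" if len(image_ids) == 1 else "gesture"

    def process_stroke(images):
        """Yield (initial, next) image pairs whose comparison drives control signals."""
        initial = images[0]
        mode = touch_mode(initial)
        for nxt in images[1:]:
            if touch_mode(nxt) != mode:        # block 260: reset the baseline
                initial, mode = nxt, touch_mode(nxt)
            else:                              # block 262: compare and signal
                yield initial, nxt
                initial = nxt

    stroke = [{7}, {7}, {7, 12}, {7, 12}]
    pairs = list(process_stroke(stroke))       # one comparable pair per stable mode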
Fig. 11 is a diagram of a touch-based method 300 in accordance with one embodiment of the present invention. The touch-based method 300 generally begins at block 302, where an image object, which can be a GUI object, can be output. For example, a processor can instruct a display to display a particular image object. Following block 302, the touch-based method 300 proceeds to block 304, where a gesture input is received over the image object. For instance, a user can place or move their fingers in a gestural way on the surface of the touch screen while over the displayed image object. The gestural input can comprise one or more single gestures that occur successively or multiple gestures that occur simultaneously. Each of the gestures generally has a particular sequence, motion, or orientation associated therewith. For example, a gesture can include spreading fingers apart or closing fingers together, rotating a finger, and/or translating a finger, etc.
Following block 304, the touch-based method 300 proceeds to block 306, where the image object can be modified based on, and in unison with, the gesture input. By modified, it is meant that the image object changes according to the particular gesture or gestures being performed. By in unison, it is meant that the changes occur approximately while the gesture or gestures are being performed. In most cases, there is a one-to-one relationship between the gesture(s) and the changes occurring at the image object, and they occur substantially simultaneously. In essence, the image object follows the motion of the fingers. For example, spreading the fingers can enlarge the object, closing the fingers can reduce the image object, rotating the fingers can rotate the object, and translating the fingers can allow the image object to be panned or scrolled.
In one embodiment, block 306 can include determining which image object is associated with the gesture being performed, and thereafter locking the displayed object to the fingers disposed over it such that the image object changes in accordance with the gestural input. By locking or associating the fingers to the image object, the image object can continuously adjust itself in accordance with what the fingers are doing on the touch screen. Often the determination and locking occur at set down, i.e., when the fingers are positioned on the touch screen.
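A sketch of this set-down lock, assuming axis-aligned object bounds and a simple first-hit rule (both assumptions for illustration), follows:

    # Sketch of block 306's lock-on-set-down: hit-test the displayed objects at
    # finger set down and bind the touch to the object it lands on, so that
    # subsequent gesture deltas are routed to that object only.
    def lock_object_at_setdown(touch_points, objects):
        """Return the first object whose bounds contain any touch point."""
        for obj in objects:
            x0, y0, x1, y1 = obj["bounds"]
            if any(x0 <= x <= x1 and y0 <= y <= y1 for (x, y) in touch_points):
                return obj
        return None

    objects = [{"name": "map", "bounds": (0, 0, 800, 600)}]
    locked = lock_object_at_setdown([(120, 80), (200, 90)], objects)  # -> map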
Fig. 12 is a diagram of a zoom gesture method 350 in accordance with one embodiment of the present invention. The zoom gesture can be performed on a multipoint touch screen such as the multi-touch panel 24 shown in Fig. 2. The zoom gesture method 350 generally begins at block 352, where the presence of at least a first finger and a second finger on a touch sensitive surface is detected. The presence of at least two fingers can be used to indicate that the touch is a gestural touch rather than a tracking touch based on one finger. In some cases, the presence of only two fingers indicates that the touch is a gestural touch. In other cases, any number of fingers greater than two indicates that the touch is a gestural touch. In fact, the gestural touch can work whether two, three, four, or more fingers are touching, and even if the number changes during the gesture, i.e., only a minimum of two fingers is needed at any time during the gesture.
Following block 352, the zoom gesture method 350 proceeds to block 354, where the distance between at least two of the fingers is compared. The distance can be from finger to finger, or from each finger to some other reference point, e.g., the centroid. If the distance between the two fingers increases (spread apart), a zoom-in signal can be generated, as shown in block 356. If the distance between the two fingers decreases (close together), a zoom-out signal can be generated, as shown in block 358. In most cases, the set down of the fingers will associate or lock the fingers to a particular displayed image object. For example, the touch sensitive surface can be a touch screen, and the image object can be displayed on the touch screen. This typically occurs when at least one of the fingers is positioned over the image object. As a result, when the fingers are moved apart, the zoom-in signal can be used to increase the size of the embedded features in the image object, and when the fingers are pinched together, the zoom-out signal can be used to decrease the size of the embedded features in the object. The zooming typically occurs within a predefined boundary, such as the periphery of the display, the periphery of a window, and/or the edge of the image object, etc. The embedded features can be formed on a plurality of layers, each layer representing a different level of zoom.
In most cases, the amount of zooming varies according to the distance between the two objects. Furthermore, the zooming typically occurs substantially simultaneously with the motion of the objects. For instance, as the fingers spread apart or close together, the object zooms in or zooms out at the same time. Although this methodology is directed at zooming, it should be noted that it can also be used for enlarging or reducing. The zoom gesture method 350 is particularly useful in graphical programs such as publishing, photo, and drawing programs. Moreover, zooming can be used to control a peripheral device such as a camera, i.e., when the fingers spread apart, the camera zooms out, and when the fingers close together, the camera zooms in.
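The distance comparison of blocks 354-358 can be sketched as follows; the change threshold and the scale factor derived from the distance ratio are illustrative assumptions, not values fixed by the text:

    # Sketch of blocks 352-358: compare the inter-finger distance frame to frame
    # and emit a zoom-in signal when it grows or a zoom-out signal when it shrinks.
    import math

    def finger_distance(p1, p2):
        return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

    def zoom_signal(prev_pair, curr_pair, threshold=2.0):
        d0, d1 = finger_distance(*prev_pair), finger_distance(*curr_pair)
        if d1 - d0 > threshold:
            return ("zoom_in", d1 / d0)    # fingers spread apart
        if d0 - d1 > threshold:
            return ("zoom_out", d1 / d0)   # fingers pinched together
        return None                        # no significant change

    print(zoom_signal(((0, 0), (100, 0)), ((0, 0), (150, 0))))  # ('zoom_in', 1.5)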
Figures 13A-13H illustrate a zooming sequence using the method described above. Fig. 13A illustrates a display presenting an image object 364 in the form of a map of North America, where the image object 364 has embedded levels that can be zoomed. In some cases, as shown, the image object can be positioned inside a window that forms the boundary of the image object 364. Fig. 13B illustrates a user placing their fingers 366 over a region of North America 368, specifically over the United States 370, and more specifically over California 372. In order to zoom in on California 372, the user starts to spread their fingers 366 apart, as shown in Fig. 13C. As shown in Figures 13D-13H, as the fingers 366 spread further apart (the detected distance increases), the map zooms in further on Northern California 374, then to a particular region of Northern California 374, then to the Bay Area 376, then to the Peninsula 378 (e.g., the area between San Francisco and the San Jose area), and then to the city of San Carlos 380 located between San Francisco and San Jose. In order to zoom out of San Carlos 380 and back to North America 368, the fingers 366 are closed back together in the reverse order of the sequence described above.
Fig. 14 is a diagram of a pan method 400 in accordance with one embodiment of the present invention. The pan gesture can be performed on a multipoint touch screen. The pan method 400 generally begins at block 402, where the presence of at least a first object and a second object on a touch sensitive surface can be detected. The presence of at least two fingers can be used to indicate that the touch is a gestural touch rather than a tracking touch based on one finger. In some cases, the presence of only two fingers indicates that the touch is a gestural touch. In other cases, any number of fingers greater than two indicates that the touch is a gestural touch. In fact, the gestural touch can work whether two, three, four, or more fingers are touching, and even if the number changes during the gesture, i.e., only a minimum of two fingers is needed.
Following block 402, the pan method 400 proceeds to block 404, where the positions of the two objects are monitored as the two objects move together across the touch screen. Following block 404, the pan method 400 proceeds to block 406, where a pan signal can be generated when the positions of the two objects change relative to their initial positions. In most cases, the set down of the fingers will associate or lock the fingers to a particular image object displayed on the touch screen. Typically, this occurs when at least one of the fingers is positioned over a position on the image object. As a result, when the fingers are moved together across the touch screen, the pan signal can be used to translate the image in the direction of the fingers. In most cases, the amount of panning varies according to the distance the two objects move. Furthermore, the panning typically occurs substantially simultaneously with the motion of the objects. For instance, as the fingers move, the object moves with the fingers at the same time.
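A sketch of the pan signal, under the assumption that the translation is derived from the displacement of the touch centroid relative to set down, follows:

    # Sketch of blocks 402-406: generate a pan signal from the displacement of
    # the centroid of the touch points relative to their initial positions.
    def centroid(points):
        xs, ys = zip(*points)
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    def pan_signal(initial_points, current_points):
        """Return the (dx, dy) translation to apply to the locked image object."""
        (x0, y0) = centroid(initial_points)
        (x1, y1) = centroid(current_points)
        return (x1 - x0, y1 - y0)

    print(pan_signal([(10, 10), (30, 10)], [(25, 40), (45, 40)]))  # (15.0, 30.0)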
Figures 15A-15D illustrate a panning sequence based on the pan method 400 described above. Using the map of Fig. 13A, Fig. 15A illustrates a user placing their fingers 366 on the map. Once set down, the fingers 366 are locked to the map. As shown in Fig. 15B, when the fingers 366 are moved vertically up, the entire map 364 can be moved up, thereby causing previously seen portions of the map 364 to be placed outside the viewing area and previously unseen portions of the map 364 to be placed inside the viewing area. As shown in Fig. 15C, when the fingers 366 are moved horizontally sideways, the entire map 364 can be moved sideways, thereby causing previously seen portions of the map 364 to be placed outside the viewing area and previously unseen portions of the map 364 to be placed inside the viewing area. As shown in Fig. 15D, when the fingers 366 are moved diagonally, the entire map 364 can be moved diagonally, thereby causing previously seen portions of the map 364 to be placed outside the viewing area and previously unseen portions of the map 364 to be placed inside the viewing area. It should be appreciated that the motion of the map 364 follows the motion of the fingers 366. This process is similar to sliding a piece of paper along a table: the pressure the fingers exert on the paper locks the paper to the fingers, and when the fingers slide across the table, the paper moves with them.
Fig. 16 is a diagram of a rotate method 450 in accordance with one embodiment of the present invention. The rotate gesture can be performed on a multipoint touch screen. The rotate method 450 generally begins at block 452, where the presence of a first object and a second object can be detected. The presence of at least two fingers can be used to indicate that the touch is a gestural touch rather than a tracking touch based on one finger. In some cases, the presence of only two fingers indicates that the touch is a gestural touch. In other cases, any number of fingers greater than two indicates that the touch is a gestural touch. In still other cases, the gestural touch can work whether two, three, four, or more fingers are touching, and even if the number changes during the gesture, i.e., only a minimum of two fingers is needed.
Following block 452, the rotate method 450 proceeds to block 454, where the angle of each finger is set. The angle is typically determined relative to a reference point. Following block 454, the rotate method 450 proceeds to block 456, where a rotate signal can be generated when the angle of at least one of the objects changes relative to the reference point. In most cases, the set down of the fingers will associate or lock the fingers to a particular image object displayed on the touch screen. Typically, when at least one of the fingers is positioned over a position on the image object, the image object will be associated with or locked to the fingers. As a result, when the fingers are rotated, the rotate signal can be used to rotate the object in the direction of finger rotation (e.g., clockwise, counterclockwise). In most cases, the amount of rotation of the object varies according to the amount of finger rotation, i.e., if the fingers move 5 degrees, so will the object. Furthermore, the rotation typically occurs substantially simultaneously with the motion of the fingers. For instance, as the fingers rotate, the object rotates with the fingers at the same time.
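A sketch of the rotate signal for the two-finger case, using the angle of the line through the two contacts (the choice of reference is an assumption; the text permits any reference point), follows:

    # Sketch of blocks 452-456: the change in the finger pair's angle becomes
    # the rotate signal applied to the locked object.
    import math

    def pair_angle(p1, p2):
        """Angle in degrees of the line through two finger contacts."""
        return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

    def rotate_signal(initial_pair, current_pair):
        """Signed rotation in degrees; positive is counterclockwise."""
        delta = pair_angle(*current_pair) - pair_angle(*initial_pair)
        return (delta + 180) % 360 - 180   # normalize into [-180, 180)

    print(rotate_signal(((0, 0), (100, 0)), ((0, 0), (100, 9))))  # ~5.1 degrees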
Figures 17A-17C illustrate a rotating sequence based on the method described above. Using the map of Fig. 13A, Fig. 17A illustrates a user placing their fingers 366 on the map 364. Once set down, the fingers 366 are locked to the map 364. As shown in Fig. 17B, when the fingers 366 are rotated in a clockwise direction, the entire map 364 can be rotated in the clockwise direction in accordance with the rotating fingers 366. As shown in Fig. 17C, when the fingers 366 are rotated in a counterclockwise direction, the entire map 364 can be rotated in the counterclockwise direction in accordance with the rotating fingers 366.
It should be noted that although Figures 17A-17C illustrate the rotate gesture being performed with a thumb and an index finger, two fingers other than the thumb, such as an index finger and a middle finger, can also be used to perform the rotate gesture.
Moreover, in certain specific applications, two fingers may not be required to perform a rotate gesture. For instance, in accordance with a preferred embodiment and as shown in Figures 17D and 17E, a photo thumbnail can be rotated to a desired orientation (e.g., from a landscape orientation to a portrait orientation) via a single-finger gesture. Specifically, once a touch associated with a selectable photo thumbnail icon 741 is detected, and where the touch input is a gesture in which the detected touch forms a rotational or radial arc around the center portion of the thumbnail, the input is interpreted as an instruction to rotate the thumbnail in the direction of the rotational or radial arc. In accordance with a preferred embodiment, the rotation of the thumbnail icon will also cause the corresponding file object to change its orientation. In accordance with another embodiment, detecting a rotate gesture within a photo management application will also invoke a snap command, whereby the photo thumbnail is automatically rotated 90 degrees in the direction of rotation.
Figures 18A and 18B illustrate another exemplary embodiment of the invention described previously in Fig. 10, using gestural input via a UI element to edit a media file such as a photo. Specifically, as shown in Fig. 18A, within a photo editor environment 750 in which a photo image file (e.g., a JPEG file) 752 can be opened for editing, a UI element 751 can be provided for editing aspects of the photo. The UI element 751 can be a level slider bar for adjusting the level of a certain aspect of the photo. In the example shown in Fig. 18A, the UI element 751 can be an interface for receiving a touch gesture to adjust the brightness level of the photo. Specifically, as the tracked finger touch moves to the left on the bar, the brightness level decreases, and if the tracked touch moves to the right on the UI element, the brightness level increases. In accordance with one embodiment, the UI element is preferably transparent, so that the user can still see the photo image behind the UI element. In another embodiment, the size of the photo displayed on the screen can be reduced to make room for a separately displayed UI element, which can be placed immediately below the displayed photo.
Fig. 18B illustrates the ability to switch the mode of the gestural input via the UI element 751 by selectively using one or more contact points. Specifically, as shown in Fig. 18B, detecting a second contact point on the UI element 751 causes the operational mode to switch from brightness adjustment to contrast adjustment. In this instance, moving both contact points to the left or to the right causes the contrast level of the photo to decrease or increase, respectively. Detecting additional contact points (such as three or four fingers) can also be interpreted as instructions to switch to other operational modes (such as zooming, hue adjustment, gamma level, etc.). Note that although Figures 18A and 18B illustrate adjusting brightness and contrast levels via the UI element 751, a user can program or customize the UI element 751 so that the number of contact points is interpreted to mean other forms of operational mode. It should also be noted that the slider bar UI element 751 can take other forms, such as a virtual scroll wheel.
Fig. 18C is a flow chart illustrating an algorithm associated with the specific example discussed in Figures 18A and 18B above. Specifically, as shown in Fig. 18C, the UI element 751 is output 760 on the screen. If a gestural touch input is detected 761, further determinations 762-765 can be made as to how many contact points are associated with the touch. Depending on the number of contact points detected, the corresponding operational mode can be activated at 767-769. Once the appropriate operational mode is activated, tracking of the contact point(s) is detected 770 to effect 771 the corresponding adjustment in accordance with the operational mode. It should be noted that the operational mode can be switched at any point in time during the editing process, in that if a change in the number of contact point(s) is detected 772, the process loops back to determinations 762-764 to activate a new operational mode.
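That flow can be sketched as a small dispatch loop; the particular mode table below (one finger for brightness, two for contrast, and so on) is illustrative only, since the text notes the mapping is user-programmable:

    # Sketch of the Fig. 18C flow: the number of contact points on the slider
    # selects the operational mode; a change in that number at any point during
    # editing re-enters mode selection (step 772 looping back to 762-764).
    MODES = {1: "brightness", 2: "contrast", 3: "zoom", 4: "hue"}

    def run_slider(samples):
        """samples: iterable of (contact_count, horizontal_delta) readings."""
        adjustments = []
        mode = None
        for count, delta in samples:
            new_mode = MODES.get(count)
            if new_mode != mode:          # activate a new operational mode
                mode = new_mode
                continue
            if mode is not None:          # track the touch, apply the adjustment
                adjustments.append((mode, delta))
        return adjustments

    print(run_slider([(1, 0), (1, +5), (2, 0), (2, -3)]))
    # [('brightness', 5), ('contrast', -3)]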
Figures 18D and 18E illustrate using the same UI element 751 discussed above to invoke additional actions by inputting other gestural instructions. Specifically, while adjusting the brightness level of the displayed photo, a second finger can be used to effect a zoom-in or zoom-out action. The zoom-in and zoom-out actions can be invoked by detecting a second contact point together with a change in the proximity of the two contact points. The change in distance between the two contact points can be interpreted as a zoom-in or zoom-out action in accordance with the method shown in Fig. 12 and discussed above. It should be noted that, in accordance with one embodiment, the zoom action is not invoked if the detected second contact point remains at a constant distance from the first contact point; in that instance, the gesture is interpreted as an input for activating a second operational mode (e.g., switching from brightness adjustment to contrast adjustment as shown in Figures 18A and 18B).
Figures 19A and 19B illustrate an example of using gestural input to scroll through media files, such as photo files displayed in a photo editor. Specifically, as shown in Figures 19A and 19B, a touch detection zone 754 can be dedicated to a scrolling action, whereby an up-and-down finger movement gesture over the photo 752 displayed on the touch screen 750 can be interpreted as a gestural input for scrolling to the next photo 753. In accordance with a preferred embodiment, it is not necessary to display a UI element to invoke the scrolling mode of operation; rather, detecting a downward sliding action of a finger within the touch detection zone 754 (e.g., detecting a downward tracking movement of a contact point) can be sufficient to automatically invoke the scrolling action. In accordance with an alternative embodiment, a UI element can be displayed on the screen as a virtual vertical slider bar for indicating to the user that the scrolling action has been activated, and the area of the touch detection zone 754 within which the scrolling action can be continued.
In accordance with a preferred embodiment, if the detected downward tracking movement has more than one contact point (e.g., a two-finger swipe gesture), the scrolling can be performed at 2X speed, in a manner similar to that described above with respect to invoking a scrolling action within a scrollable area.
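A sketch of this finger-count speed rule follows; the behaviour beyond two fingers is an assumption made for illustration:

    # Sketch: a one-finger downward swipe advances one photo; a swipe with two
    # or more fingers scrolls at 2X, i.e., advances two photos.
    def photos_to_advance(finger_count, swipe_detected):
        if not swipe_detected:
            return 0
        return 2 if finger_count >= 2 else 1

    index = 0
    index += photos_to_advance(finger_count=2, swipe_detected=True)  # now 2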
Figures 19C and 19D illustrate another form of UI element, a virtual scroll wheel 755, for receiving gestural input to scroll through the display of photos. In this embodiment, the virtual scroll wheel can be invoked by the simple gesture of performing a circular touch on the photo with one finger, or by the touch-down of three fingers. Once the virtual scroll wheel UI element 755 is presented, the user can "rotate" the virtual scroll wheel to scroll through the photos. In this particular example, the speed of scrolling is controlled not by how many contact points are detected on the scroll wheel 755, but by the speed at which the contact point rotates about the center of the virtual scroll wheel 755.
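That rate rule can be sketched as follows; the ten-photos-per-revolution constant is a hypothetical tuning value, not something the text specifies:

    # Sketch: the scrolling rate follows the angular speed of the contact point
    # about the wheel centre, independent of how many contacts are on the wheel.
    import math

    def angular_speed(center, prev, curr, dt):
        """Degrees per second swept by a contact orbiting the wheel centre."""
        a0 = math.atan2(prev[1] - center[1], prev[0] - center[0])
        a1 = math.atan2(curr[1] - center[1], curr[0] - center[0])
        delta = math.degrees(a1 - a0)
        delta = (delta + 180) % 360 - 180          # shortest signed arc
        return delta / dt

    def scroll_rate(deg_per_sec, photos_per_revolution=10):
        return deg_per_sec / 360.0 * photos_per_revolution   # photos per second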
Figures 19E and 19F illustrate the concept of Figures 19A and 19B on the display screen 781 of a digital camera 780. In accordance with a preferred embodiment, the display screen 781 of the digital camera 780 can be made of a multi-touch sensitive panel, such as the multi-touch sensitive panel 2 described above in Fig. 2.
Fig. 19E illustrates an embodiment in which, in a playback mode of the digital camera 780, detecting a vertically downward swipe gestural input of at least one finger in a touch detection zone 782 invokes a playback scrolling action, whereby the next photo can be displayed. In accordance with another embodiment, a downward gestural input on any portion of the display screen 781 can automatically invoke the scrolling action.
Fig. 19F illustrates an alternative embodiment of Fig. 19E, in which two touches are required to be detected in order to invoke the playback scrolling. Specifically, the combination of a contact point at contact zone 783 along with a downward sliding input at or near contact zone 782 can invoke the scrolling action to display the next photo. It should be noted that the methods described in Figures 19A through 19E do not require any particular form factor, as the methods can be implemented on a PC monitor, a laptop monitor, a digital camera, or any type of device having a touch screen.
Fig. 19G illustrates additional gestures that can be input during the playback of media files, such as photo files, in accordance with another embodiment. Specifically, similar to the embodiments shown in Figures 18A and 18B, the same movement can be interpreted differently by distinguishing the number of contact points on the touch-sensitive display (i.e., the number of fingers). In this example, a vertically downward swipe gesture with two fingers can be interpreted as a gestural input for deleting the photo file, tagging the photo file (for purposes such as compiling a photo album), or any other useful command.
Fig. 19H illustrates detecting still other gestures using other designated UI zones of the touch-sensitive display. In this example, detecting a contact point at another designated zone 756 can be interpreted to mean a delete, tag, or other useful command. In accordance with one embodiment, the multiple contact zones can be displayed as transparent overlays on the photo file.
It should be noted that although Fig. 19 illustrates swipe gestures in the vertically downward direction, it is also contemplated that swiping in the vertically upward direction or in the horizontal directions can likewise be designated as gestural input for the same commands.
Fig. 20 illustrates one possible algorithm for implementing the methods shown in Figures 19A-19F. Specifically, at the first step, one of a plurality of photos is displayed 790 on a touch-sensitive display. If a touch on the display screen is detected 791, it can be determined 792 whether the touch is a gestural input, and the type of gestural input received 793 (e.g., a downward tracking sliding action, a circular tracking rotational action, etc.). In accordance with the detected gestural input, a UI element (e.g., a slider bar or a virtual scroll wheel) can be output 794 as needed, whereafter the action corresponding to the use of the UI element or the gestural input can be invoked 795.
It should be noted that the methods described in Figures 18-20 can also be implemented within a video environment. Specifically, during playback of a video file, a UI element such as the horizontal slider bar shown in Fig. 18A can likewise be invoked and displayed, whereby, depending on the number of contact points detected, an operational mode can be activated for changing certain adjustable aspects of the video, such as brightness and contrast. At the same time, the scrolling and zooming methods shown in Figures 19A-19F can be effected in a similar manner, although instead of scrolling, rewind and fast-forward actions would be performed.
Additional editing/playback functions of video files can be effected using gestural inputs over certain pre-existing control elements. In accordance with a preferred embodiment, non-linear playback of a video file can be effected by selectively contracting or expanding a playback timeline bar. Specifically, Fig. 21A illustrates a video application 790 (such as a video playback application) displaying a video playback 791 together with a progress bar 792, on which a playback queue 793 indicates the time progress of the video playback.
In accordance with a preferred embodiment, the playback queue 793 can be moved forward or backward on the progress bar 792 to effect fast forwarding and rewinding of the video. The queue can also be held at the same position or otherwise modulated at a non-linear speed to effect variable-speed playback or pausing of the video. In accordance with a preferred embodiment, the video application 790 can be displayed on a touch-sensitive display, and the position of the playback queue 793 can be manipulated by a manual touch of a finger of hand 501 at the position on the screen where the queue is displayed. That is, the playback queue 793 can serve both as a progress indicator and as a UI element for controlling the speed and temporal location of the video playback.
In accordance with a preferred embodiment, the entire progress bar 792 can serve as a UI element whereby a user can effect non-linear playback of the video by expanding or contracting one or more portions of the progress bar. Specifically, as shown in Fig. 21B, the UI element progress bar 792 can be manipulated via a two-finger zoom-in or zoom-out gesture (as discussed above with respect to Fig. 12). In the example shown in Fig. 21B, a zoom-in gesture invokes an expansion of the playback time between the 60-minute and 80-minute marks. In this example, the playback speed of the video becomes non-linear in that the playback speed of the video can be slowed down during the time period between the 60- and 80-minute marks. Alternatively, the playback speed of the video can be accelerated between the 0- and 60-minute marks and after the 80-minute mark, while playback between the 60- and 80-minute marks proceeds at normal speed.
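One way to model this, as a sketch under stated assumptions (a constant 2X stretch inside the expanded span; the text does not fix the factor), is as a piecewise playback-rate function over the timeline:

    # Sketch of Fig. 21B's non-linear playback: expanding the 60-80 minute span
    # of the progress bar is equivalent to slowing playback inside that span.
    def playback_rate(t_minutes, span=(60, 80), stretch=2.0):
        """Relative playback speed at media time t (1.0 = normal speed)."""
        lo, hi = span
        return 1.0 / stretch if lo <= t_minutes < hi else 1.0

    # Wall-clock length of a 100-minute video under this warp: 80 normal minutes
    # plus 20 minutes played at half speed = 120 minutes.
    total = sum(1.0 / playback_rate(t) for t in range(100))   # 120.0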
Fig. 21C illustrates an additional UI element 794 displayed within the video application 790. In this embodiment, the UI element 794 can be a virtual scroll wheel by which a user can further control the playback speed of the video. In conjunction with the manipulation of the progress bar 792, a user can first designate the portion of the video where the playback speed is to be slowed, whereafter the user can use the scroll wheel 794 to further adjust the playback queue 793 to control the playback direction and/or speed of the video.
Fig. 21D illustrates other additional touch-sensitive UI elements that can be added to the video application 790 for editing purposes. For instance, as shown in Fig. 21D, a slider bar UI element 796 can be added to detect gestural inputs for invoking level adjustments, such as pan adjustment or adjustment of levels such as brightness, contrast, hue, and gamma. Similar to the UI element 751 discussed with reference to Figures 18A-18E, the slider bar UI element 796 can be used to invoke different operational modes by varying the number of contact points on the slider bar UI element 796.
A UI element 795 can also be displayed within the video application 790 to effect sound editing of the video. Specifically, the UI element 795 can include a plurality of level adjustments for the recording or playback of different channels of sound or music to be mixed with the video.
In accordance with a preferred embodiment, a user of the video application 790 can customize which UI elements are displayed, and can additionally program the UI elements to perform a desired function.
Figures 22A and 22B illustrate an exemplary algorithm 800 for effecting the methods described with respect to Figures 21A-21D. Specifically, as shown in Fig. 22A, the video application 790 can be launched to provide video playback and/or editing 802. A progress bar 792 can be displayed 803. If a touch is detected 804 over the progress bar 792, it can be determined 805 whether the touch is a zoom-in or zoom-out command. If the touch is not detected to be a zoom command, the playback queue can be manipulated in accordance with the tracked touch input. If the touch is detected to be a zoom gesture, the portion of the progress bar where the touch is detected can be manipulated to expand or contract in accordance with the gestural input.
In Fig. 22B, steps 808-810 can be performed to optionally display additional UI elements, such as the scroll wheel, the sound mixer, and the slider bar level adjustments, respectively. Touch(es) can be detected at steps 811-813, whereafter the appropriate functions 814-818 can be invoked.
Fig. 23 illustrates another embodiment of the present invention for manipulating the playback and recording of audio or music files. As shown in Fig. 23, a music application 830 can display a pair of virtual turntables 842 and 843, on which two musical records 834 and 835 are being played, each record being either a single or an LP. The records 834 and 835 can be graphical representations of digital music files (e.g., song A and song B) being played back via the music application 830. In other words, the records can be graphical imprints of the music files, as if the music files were imprinted onto physical records.
Similar to a pair of physical turntables, stylus 844 and stylus 855 can be graphical icon indications of playback cues, the position of which can be changed by touching the cue on the touch-sensitive display screen and dragging the icon to the desired position on the graphical record. Moving the stylus causes a jump in the playback point of the corresponding song, just as on a physical turntable.
Also similar to a pair of physical turntables, start/stop buttons 838 and 839 can be touched by one or more fingers to toggle the start or stop/pause of the song reproduction. Speed variation bars 840 and 841 can be adjusted linearly to control the playback speed of the songs. Windows 831 and 833 can graphically reproduce the frequency representation of the songs being reproduced, while window 832 can display the frequency representation of the actual output of the music application 832, which can simply be one of the songs being reproduced, or a mix/combination of the songs. A mixing/panning bar 850 can be manipulated to modulate or demodulate the two songs being reproduced.
During song reproduction, the records 834 and 835 can be manipulated similarly to physical records. For instance, a rapid back-and-forth movement of a record can cause the sound effect of record "scratching", as disc jockeys often do on physical turntables.
It should be noted that the methods described above can be implemented simultaneously during the same gestural stroke. That is, selecting, tracking, zooming, rotating, and panning can all be performed during one gestural stroke, which can include spreading, rotating, and sliding the fingers. For example, upon set down of at least two fingers, the displayed object (map) can be associated with or locked to the two fingers. In order to zoom, the user can spread or close their fingers. In order to rotate, the user can rotate their fingers. In order to pan, the user can slide their fingers. Each of these actions can occur simultaneously in one continuous motion. For example, the user can spread and close their fingers while rotating and sliding them across the touch screen. Alternatively, the user can segment each of these motions without having to reset the gestural stroke. For example, the user can first spread their fingers, then rotate their fingers, then close their fingers, then slide their fingers, and so on.
It should also be noted that it is not always necessary to use a human finger to effect gestural input. Where possible, a pointing device such as a stylus is also sufficient to effect gestural input.
Additional examples of gestural strokes that can be used as inputs for effecting interface commands, including interactions with UI elements, are illustrated and described in commonly assigned co-pending application No. 10/903,964, published as U.S. Patent Publication No. US2006/0026521, and application No. 11/038,590, published as U.S. Patent Publication No. US2006/0026535, both of which are incorporated herein by reference in their entirety.
Many alterations and modifications can be made by those having ordinary skill in the art without departing from the spirit and scope of the invention. Therefore, it must be understood that the illustrated embodiments have been set forth only for purposes of example, and that they should not be taken as limiting the invention as defined by the following claims. For example, although many embodiments of the invention are described herein with respect to personal computing devices, it should be understood that the invention is not limited to desktop or laptop computers, but is generally applicable to other computing applications such as mobile communication devices, standalone multimedia reproduction devices, etc.
The words used in this specification to describe the invention and its various embodiments are to be understood not only in the sense of their commonly defined meanings, but to include by special definition in this specification any structure, material, or acts beyond the scope of the commonly defined meanings. Thus, if an element can be understood in the context of this specification as including more than one meaning, then its use in a claim must be understood as being generic to all possible meanings supported by the specification and by the word itself.
The definitions of the words or elements of the following claims are therefore defined to include not only the combination of elements which are literally set forth in this specification, but all equivalent structure, material, or acts for performing substantially the same function in substantially the same way to obtain substantially the same result. In this sense, it is therefore contemplated that an equivalent substitution of two or more elements can be made for any one of the elements in the claims, or that a single element can be substituted for two or more elements in a claim.
Insubstantial changes from the claimed subject matter as viewed by a person of ordinary skill in the art, now known or later devised, are expressly contemplated as being equivalently within the scope of the claims. Therefore, obvious substitutions now or later known to one of ordinary skill in the art are defined to be within the scope of the defined claim elements.
The claims are thus to be understood to include what is specifically illustrated and described above, what is conceptually equivalent, and what can be obviously substituted. For instance, the term "computer" or "computer system" as recited in the claims shall at least encompass a desktop computer, a laptop computer, or any mobile computing device such as a mobile communication device (e.g., a cellular or Wi-Fi/Skype phone, an e-mail communication device, or a personal digital assistant device) and a multimedia reproduction device (e.g., an iPod, an MP3 player, or any digital graphics/photo reproduction device).

Claims (6)

1. A method on a computing device having a touch-sensitive display, comprising:
playing a video file on the touch-sensitive display;
displaying a playback progress bar on the touch-sensitive display, the playback progress bar comprising a timeline indicating the length of time of the video file in progress;
detecting two touch points making initial contact at respective initial positions on the playback progress bar;
detecting the two touch points moving away from each other on the playback progress bar; and
in response to detecting the two touch points moving away from each other on the playback progress bar:
expanding a portion of the playback progress bar in accordance with the movement of the two touch points, wherein the expanded portion of the playback progress bar is determined by the respective initial positions of the two touch points; and
contracting portions of the playback progress bar outside the expanded portion.
2. The method of claim 1, further comprising:
causing a playback speed of the video file within a time range corresponding to the expanded portion of the playback progress bar to be slower than the playback speed within time ranges corresponding to the portions of the playback progress bar outside the expanded portion.
3. The method of claim 1, further comprising:
detecting the two touch points moving toward each other on the playback progress bar; and
in response to detecting the two touch points moving toward each other on the playback progress bar:
contracting a portion of the playback progress bar in accordance with the movement of the two touch points, wherein the contracted portion of the playback progress bar is determined by the respective initial positions of the two touch points, and
expanding portions of the playback progress bar outside the contracted portion.
4. An apparatus on a computing device having a touch-sensitive display, comprising:
means for playing a video file on the touch-sensitive display;
means for displaying a playback progress bar on the touch-sensitive display, the playback progress bar comprising a timeline indicating the length of time of the video file in progress;
means for detecting two touch points making initial contact at respective initial positions on the playback progress bar;
means for detecting the two touch points moving away from each other on the playback progress bar; and
means, responsive to detecting the two touch points moving away from each other on the playback progress bar, for expanding a portion of the playback progress bar in accordance with the movement of the two touch points and contracting portions of the playback progress bar outside the expanded portion, wherein the expanded portion of the playback progress bar is determined by the respective initial positions of the two touch points.
5. The apparatus of claim 4, further comprising:
means for causing a playback speed of the video file within a time range corresponding to the expanded portion of the playback progress bar to be slower than the playback speed within time ranges corresponding to the portions of the playback progress bar outside the expanded portion.
6. The apparatus of claim 4, further comprising:
means for detecting the two touch points moving toward each other on the playback progress bar; and
means, responsive to detecting the two touch points moving toward each other on the playback progress bar, for contracting a portion of the playback progress bar in accordance with the movement of the two touch points and expanding portions of the playback progress bar outside the contracted portion, wherein the contracted portion of the playback progress bar is determined by the respective initial positions of the two touch points.
CN200780051755.2A 2007-01-05 2007-12-28 Controlling, manipulating, and editing gestures of media files using touch sensitive devices Active CN101611373B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310719094.3A CN103631496B (en) 2007-01-05 2007-12-28 Attitude using touch-sensitive device control, manipulation and editing media file

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US87875407P 2007-01-05 2007-01-05
US60/878,754 2007-01-05
US11/818,342 2007-06-13
US11/818,342 US7956847B2 (en) 2007-01-05 2007-06-13 Gestures for controlling, manipulating, and editing of media files using touch sensitive devices
PCT/US2007/089162 WO2008083360A1 (en) 2007-01-05 2007-12-28 Gestures for controlling, manipulating, and editing of media files using touch sensitive devices

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201310719094.3A Division CN103631496B (en) 2007-01-05 2007-12-28 Attitude using touch-sensitive device control, manipulation and editing media file

Publications (2)

Publication Number Publication Date
CN101611373A CN101611373A (en) 2009-12-23
CN101611373B true CN101611373B (en) 2014-01-29

Family

ID=38860004

Family Applications (2)

Application Number Title Priority Date Filing Date
CN 200720194296 Expired - Lifetime CN201181467Y (en) 2007-01-05 2007-12-05 Hand-hold mobile communicating device
CN200780051755.2A Active CN101611373B (en) 2007-01-05 2007-12-28 Controlling, manipulating, and editing gestures of media files using touch sensitive devices

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN 200720194296 Expired - Lifetime CN201181467Y (en) 2007-01-05 2007-12-05 Hand-hold mobile communicating device

Country Status (2)

Country Link
CN (2) CN201181467Y (en)
DE (1) DE202007014957U1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105224220A (en) * 2015-09-08 2016-01-06 深圳市金立通信设备有限公司 A kind of control method of media play and device

Families Citing this family (107)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8018440B2 (en) 2005-12-30 2011-09-13 Microsoft Corporation Unintentional touch rejection
TWI363983B (en) * 2008-04-25 2012-05-11 Benq Corp Interactive electronic apparatus and interaction method thereof
DE102008032451C5 (en) * 2008-07-10 2017-10-19 Rational Ag Display method and cooking appliance therefor
DE102008032448B4 (en) 2008-07-10 2023-11-02 Rational Ag Display method and cooking device therefor
EP2187290A1 (en) * 2008-11-18 2010-05-19 Studer Professional Audio GmbH Input device and method of detecting a user input with an input device
US8957865B2 (en) 2009-01-05 2015-02-17 Apple Inc. Device, method, and graphical user interface for manipulating a user interface object
US8330733B2 (en) * 2009-01-21 2012-12-11 Microsoft Corporation Bi-modal multiscreen interactivity
KR20100086678A (en) * 2009-01-23 2010-08-02 삼성전자주식회사 Apparatus and method for playing of multimedia item
CN101799727B (en) * 2009-02-11 2012-11-07 晨星软件研发(深圳)有限公司 Signal processing device and method of multipoint touch interface and selecting method of user interface image
CN101847055A (en) * 2009-03-24 2010-09-29 鸿富锦精密工业(深圳)有限公司 Input method based on touch screen
KR101640463B1 (en) 2009-05-19 2016-07-18 삼성전자 주식회사 Operation Method And Apparatus For Portable Device
KR101646922B1 (en) 2009-05-19 2016-08-23 삼성전자 주식회사 Operation Method of associated with a communication function And Portable Device supporting the same
US8373669B2 (en) * 2009-07-21 2013-02-12 Cisco Technology, Inc. Gradual proximity touch screen
KR101608770B1 (en) * 2009-08-03 2016-04-04 엘지전자 주식회사 Mobile terminal and method for controlling the same
AU2010297695A1 (en) * 2009-09-23 2012-05-03 Dingnan Han Method and interface for man-machine interaction
TWI430150B (en) * 2009-10-09 2014-03-11 Egalax Empia Technology Inc Method and device for analyzing two dimension sensing information
CN102043507B (en) 2009-10-09 2013-10-23 禾瑞亚科技股份有限公司 Method and device for analyzing positions
TWI457795B (en) 2009-10-09 2014-10-21 Egalax Empia Technology Inc Method and device for position detection
US8842079B2 (en) 2009-10-09 2014-09-23 Egalax—Empia Technology Inc. Method and device for determining a touch or touches
EP2503432A4 (en) 2009-10-09 2014-07-23 Egalax Empia Technology Inc Method and device for dual-differential sensing
US9864471B2 (en) 2009-10-09 2018-01-09 Egalax_Empia Technology Inc. Method and processor for analyzing two-dimension information
CN102043552B (en) 2009-10-09 2016-04-20 禾瑞亚科技股份有限公司 The method and apparatus of capacitive position detection
TWI464623B (en) 2009-10-09 2014-12-11 Egalax Empia Technology Inc Method and device for transoforming sensing information
JP2011107912A (en) * 2009-11-16 2011-06-02 Sony Corp Apparatus, method and program for processing information
US8786559B2 (en) * 2010-01-06 2014-07-22 Apple Inc. Device, method, and graphical user interface for manipulating tables using multi-contact gestures
KR101691938B1 (en) 2010-01-06 2017-01-03 삼성전자주식회사 Method and apparatus for setting of repeat playing section in a portable terminal
US9411504B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Copy and staple gestures
US8261213B2 (en) 2010-01-28 2012-09-04 Microsoft Corporation Brush, carbon-copy, and fill gestures
US9519356B2 (en) 2010-02-04 2016-12-13 Microsoft Technology Licensing, Llc Link gestures
US9274682B2 (en) 2010-02-19 2016-03-01 Microsoft Technology Licensing, Llc Off-screen gestures to create on-screen input
US9965165B2 (en) 2010-02-19 2018-05-08 Microsoft Technology Licensing, Llc Multi-finger gestures
US9367205B2 (en) 2010-02-19 2016-06-14 Microsoft Technolgoy Licensing, Llc Radial menus with bezel gestures
US9310994B2 (en) 2010-02-19 2016-04-12 Microsoft Technology Licensing, Llc Use of bezel as an input mechanism
US9454304B2 (en) 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
US8539384B2 (en) * 2010-02-25 2013-09-17 Microsoft Corporation Multi-screen pinch and expand gestures
US9075522B2 (en) 2010-02-25 2015-07-07 Microsoft Technology Licensing, Llc Multi-screen bookmark hold gesture
CN102193714A (en) * 2010-03-11 2011-09-21 龙旗科技(上海)有限公司 Man-machine interactive mode for data grouping management of mobile terminal
JP5557314B2 (en) * 2010-03-24 2014-07-23 Necカシオモバイルコミュニケーションズ株式会社 Terminal device and program
US9990062B2 (en) * 2010-03-26 2018-06-05 Nokia Technologies Oy Apparatus and method for proximity based input
CN101853128A (en) * 2010-05-08 2010-10-06 杭州惠道科技有限公司 Multi-touch method for human-computer interface of slide-wheel
US20110298720A1 (en) * 2010-06-02 2011-12-08 Rockwell Automation Technologies, Inc. System and method for the operation of a touch screen
CN102270081B (en) * 2010-06-03 2015-09-23 腾讯科技(深圳)有限公司 A kind of method and device adjusting size of list element
EP2395440A3 (en) * 2010-06-14 2012-01-11 Lg Electronics Inc. Mobile terminal and conrolling method thereof
CN101957718A (en) * 2010-06-22 2011-01-26 宇龙计算机通信科技(深圳)有限公司 Method and device for moving icons and digital terminal
US8773370B2 (en) 2010-07-13 2014-07-08 Apple Inc. Table editing systems with gesture-based insertion and deletion of columns and rows
CN101986249A (en) * 2010-07-14 2011-03-16 上海无戒空间信息技术有限公司 Method for controlling computer by using gesture object and corresponding computer system
US8982060B2 (en) * 2010-08-27 2015-03-17 Apple Inc. Touch and hover sensor compensation
KR20120020247A (en) * 2010-08-27 2012-03-08 삼성전자주식회사 Portable electronic device, apparatus and method for playing contents
US8972467B2 (en) 2010-08-31 2015-03-03 Sovanta Ag Method for selecting a data set from a plurality of data sets by means of an input device
US8767019B2 (en) 2010-08-31 2014-07-01 Sovanta Ag Computer-implemented method for specifying a processing operation
US9710154B2 (en) 2010-09-03 2017-07-18 Microsoft Technology Licensing, Llc Dynamic gesture parameters
CN101945499A (en) * 2010-09-06 2011-01-12 深圳市同洲电子股份有限公司 Method, terminal and system for transferring files
KR101685991B1 (en) * 2010-09-30 2016-12-13 엘지전자 주식회사 Mobile terminal and control method for mobile terminal
CN102467327A (en) * 2010-11-10 2012-05-23 上海无戒空间信息技术有限公司 Method for generating and editing gesture object and operation method of audio data
CN102025831A (en) * 2010-11-18 2011-04-20 华为终端有限公司 Multimedia playing method and terminal
KR20120075839A (en) * 2010-12-29 2012-07-09 삼성전자주식회사 Method and apparatus for providing mouse right click function in touch screen terminal
CA2823388A1 (en) * 2011-01-06 2012-07-12 Tivo Inc. Method and apparatus for gesture based controls
CN102681748B (en) * 2011-03-09 2015-01-28 联想(北京)有限公司 Information processing equipment and information processing method
CN102799299B (en) * 2011-05-27 2015-11-25 华硕电脑股份有限公司 The computer system of tool touch control screen and the disposal route of gesture thereof
US9281010B2 (en) * 2011-05-31 2016-03-08 Samsung Electronics Co., Ltd. Timeline-based content control method and apparatus using dynamic distortion of timeline bar, and method and apparatus for controlling video and audio clips using the same
JP5751030B2 (en) * 2011-06-03 2015-07-22 ソニー株式会社 Display control apparatus, display control method, and program
CN102243662A (en) * 2011-07-27 2011-11-16 北京风灵创景科技有限公司 Method for displaying browser interface on mobile equipment
CN102243573A (en) * 2011-07-27 2011-11-16 北京风灵创景科技有限公司 Method and device for managing element attribute in application program
EP3413575A1 (en) 2011-08-05 2018-12-12 Samsung Electronics Co., Ltd. Method for controlling electronic apparatus based on voice recognition and electronic apparatus applying the same
KR101262700B1 (en) * 2011-08-05 2013-05-08 삼성전자주식회사 Method for Controlling Electronic Apparatus based on Voice Recognition and Motion Recognition, and Electric Apparatus thereof
CN103001933A (en) * 2011-09-15 2013-03-27 北京同步科技有限公司 Interactive multimedia information distribution terminal and information distribution method thereof
CN102890694A (en) * 2011-09-22 2013-01-23 北京师科阳光信息技术有限公司 Time shaft system and implementation method thereof
US9052810B2 (en) 2011-09-28 2015-06-09 Sonos, Inc. Methods and apparatus to manage zones of a multi-zone media playback system
CN103092389A (en) * 2011-11-04 2013-05-08 德尔福技术有限公司 Touch screen device and method for achieving virtual mouse action
US9405463B2 (en) * 2011-11-25 2016-08-02 Samsung Electronics Co., Ltd. Device and method for gesturally changing object attributes
CN103247310A (en) * 2012-02-14 2013-08-14 索尼爱立信移动通讯有限公司 Multimedia playing control method, playing control module and playing terminal
CN102609143A (en) * 2012-02-15 2012-07-25 张群 Handheld electronic equipment and video playing and controlling method thereof
JP6004693B2 (en) * 2012-03-23 2016-10-12 キヤノン株式会社 Display control apparatus and control method thereof
US20130257792A1 (en) * 2012-04-02 2013-10-03 Synaptics Incorporated Systems and methods for determining user input using position information and force sensing
JP6004716B2 (en) * 2012-04-13 2016-10-12 キヤノン株式会社 Information processing apparatus, control method therefor, and computer program
CN102750096A (en) * 2012-06-15 2012-10-24 深圳乐投卡尔科技有限公司 Vehicle-mounted Android platform multi-point gesture control method
CN102866988B (en) * 2012-08-28 2015-10-21 中兴通讯股份有限公司 A kind of terminal and realization towing thereof copy the method for paste text
US20140109012A1 (en) * 2012-10-16 2014-04-17 Microsoft Corporation Thumbnail and document map based navigation in a document
US20140118265A1 (en) * 2012-10-29 2014-05-01 Compal Electronics, Inc. Electronic apparatus with proximity sensing capability and proximity sensing control method therefor
US9582122B2 (en) 2012-11-12 2017-02-28 Microsoft Technology Licensing, Llc Touch-sensitive bezel techniques
CN103035273B (en) * 2012-12-12 2016-05-25 宁波高新区百瑞音响科技有限公司 A kind of device that utilizes knob type digital code switch to switch audio file
KR102091077B1 (en) * 2012-12-14 2020-04-14 삼성전자주식회사 Mobile terminal and method for controlling feedback of an input unit, and the input unit and method therefor
CN103885623A (en) * 2012-12-24 2014-06-25 腾讯科技(深圳)有限公司 Mobile terminal, system and method for processing sliding event into editing gesture
CN103902173B (en) * 2012-12-26 2017-12-26 联想(北京)有限公司 Portable terminal and its information processing method and display processing method
CN103076985B (en) * 2013-01-31 2016-03-02 北京魔力时间科技有限公司 Accurately manipulate and display video playing progress rate device and using method based on touch screen
WO2014132893A1 (en) * 2013-02-27 2014-09-04 アルプス電気株式会社 Operation detection device
US11209975B2 (en) 2013-03-03 2021-12-28 Microsoft Technology Licensing, Llc Enhanced canvas environments
CN104035696B (en) * 2013-03-04 2017-12-19 观致汽车有限公司 Display methods and device of the vehicle-mounted message center in touch display interface
CN104123088B (en) * 2013-04-24 2018-01-02 华为技术有限公司 Mouse action implementation method and its device and touch screen terminal
US10180728B2 (en) 2013-05-17 2019-01-15 Citrix Systems, Inc. Remoting or localizing touch gestures at a virtualization client agent
CN104216625A (en) * 2013-05-31 2014-12-17 华为技术有限公司 Display object display position adjusting method and terminal equipment
CN103327247B (en) * 2013-06-17 2017-01-11 神思依图(北京)科技有限公司 Portrait collection operation device and method
JP6189680B2 (en) * 2013-08-23 2017-08-30 シャープ株式会社 Interface device, interface method, interface program, and computer-readable recording medium storing the program
US9519420B2 (en) * 2013-10-16 2016-12-13 Samsung Electronics Co., Ltd. Apparatus and method for editing synchronous media
CN104902331B (en) * 2014-03-07 2018-08-10 联想(北京)有限公司 A kind of playing progress rate adjusting method and electronic equipment
US9477337B2 (en) 2014-03-14 2016-10-25 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
CN104077028A (en) * 2014-05-28 2014-10-01 天津三星通信技术研究有限公司 Equipment and method for controlling display item in electronic equipment
CN105653111A (en) * 2014-11-14 2016-06-08 神讯电脑(昆山)有限公司 Touch control input method and electronic device thereof
CN104571871A (en) * 2015-01-26 2015-04-29 深圳市中兴移动通信有限公司 Method and system for selecting files
CN105045513B (en) * 2015-08-27 2019-02-12 Oppo广东移动通信有限公司 Touch operation method and handheld device
CN106612425B (en) * 2015-10-23 2019-04-12 腾讯科技(深圳)有限公司 Image adjusting method and terminal device
DE102015222164A1 (en) * 2015-11-11 2017-05-11 Kuka Roboter Gmbh Method and computer program for generating a graphical user interface of a manipulator program
CN105573616B (en) * 2015-12-10 2018-05-29 广东欧珀移动通信有限公司 A kind of playlist control method and mobile terminal
CN105573631A (en) * 2015-12-14 2016-05-11 联想(北京)有限公司 Touch display electronic device and control method thereof
CN106527917B (en) * 2016-09-23 2020-09-29 北京仁光科技有限公司 Multi-finger touch operation identification method for screen interaction system
CN107438839A (en) * 2016-10-25 2017-12-05 深圳市大疆创新科技有限公司 A kind of multimedia editing method, device and intelligent terminal
CN108616771B (en) * 2018-04-25 2021-01-15 维沃移动通信有限公司 Video playing method and mobile terminal

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5900875A (en) * 1997-01-29 1999-05-04 3Com Corporation Method and apparatus for interacting with a portable computer system

Also Published As

Publication number Publication date
DE202007014957U1 (en) 2007-12-27
CN201181467Y (en) 2009-01-14
CN101611373A (en) 2009-12-23

Similar Documents

Publication Publication Date Title
CN101611373B (en) Controlling, manipulating, and editing gestures of media files using touch sensitive devices
CN103631496B (en) Attitude using touch-sensitive device control, manipulation and editing media file
CN201266371Y (en) Handhold mobile communication equipment
CN101482794B (en) Mode-based graphical user interfaces for touch sensitive input devices
KR102491683B1 (en) Devices and methods for navigating between user interfaces
EP3108350B1 (en) Music now playing user interface
US20180059928A1 (en) Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US9383898B2 (en) Information processing apparatus, information processing method, and program for changing layout of displayed objects
US8970503B2 (en) Gestures for devices having one or more touch sensitive surfaces
US7924271B2 (en) Detecting gestures on multi-event sensitive devices
US20100077333A1 (en) Method and apparatus for non-hierarchical input of file attributes
KR20120014067A (en) Gestures for touch sensitive input devices
AU2011253700B2 (en) Gestures for controlling, manipulating, and editing of media files using touch sensitive devices
US20200082465A1 (en) Method and system to generate a multi-panel ui based on hierarchy data corresponding to digital content

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant